Last modified by Artur on 2025/09/30 12:30

From version 12.2
edited by Helena
on 2025/06/06 17:10
To version 5.2
edited by Helena
on 2025/06/06 16:57

Summary

Details

Page properties
Tags
... ... @@ -1,1 +1,0 @@
1 -SDMX Technical Specification|Statistical data and metadata exchange
Content
... ... @@ -10,11 +10,11 @@
10 10  
11 11  = 1 Introduction =
12 12  
13 -The [[Statistical Data and Metadata Exchange>>doc:sdmx:Glossary.Statistical data and metadata exchange.WebHome]] ([[SDMX>>doc:sdmx:Glossary.Statistical data and metadata exchange.WebHome]]) initiative (https:~/~/www.sdmx.org) sets standards that can facilitate the exchange of statistical data and metadata using modern information technology.
13 +The Statistical Data and Metadata Exchange (SDMX) initiative (https:~/~/www.sdmx.org) sets standards that can facilitate the exchange of statistical data and metadata using modern information technology.
14 14  
15 -The [[SDMX Technical Specifications>>doc:sdmx:Glossary.SDMX Technical Specification.WebHome]] are organised into several discrete sections.
15 +The SDMX Technical Specifications are organised into several discrete sections.
16 16  
17 -The following are published on the [[SDMX>>doc:sdmx:Glossary.Statistical data and metadata exchange.WebHome]] website  ([[__https:~~/~~/www.sdmx.org__>>https://https:www.sdmx.org]]):
17 +The following are published on the SDMX website ([[__https:~~/~~/www.sdmx.org__>>https://www.sdmx.org]]):
18 18  
19 19  **Section 1** **Framework for SDMX Technical Standards** – this document, which provides an introduction to the technical standards.
20 20  
... ... @@ -84,8 +84,7 @@
84 84  
85 85  SDMX version 3.0 introduces new features, improvements and changes to the Standard in the following key areas:
86 86  
87 -(% class="wikigeneratedid" id="HInformationModel" %)
88 -**Information Model**
87 +==== Information Model ====
89 89  
90 90  * Simplification and improvement of the reference metadata model
91 91  * Support for microdata
... ... @@ -95,13 +95,11 @@
95 95  * Improvements to code hierarchies for data discovery
96 96  * Improvements to constraints
97 97  
98 -(% class="wikigeneratedid" id="HVersioningofStructuralMetadataArtefacts" %)
99 -**Versioning of Structural Metadata Artefacts**
97 +==== Versioning of Structural Metadata Artefacts ====
100 100  
101 101  * Adoption of the three-number semantic versioning standard for structural metadata artefacts ([[__https:~~/~~/semver.org__>>https://semver.org]])
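As an illustration of how three-part version numbers compare, here is a minimal sketch in plain Python. It is illustrative only and ignores the pre-release and build-metadata parts of the full semver grammar:

```python
import re

# Parse a three-number semantic version string (MAJOR.MINOR.PATCH) into a
# tuple that sorts in precedence order. Illustrative sketch only; the full
# grammar (pre-release, build metadata) is defined at https://semver.org.
def parse_version(text):
    match = re.fullmatch(r"(\d+)\.(\d+)\.(\d+)", text)
    if match is None:
        raise ValueError(f"not a MAJOR.MINOR.PATCH version: {text!r}")
    return tuple(int(part) for part in match.groups())

# A PATCH bump is a backward-compatible fix; a MAJOR bump signals breaking change.
assert parse_version("1.0.1") > parse_version("1.0.0")
assert parse_version("2.0.0") > parse_version("1.9.9")
```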
102 102  
103 -(% class="wikigeneratedid" id="HRESTWebServicesApplicationProgrammingInterface28API29" %)
104 -**REST Web Services Application Programming Interface (API)**
101 +==== REST Web Services Application Programming Interface (API) ====
105 105  
106 106  * Change to a single ‘structure’ resource for structure queries, simplifying the REST API specification by reducing the number of resources to five
107 107  * Improvements to data queries
... ... @@ -108,13 +108,11 @@
108 108  * Improvements to reference metadata queries
109 109  * Support for structural metadata maintenance using HTTP PUT, POST and DELETE verbs
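To illustrate the single ‘structure’ resource pattern, a short sketch that builds query URLs. The base URL and the ECB/EXR artefact names below are hypothetical examples; the normative path grammar is defined in the SDMX REST API specification:

```python
# Build illustrative SDMX 3.0 REST query URLs. The host and the ECB/EXR
# artefact names are invented for this example; the path layout follows the
# /structure/{type}/{agency}/{id}/{version} pattern described in the spec.
BASE = "https://registry.example.org/sdmx/v2"

def structure_url(structure_type, agency, resource_id, version):
    # One 'structure' resource covers all structural artefact types.
    return f"{BASE}/structure/{structure_type}/{agency}/{resource_id}/{version}"

def data_url(context, agency, resource_id, version, key="*"):
    # context is e.g. "dataflow" or "provisionagreement".
    return f"{BASE}/data/{context}/{agency}/{resource_id}/{version}/{key}"

print(structure_url("dataflow", "ECB", "EXR", "1.0.0"))
print(data_url("dataflow", "ECB", "EXR", "1.0.0", "M.USD.EUR.SP00.A"))
```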
110 110  
111 -(% class="wikigeneratedid" id="HSOAPWebServicesAPI" %)
112 -**SOAP Web Services API**
108 +==== SOAP Web Services API ====
113 113  
114 114  * The SOAP web services API has been deprecated, with version 3.0 standardising on REST
115 115  
116 -(% class="wikigeneratedid" id="HXML2CJSON2CCSVandEDITransmissionformats" %)
117 -**XML, JSON, CSV and EDI Transmission formats**
112 +==== XML, JSON, CSV and EDI Transmission formats ====
118 118  
119 119  * The SDMX-ML, SDMX-JSON and SDMX-CSV specifications have been extended and modified where needed to support the new features and changes, such as reference metadata and microdata
120 120  * Obsolete SDMX-ML data message variants, including Generic, Compact, Utility and Cross-sectional, have been deprecated, standardising on Structure Specific Data as the sole XML format for data exchange
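As a rough illustration of the CSV family, a sketch that reads a small CSV data message with Python's standard library. The column names and sample values here are invented for illustration and are not the normative layout, which is defined in the SDMX-CSV specification:

```python
import csv
import io

# Simplified, non-normative sketch of a CSV data message: leading columns
# identify the structure, the rest carry dimension and measure values.
# Column names and values are invented examples, not the SDMX-CSV standard.
sample = """STRUCTURE,STRUCTURE_ID,FREQ,REF_AREA,OBS_VALUE
dataflow,AGENCY:FLOW(1.0.0),A,DE,100.5
dataflow,AGENCY:FLOW(1.0.0),A,FR,98.2
"""

rows = list(csv.DictReader(io.StringIO(sample)))
for row in rows:
    print(row["REF_AREA"], row["OBS_VALUE"])
```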
... ... @@ -141,10 +141,10 @@
141 141  (% class="wikigeneratedid" id="HInformationModel-1" %)
142 142  **Information Model**
143 143  
144 -* Addition of Dimension Constraint property to a Dataflow
145 -* Addition of evolving structure property to a Data Structure Definition
146 -* Remove version property on Categorisation
147 -* Simplification of Constraints o Removal of Advanced Release Calendar
139 +* Addition of Dimension Constraint property to a Dataflow
140 +* Addition of evolving structure property to a Data Structure Definition
141 +* Remove version property on Categorisation
142 +* Simplification of Constraints
142 +** Removal of Advanced Release Calendar
148 148  
149 149  ** Removal of Role, Data Constraints only restrict data that can be reported
149 149  ** Restrict constraint targets to Identifiable structures (not URLs)
149 149  ** Addition of Availability Constraint to define actual data
150 150  
... ... @@ -206,7 +206,7 @@
206 206  
207 207  The formal objects in the information model are presented schematically in Figure 1, and are discussed in more detail elsewhere in this document.
208 208  
209 -[[image:SDMX 3.1 Section 1.png]]
204 +[[image:SDMX_3-1-0_SECTION_1_FINAL_6728d8d4.png||height="829" width="606"]]
210 210  
211 211  **Figure 1: High Level Schematic of Major Artefacts in the SDMX 3.0 Information Model**
212 212  
... ... @@ -240,20 +240,29 @@
240 240  * //**Dataflow Definition:**// In SDMX, data sets are reported or disseminated according to a data flow definition. The data flow definition identifies the data structure definition and may be associated with one or more subject matter domains via a Categorisation (this facilitates the search for data according to organised category schemes). Constraints, in terms of reporting periodicity or the subset of possible keys that are allowed in a data set, may be attached to the data flow definition.
241 241  * //**Metadataflow Definition:**// A metadata flow definition is very similar to a data flow definition, but describes, categorises, and constrains metadata sets.
242 242  * //**Data Provider: **//An organization which produces data is termed a data provider.
243 -* //**Metadata Provider: **//An organization which produces reference metadata is termed a metadata provider.
238 +* //**Metadata Provider: **//An organization which produces reference metadata is termed a metadata provider.
244 244  * //**Provision Agreement (Metadata Provision Agreement):**// The set of information which describes the way in which data sets and metadata sets are provided by a data/metadata provider. A provision agreement can be constrained in much the same way as a data or metadata flow definition. Thus, a data provider can express the fact that it provides a particular data flow covering a specific set of countries and topics. Importantly, the actual source of registered data or metadata is attached to the provision agreement (in terms of a URL). The term “agreement” is used because this information can be understood as the basis of a “service-level agreement”. In SDMX, however, this is informational metadata to support the technical systems, as opposed to any sort of contractual information (which is outside the scope of a technical specification). In version 3.0, metadata provision agreement and data provision agreement are two separate artefacts.
245 245  * //**Data Constraint:**// Used to restrict content (such as enumerations); referenced by provision agreements, data flows and data structure definitions in order to provide a set of reporting restrictions in the context of a collection.
246 246  * //**Metadata Constraint:**// Used to restrict content (such as enumerations); referenced by metadata provision agreements, metadata flows and metadata structure definitions in order to provide a set of reporting restrictions in the context of a collection.
247 -* • //**Available Data Constraint:**// Used to report the set of Component values that have data reported against them in the context of a Data Query. This structure allows a user to know what valid filters can be applied to a cube of data, such that the resulting cube will contain data.
248 -* • //**Structure Map: **//Structure maps describes a mapping between data structure definitions or dataflows for the purpose of transforming a data set into a different structure. The mapping rules are defined using one or more component maps which each map in turn describes how one or more components from the source data structure definition map to one or more components in that of the target. Represent maps act as lookup tables and specific provision is made for mapping dates and times.
249 -* • //**Representation Map:**// Representation maps describe mappings between source value(s) and target value(s) where the values are restricted to those in a code list, value list or be of a certain type such as integer or string.
250 -* • //**Item Scheme Map:**// An item scheme map describes mapping rules between any item scheme with the exception of code lists and value lists which use representation maps. The version 3.0 information model provides four item scheme maps: organisation scheme map, concept scheme map, category scheme map and reporting taxonomy map. Organisation scheme map and reporting scheme map have been omitted from the information model schematic in Figure 1.
251 -* • //**Reporting Taxonomy: **//A reporting taxonomy allows an organisation to link (possibly in a hierarchical way) a number of cube or data flow definitions which together form a complete “report” of data or metadata. This supports primary reporting which often comprises multiple cubes of heterogeneous data, but may also support other collection and reporting functions. It also supports the specification of publications such as a yearbook, in terms of the data or metadata contained in the publication.
252 -* • //**Process:**// The process class provides a way to model statistical processes as a set of interconnected //process steps.// Although not central to the exchange and dissemination of statistical data and metadata, having a shared description of processing allows for the interoperable exchange and dissemination of reference metadata sets which describe processes-related concepts.
253 -* • //**Hierarchy**//: Describes complex code hierarchies principally for data discovery purposes. The codes themselves are referenced from the code lists in which they are maintained.
254 -* • //**Hierarchy Association**//: A hierarchy association links a hierarchy to something that needs it like a dimension. Furthermore, the linking can be specified in the context of another object such as a dimension in the context of a dataflow. Thus, a dimension in a data structure definition could have different hierarchies depending on the dataflow.
255 -* • //**Transformation Scheme:**// A transformation scheme is a set of Validation and Transformation Language (VTL) transformations aimed at obtaining some meaningful results for the user (e.g., the validation of one or more data sets). The set of transformations is meant to be executed together (in the same run) and may contain 597 any number of transformations in order to produce any number of results. Thus, a transformation scheme can be considered as a VTL ‘program’.
256 256  
243 +• //**Available Data Constraint:**// Used to report the set of Component values that have data reported against them in the context of a Data Query. This structure allows a user to know what valid filters can be applied to a cube of data, such that the resulting cube will contain data.
244 +
245 +• //**Structure Map: **//Structure maps describe a mapping between data structure definitions or dataflows for the purpose of transforming a data set into a different structure. The mapping rules are defined using one or more component maps, each of which describes how one or more components from the source data structure definition map to one or more components of the target. Representation maps act as lookup tables, and specific provision is made for mapping dates and times.
246 +
247 +• //**Representation Map:**// Representation maps describe mappings between source value(s) and target value(s), where the values are restricted to those in a code list or value list, or must be of a certain type such as integer or string.
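In practice a representation map behaves like a lookup table from source values to target values. A minimal sketch follows; the country codes below are invented examples, not a maintained code list:

```python
# A representation map acts as a lookup table from source code(s) to
# target code(s). These alpha-3 to alpha-2 style pairs are invented
# examples; real maps reference maintained code lists or value lists.
representation_map = {
    "DEU": "DE",
    "FRA": "FR",
    "ITA": "IT",
}

def map_code(source_value):
    # Look up the target code, failing loudly when no mapping is defined.
    try:
        return representation_map[source_value]
    except KeyError:
        raise KeyError(f"no mapping defined for {source_value!r}")

assert map_code("DEU") == "DE"
```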
248 +
249 +• //**Item Scheme Map:**// An item scheme map describes mapping rules between item schemes, with the exception of code lists and value lists, which use representation maps. The version 3.0 information model provides four item scheme maps: organisation scheme map, concept scheme map, category scheme map and reporting taxonomy map. Organisation scheme map and reporting taxonomy map have been omitted from the information model schematic in Figure 1.
250 +
251 +• //**Reporting Taxonomy: **//A reporting taxonomy allows an organisation to link (possibly in a hierarchical way) a number of cube or data flow definitions which together form a complete “report” of data or metadata. This supports primary reporting which often comprises multiple cubes of heterogeneous data, but may also support other collection and reporting functions. It also supports the specification of publications such as a yearbook, in terms of the data or metadata contained in the publication.
252 +
253 +• //**Process:**// The process class provides a way to model statistical processes as a set of interconnected //process steps.// Although not central to the exchange and dissemination of statistical data and metadata, having a shared description of processing allows for the interoperable exchange and dissemination of reference metadata sets which describe process-related concepts.
254 +
255 +• //**Hierarchy**//: Describes complex code hierarchies principally for data discovery purposes. The codes themselves are referenced from the code lists in which they are maintained.
256 +
257 +• //**Hierarchy Association**//: A hierarchy association links a hierarchy to something that needs it, such as a dimension. Furthermore, the linking can be specified in the context of another object, such as a dimension in the context of a dataflow. Thus, a dimension in a data structure definition could have different hierarchies depending on the dataflow.
258 +
259 +• //**Transformation Scheme:**// A transformation scheme is a set of Validation and Transformation Language (VTL) transformations aimed at obtaining some meaningful results for the user (e.g., the validation of one or more data sets). The set of transformations is meant to be executed together (in the same run) and may contain any number of transformations in order to produce any number of results. Thus, a transformation scheme can be considered as a VTL ‘program’.
260 +
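The relationships among the artefacts described above can be read as a simple object model. The following sketch is an illustrative, heavily simplified rendering of that reading, not a normative implementation of the information model; all names and field choices are examples:

```python
from dataclasses import dataclass

# Illustrative sketch of the artefact relationships described above: a
# Dataflow references a Data Structure Definition, and a Provision
# Agreement ties a Data Provider to a Dataflow, with the actual data
# source URL attached to the agreement. Fields are heavily simplified.
@dataclass
class DataStructureDefinition:
    agency: str
    id: str
    version: str

@dataclass
class Dataflow:
    id: str
    structure: DataStructureDefinition

@dataclass
class DataProvider:
    id: str

@dataclass
class ProvisionAgreement:
    provider: DataProvider
    dataflow: Dataflow
    data_source_url: str  # actual source of registered data

# Hypothetical example artefacts, wired together as the model describes.
dsd = DataStructureDefinition("AGENCY", "DSD_EXAMPLE", "1.0.0")
flow = Dataflow("FLOW_EXAMPLE", dsd)
pa = ProvisionAgreement(DataProvider("PROVIDER_X"), flow, "https://example.org/data")
assert pa.dataflow.structure.version == "1.0.0"
```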
257 257  == 3.5 SDMX Registry Services ==
258 258  
259 259  In order to provide visibility into the large amount of data and metadata which exists within the SDMX model of statistical exchange, it is felt that an architecture based on a set of registry services is potentially useful. A “registry” – as understood in web services terminology – is an application which maintains and stores metadata for querying, and which can be used by any other application in the network with sufficient access privileges (though note that the mechanism of access control is outside the scope of the SDMX standard). It can be understood as the index of a distributed database or metadata repository which is made up of all the data providers’ data sets and reference metadata sets within a statistical community, located across the Internet or similar network.
SDMX 3.1 Section 1.png
Author
... ... @@ -1,1 +1,0 @@
1 -xwiki:XWiki.helena
Size
... ... @@ -1,1 +1,0 @@
1 -101.7 KB
Content
SDMX_3-1-0_SECTION_1_FINAL_6728d8d4.png
Author
... ... @@ -1,0 +1,1 @@
1 +xwiki:XWiki.helena
Size
... ... @@ -1,0 +1,1 @@
1 +403.7 KB
Content