Last modified by Artur on 2025/08/19 10:43

From version 7.15
edited by Helena
on 2025/05/21 22:27
Change comment: There is no comment for this version
To version 5.6
edited by Helena
on 2025/05/21 21:52
Change comment: There is no comment for this version

... ... @@ -69,15 +69,13 @@
69 69  
70 70  To allow for applications which only understand time series data, variations of these formats have been introduced in the form of two data messages; //GenericTimeSeriesData// and //StructureSpecificTimeSeriesData//. It is important to note that these variations are built on the same root structure and can be processed in the same manner as the base format so that they do NOT introduce additional processing requirements.
71 71  
72 -(% class="wikigeneratedid" id="HStructureDefinition" %)
73 -**//Structure Definition//**
72 +=== //Structure Definition// ===
74 74  
75 75  The SDMX-ML Structure Message supports the use of annotations to the structure, which is not supported by the SDMX-EDI syntax.
76 76  
77 77  The SDMX-ML Structure Message allows for the structures on which a Data Structure Definition depends – that is, codelists and concepts – to be either included in the message or to be referenced by the message containing the data structure definition. XML syntax is designed to leverage URIs and other Internet-based referencing mechanisms, and these are used in the SDMX-ML message. This option is not available to those using the SDMX-EDI structure message.
78 78  
79 -(% class="wikigeneratedid" id="HValidation" %)
80 -**//Validation//**
78 +=== //Validation// ===
81 81  
82 82  SDMX-EDI – as is typical of EDIFACT syntax messages – leaves validation to dedicated applications (“validation” being the checking of syntax, data typing, and adherence of the data message to the structure as described in the structural definition.)
83 83  
... ... @@ -85,22 +85,19 @@
85 85  
86 86  The SDMX-ML DSD-specific messages will allow validation of XML syntax and datatyping to be performed with a generic XML parser, and enforce agreement between the structural definition and the data to a moderate degree with the same tool.
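
As a rough illustration of the point above (the file names are placeholders and lxml is merely one example of a generic XML tool, not something mandated by SDMX), validating a DSD-specific data message against its generated schema could look like this:

{{code language="python"}}
# Illustrative only: validate an SDMX-ML DSD-specific data message with a
# generic XML schema processor. "dsd_specific_schema.xsd" and
# "data_message.xml" are placeholder file names.
from lxml import etree

schema = etree.XMLSchema(etree.parse("dsd_specific_schema.xsd"))
document = etree.parse("data_message.xml")

# True if the XML syntax and datatypes conform and the message agrees with the
# structural definition to the degree the generated schema can express it.
print(schema.validate(document))
{{/code}}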
87 87  
88 -(% class="wikigeneratedid" id="HUpdateandDeleteMessagesandDocumentationMessages" %)
89 -//Update and Delete Messages and Documentation Messages//
86 +=== //Update and Delete Messages and Documentation Messages// ===
90 90  
91 91  All SDMX data messages allow for both delete messages and messages consisting of only data or only documentation.
92 92  
93 -(% class="wikigeneratedid" id="HCharacterEncodings" %)
94 -**//Character Encodings//**
90 +=== //Character Encodings// ===
95 95  
96 96  All SDMX-ML messages use the UTF-8 encoding, while SDMX-EDI uses the ISO 8859-1 character encoding. There is a greater capacity with UTF-8 to express some character sets (see the “APPENDIX: MAP OF ISO 8859-1 (UNOC) CHARACTER SET (LATIN 1 OR “WESTERN”)” in the document “SYNTAX AND DOCUMENTATION VERSION 2.0”.) Many transformation tools are available which allow XML instances with UTF-8 encodings to be expressed as ISO 8859-1-encoded characters, and to transform UTF-8 into ISO 8859-1. Such tools should be used when transforming SDMX-ML messages into SDMX-EDI messages and vice-versa.
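
A minimal sketch of such a re-encoding step (the sample string is illustrative; a real converter would operate on the whole XML document):

{{code language="python"}}
# Round-trip between UTF-8 text and ISO 8859-1 bytes using only the standard
# library. Characters outside ISO 8859-1 are preserved as XML character
# references so that no information is lost.
text = "Größe – données"                  # arbitrary sample content
latin1 = text.encode("iso-8859-1", errors="xmlcharrefreplace")
restored = latin1.decode("iso-8859-1")    # "Größe &#8211; données"
print(restored)
{{/code}}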
97 97  
98 -(% class="wikigeneratedid" id="HDataTyping" %)
99 -**//Data Typing//**
94 +=== //Data Typing// ===
100 100  
101 101  The XML syntax and EDIFACT syntax have different data-typing mechanisms. The section below provides a set of conventions to be observed when support for messages in both syntaxes is required. For more information on the SDMX-ML representations of data, see below.
102 102  
103 -=== 3.3.2 Data Types ===
98 +==== 3.3.2 Data Types ====
104 104  
105 105  The XML syntax has a very different mechanism for data-typing than the EDIFACT syntax, and this difference may create some difficulties for applications which support both EDIFACT-based and XML-based SDMX data formats. This section provides a set of conventions for the expression of data in all formats, to allow for clean interoperability between them.
106 106  
... ... @@ -414,7 +414,7 @@
414 414  
415 415  This is used to unambiguously state that a date-time represents an observation at a single point in time. Therefore, if one wants to use SDMX for data which is measured at a distinct point in time rather than being reported over a period, the date-time representation can be used.
416 416  
417 -Representation: xs:dateTime (YYYY-MM-DDThh:mm:ss)[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[1~]^^>>path:#_ftn1]]
412 +Representation: xs:dateTime (YYYY-MM-DDThh:mm:ss)[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[1~]^^>>path:#_ftn1]]
418 418  
419 419  === 4.2.6 Standard Reporting Period ===
420 420  
... ... @@ -463,7 +463,7 @@
463 463  Period Duration: P7D (seven days)
464 464  Limit per year: 53
465 465  Representation: common:ReportingWeekType (YYYY-Www, e.g. 2000-W53)
466 -Notes: There are either 52 or 53 weeks in a reporting year. This is based on the ISO 8601 definition of a week (Monday - Sunday), where the first week of a reporting year is defined as the week with the first Thursday on or after the reporting year start day.[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[2~]^^>>path:#_ftn2]](%%) The reporting week is always represented as two digits, therefore 1-9 are 0 padded (e.g. 01). This allows the values to be sorted chronologically using textual sorting methods.
461 +Notes: There are either 52 or 53 weeks in a reporting year. This is based on the ISO 8601 definition of a week (Monday - Sunday), where the first week of a reporting year is defined as the week with the first Thursday on or after the reporting year start day.[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[2~]^^>>path:#_ftn2]](%%) The reporting week is always represented as two digits, therefore 1-9 are 0 padded (e.g. 01). This allows the values to be sorted chronologically using textual sorting methods.
467 467  
468 468  **Reporting Day**:
469 469  Period Indicator: D
... ... @@ -486,7 +486,7 @@
486 486  ~1. **If [REPORTING_YEAR_START_DATE] is a Friday, Saturday, or Sunday:**
487 487  Add^^3^^ (P3D, P2D, or P1D respectively) to the [REPORTING_YEAR_START_DATE]. The result is the [REPORTING_YEAR_BASE].
488 488  
489 -2. **If [REPORTING_YEAR_START_DATE] is a Monday, Tuesday, Wednesday, or Thursday:**
484 +2. **If [REPORTING_YEAR_START_DATE] is a Monday, Tuesday, Wednesday, or Thursday:**
490 490  Add^^3^^ (P0D, -P1D, -P2D, or -P3D respectively) to the [REPORTING_YEAR_START_DATE]. The result is the [REPORTING_YEAR_BASE].
491 491  b) **Else:** 
492 492  The [REPORTING_YEAR_START_DATE] is the [REPORTING_YEAR_BASE]
... ... @@ -502,7 +502,7 @@
502 502  g) If the [PERIOD_INDICATOR] is D, the [PERIOD_DURATION] is P1D.
503 503  
504 504  **3. Determine [PERIOD_START]:**
505 -Subtract one from the [PERIOD_VALUE] and multiply this by the [PERIOD_DURATION]. Add[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[3~]^^>>path:#_ftn3]](%%) this to the [REPORTING_YEAR_BASE]. The result is the [PERIOD_START].
500 +Subtract one from the [PERIOD_VALUE] and multiply this by the [PERIOD_DURATION]. Add[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[3~]^^>>path:#_ftn3]](%%) this to the [REPORTING_YEAR_BASE]. The result is the [PERIOD_START].
506 506  
507 507  **4. Determine the [PERIOD_END]:**
508 508  Multiply the [PERIOD_VALUE] by the [PERIOD_DURATION]. Add^^3^^ this to the [REPORTING_YEAR_BASE] and add^^3^^ -P1D. The result is the [PERIOD_END].
... ... @@ -526,62 +526,72 @@
526 526  The actual calendar range covered by 2010-Q2 (assuming the reporting year begins July 1) is 2010-10-01T00:00:00/2010-12-31T23:59:59
527 527  
528 528  **2011-W36, REPORTING_YEAR_START_DAY = ~-~-07-01 (July 1)**
524 +
529 529  ~1. [REPORTING_YEAR_START_DATE] = 2011-07-01
526 +
530 530  a) 2011-07-01 = Friday
528 +
531 531  2011-07-01 + P3D = 2011-07-04
530 +
532 532  [REPORTING_YEAR_BASE] = 2011-07-04
533 -2. [PERIOD_DURATION] = P7D
534 -3. (36-1) * P7D = P245D
532 +
533 +2. [PERIOD_DURATION] = P7D
534 +3. (36-1) * P7D = P245D
535 +
535 535  2011-07-04 + P245D = 2012-03-05
537 +
536 536  [PERIOD_START] = 2012-03-05
539 +
537 537  4. 36 * P7D = P252D
541 +
538 538  2011-07-04 + P252D =2012-03-12
543 +
539 539  2012-03-12 + -P1D = 2012-03-11
545 +
540 540  [PERIOD_END] = 2012-03-11
541 541  
542 542  The actual calendar range covered by 2011-W36 (assuming the reporting year begins July 1) is 2012-03-05T00:00:00/2012-03-11T23:59:59
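
To make the weekly case concrete, the following sketch (the function name and signature are illustrative, not part of the standard) applies the four steps above using only the Python standard library:

{{code language="python"}}
# Illustrative walk-through of the reporting period algorithm for weekly
# periods (YYYY-Www). The offsets in step 1 implement the Friday/Saturday/
# Sunday and Monday-Thursday adjustments described above.
from datetime import date, timedelta

def reporting_week_range(year, week, start_month, start_day):
    start = date(year, start_month, start_day)        # [REPORTING_YEAR_START_DATE]
    # 1. [REPORTING_YEAR_BASE]: Mon..Thu -> 0,-1,-2,-3; Fri,Sat,Sun -> +3,+2,+1
    offset = {0: 0, 1: -1, 2: -2, 3: -3, 4: 3, 5: 2, 6: 1}[start.weekday()]
    base = start + timedelta(days=offset)              # [REPORTING_YEAR_BASE]
    duration = timedelta(days=7)                       # 2. [PERIOD_DURATION] = P7D
    period_start = base + (week - 1) * duration        # 3. [PERIOD_START]
    period_end = base + week * duration - timedelta(days=1)  # 4. [PERIOD_END]
    return period_start, period_end

# Reproduces the worked example: 2011-W36 with reporting year start day --07-01
print(reporting_week_range(2011, 36, 7, 1))   # (2012-03-05, 2012-03-11)
{{/code}}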
543 543  
544 -=== 4.2.7 Distinct Range ===
550 +==== 4.2.7 Distinct Range ====
545 545  
546 546  In the case that the reporting period does not fit into one of the prescribed periods above, a distinct time range can be used. The value of these ranges is based on the ISO 8601 time interval format of start/duration. Start can be expressed as either an ISO 8601 date or a date-time, and duration is expressed as an ISO 8601 duration. However, the duration can only be positive.
547 547  
548 -=== 4.2.8 Time Format ===
554 +==== 4.2.8 Time Format ====
549 549  
550 550  In version 2.0 of SDMX there is a recommendation to use the time format attribute to give additional information on the way time is represented in the message. Following an appraisal of its usefulness this is no longer required. However, it is still possible, if required, to include the time format attribute in SDMX-ML.
551 551  
552 -(% style="width:1049.29px" %)
553 -|**Code**|(% style="width:926px" %)**Format**
554 -|**OTP**|(% style="width:926px" %)Observational Time Period: Superset of all SDMX time formats (Gregorian Time Period, Reporting Time Period, and Time Range)
555 -|**STP**|(% style="width:926px" %)Standard Time Period: Superset of Gregorian and Reporting Time Periods
556 -|**GTP**|(% style="width:926px" %)Superset of all Gregorian Time Periods and date-time
557 -|**RTP**|(% style="width:926px" %)Superset of all Reporting Time Periods
558 -|**TR**|(% style="width:926px" %)Time Range: Start time and duration (YYYY-MM-DD(Thh:mm:ss)?/<duration>)
559 -|**GY**|(% style="width:926px" %)Gregorian Year (YYYY)
560 -|**GTM**|(% style="width:926px" %)Gregorian Year Month (YYYY-MM)
561 -|**GD**|(% style="width:926px" %)Gregorian Day (YYYY-MM-DD)
562 -|**DT**|(% style="width:926px" %)Distinct Point: date-time (YYYY-MM-DDThh:mm:ss)
563 -|**RY**|(% style="width:926px" %)Reporting Year (YYYY-A1)
564 -|**RS**|(% style="width:926px" %)Reporting Semester (YYYY-Ss)
565 -|**RT**|(% style="width:926px" %)Reporting Trimester (YYYY-Tt)
566 -|**RQ**|(% style="width:926px" %)Reporting Quarter (YYYY-Qq)
567 -|**RM**|(% style="width:926px" %)Reporting Month (YYYY-Mmm)
569 -|**RW**|(% style="width:926px" %)Reporting Week (YYYY-Www)
570 -|**RD**|(% style="width:926px" %)Reporting Day (YYYY-Dddd)
558 +|**Code**|**Format**
559 +|**OTP**|Observational Time Period: Superset of all SDMX time formats (Gregorian Time Period, Reporting Time Period, and Time Range)
560 +|**STP**|Standard Time Period: Superset of Gregorian and Reporting Time Periods
561 +|**GTP**|Superset of all Gregorian Time Periods and date-time
562 +|**RTP**|Superset of all Reporting Time Periods
563 +|**TR**|Time Range: Start time and duration (YYYY-MM-DD(Thh:mm:ss)?/<duration>)
564 +|**GY**|Gregorian Year (YYYY)
565 +|**GTM**|Gregorian Year Month (YYYY-MM)
566 +|**GD**|Gregorian Day (YYYY-MM-DD)
567 +|**DT**|Distinct Point: date-time (YYYY-MM-DDThh:mm:ss)
568 +|**RY**|Reporting Year (YYYY-A1)
569 +|**RS**|Reporting Semester (YYYY-Ss)
570 +|**RT**|Reporting Trimester (YYYY-Tt)
571 +|**RQ**|Reporting Quarter (YYYY-Qq)
572 +|**RM**|Reporting Month (YYYY-Mmm)
574 +|**RW**|Reporting Week (YYYY-Www)
575 +|**RD**|Reporting Day (YYYY-Dddd)
571 571  
572 -**Table 1: SDMX-ML Time Format Codes**
577 + **Table 1: SDMX-ML Time Format Codes**
573 573  
574 -=== 4.2.9 Transformation between SDMX-ML and SDMX-EDI ===
579 +==== 4.2.9 Transformation between SDMX-ML and SDMX-EDI ====
575 575  
576 576  When converting SDMX-ML data structure definitions to SDMX-EDI data structure definitions, only the identifier of the time format attribute will be retained. The representation of the attribute will be converted from the SDMX-ML format to the fixed SDMX-EDI code list. If the SDMX-ML data structure definition does not define a time format attribute, then one will be automatically created with the identifier "TIME_FORMAT".
577 577  
578 -When converting SDMX-ML data to SDMX-EDI, the source time format attribute will be irrelevant. Since the SDMX-ML time representation types are not ambiguous, the target time format can be determined from the source time value directly. For example, if the SDMX-ML time is 2000-Q2 the SDMX-EDI format will always be 608/708 (depending on whether the target series contains one observation or a range of observations).
583 +When converting SDMX-ML data to SDMX-EDI, the source time format attribute will be irrelevant. Since the SDMX-ML time representation types are not ambiguous, the target time format can be determined from the source time value directly. For example, if the SDMX-ML time is 2000-Q2 the SDMX-EDI format will always be 608/708 (depending on whether the target series contains one observation or a range of observations)
579 579  
580 580  When converting a data structure definition originating in SDMX-EDI, the time format attribute should be ignored, as it serves no purpose in SDMX-ML.
581 581  
582 582  When converting data from SDMX-EDI to SDMX-ML, the source time format is only necessary to determine the format of the target time value. For example, a source time format of will result in a target time in the format YYYY-Ss whereas a source format of will result in a target time value in the format YYYY-Qq.
583 583  
584 -=== 4.2.10 Time Zones ===
589 +==== 4.2.10 Time Zones ====
585 585  
586 586  In alignment with ISO 8601, SDMX allows the specification of a time zone on all time periods and on the reporting year start day. If a time zone is provided on a reporting year start day, then the same time zone (or none) should be reported for each reporting time period. If the reporting year start day and the reporting period time zone differ, the time zone of the reporting period will take precedence. Examples of each format with time zones are as follows (time zone indicated in bold):
587 587  
... ... @@ -602,7 +602,7 @@
602 602  
603 603  According to ISO 8601, a date without a time-zone is considered "local time". SDMX assumes that local time is that of the sender of the message. In this version of SDMX, an optional field is added to the sender definition in the header for specifying a time zone. This field has a default value of 'Z' (UTC). This determination of local time applies for all dates in a message.
604 604  
605 -=== 4.2.11 Representing Time Spans Elsewhere ===
610 +==== 4.2.11 Representing Time Spans Elsewhere ====
606 606  
607 607  It has been possible since SDMX 2.0 for a Component to specify a representation of a time span. Depending on the format of the data message, this resulted in either an element with 2 XML attributes for holding the start time and the duration or two separate XML attributes based on the underlying Component identifier. For example if REF_PERIOD were given a representation of time span, then in the Compact data format, it would be represented by two XML attributes; REF_PERIODStartTime (holding the start) and REF_PERIOD (holding the duration). If a new simple type is introduced in the SDMX schemas that can hold ISO 8601 time intervals, then this will no longer be necessary. What was represented as this:
608 608  
... ... @@ -612,29 +612,30 @@
612 612  
613 613  <Series REF_PERIOD="2000-01-01T00:00:00/P2M"/>
614 614  
615 -=== 4.2.12 Notes on Formats ===
620 +==== 4.2.12 Notes on Formats ====
616 616  
617 617  There is no ambiguity in these formats so that for any given value of time, the category of the period (and thus the intended time period range) is always clear. It should also be noted that by utilizing the ISO 8601 format, and a format loosely based on it for the report periods, the values of time can easily be sorted chronologically without additional parsing.
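
As a quick illustration of this point, the period category can be recognised from the value alone; the patterns below are an illustrative subset, not the normative definitions from the SDMX schemas:

{{code language="python"}}
# Recognise the category of a time value from its format (illustrative subset).
import re

PATTERNS = {
    "GY":  r"^\d{4}$",                               # Gregorian Year
    "GTM": r"^\d{4}-(0[1-9]|1[0-2])$",               # Gregorian Year Month
    "GD":  r"^\d{4}-\d{2}-\d{2}$",                   # Gregorian Day
    "DT":  r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}",  # Distinct point (date-time)
    "RQ":  r"^\d{4}-Q[1-4]$",                        # Reporting Quarter
    "RM":  r"^\d{4}-M(0[1-9]|1[0-2])$",              # Reporting Month
    "RW":  r"^\d{4}-W(0[1-9]|[1-4]\d|5[0-3])$",      # Reporting Week
}

def detect_format(value):
    for code, pattern in PATTERNS.items():
        if re.match(pattern, value):
            return code
    return None

print(detect_format("2000-Q2"))    # RQ
print(detect_format("2010-03"))    # GTM
{{/code}}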
618 618  
619 -=== 4.2.13 Effect on Time Ranges ===
624 +==== 4.2.13 Effect on Time Ranges ====
620 620  
621 621  All SDMX-ML data messages are capable of functioning in a manner similar to SDMX-EDI if the Dimension at the observation level is time: the time period for the first observation can be stated and the rest of the observations can omit the time value as it can be derived from the start time and the frequency. Since the frequency can be determined based on the actual format of the time value for everything but distinct points in time and time ranges, this makes it even simpler to process as the interval between time ranges is known directly from the time value.
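
For instance, a minimal sketch of deriving the next time value from a reporting quarter (the helper name is illustrative; a full implementation would cover every supported period format):

{{code language="python"}}
# Derive the next period from a start value whose frequency is implied by its
# format (reporting quarters, YYYY-Qq, as an example).
def next_quarter(period):
    year, quarter = period.split("-Q")
    year, quarter = int(year), int(quarter) + 1
    if quarter > 4:
        year, quarter = year + 1, 1
    return f"{year:04d}-Q{quarter}"

print(next_quarter("2010-Q4"))   # 2011-Q1
{{/code}}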
622 622  
623 -=== 4.2.14 Time in Query Messages ===
628 +==== 4.2.14 Time in Query Messages ====
624 624  
625 625  When querying for time values, the value of a time parameter can be provided as any of the Observational Time Period formats and must be paired with an operator. In addition, an explicit value for the reporting year start day can be provided, or this can be set to "Any". This section will detail how systems processing query messages should interpret these parameters.
626 626  
627 627  Fundamental to processing a time value parameter in a query message is understanding that all time periods should be handled as a distinct range of time. Since the time parameter in the query is paired with an operator, this also effectively represents a distinct range of time. Therefore, a system processing the query must simply match the data where the time period for the requested parameter is encompassed by the time period resulting from the value of the query parameter. The following table details how the operators should be interpreted for any time period provided as a parameter.
628 628  
629 -(% style="width:1024.29px" %)
630 -|(% style="width:238px" %)**Operator**|(% style="width:782px" %)**Rule**
631 -|(% style="width:238px" %)Greater Than|(% style="width:782px" %)Any data after the last moment of the period
632 -|(% style="width:238px" %)Less Than|(% style="width:782px" %)Any data before the first moment of the period
633 -|(% style="width:238px" %)Greater Than or Equal To|(% style="width:782px" %)(((
634 -Any data on or after the first moment of the period
634 +|**Operator**|**Rule**
635 +|Greater Than|Any data after the last moment of the period
636 +|Less Than|Any data before the first moment of the period
637 +|Greater Than or Equal To|(((
638 +Any data on or after the first moment of the period
635 635  )))
636 -|(% style="width:238px" %)Less Than or Equal To|(% style="width:782px" %)Any data on or before the last moment of the period
637 -|(% style="width:238px" %)Equal To|(% style="width:782px" %)Any data which falls on or after the first moment of the period and before or on the last moment of the period
642 +|Less Than or Equal To|Any data on or before the last moment of the period
643 +|Equal To|Any data which falls on or after the first moment of the period and before or on the last moment of the period
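
A rough sketch of these rules (the helper and parameter names are illustrative; a real processor would first expand each period value into its first and last moment):

{{code language="python"}}
# Interpret a query time parameter as the closed range [param_first, param_last]
# and test a candidate data period [data_first, data_last] against the operator.
from datetime import datetime

def matches(operator, data_first, data_last, param_first, param_last):
    if operator == "GreaterThan":
        return data_first > param_last       # after the last moment of the period
    if operator == "LessThan":
        return data_last < param_first       # before the first moment of the period
    if operator == "GreaterThanOrEqualTo":
        return data_first >= param_first     # on or after the first moment
    if operator == "LessThanOrEqualTo":
        return data_last <= param_last       # on or before the last moment
    if operator == "EqualTo":
        return param_first <= data_first and data_last <= param_last
    raise ValueError(f"unknown operator: {operator}")

# Greater than 2010: only data starting after 2010-12-31T23:59:59 matches
print(matches("GreaterThan",
              datetime(2011, 1, 1), datetime(2011, 12, 31, 23, 59, 59),
              datetime(2010, 1, 1), datetime(2010, 12, 31, 23, 59, 59)))   # True
{{/code}}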
638 638  
639 639  Reporting Time Periods as query parameters are handled based on whether the value of the reportingYearStartDay XML attribute is an explicit month and day or "Any":
640 640  
... ... @@ -647,7 +647,9 @@
647 647  **Examples:**
648 648  
649 649  **Gregorian Period**
656 +
650 650  Query Parameter: Greater than 2010
658 +
651 651  Literal Interpretation: Any data where the start period occurs after 2010-12-31T23:59:59.
652 652  
653 653  Example Matches:
... ... @@ -665,11 +665,15 @@
665 665  * 2010-D185 or later (reporting year start day ~-~-07-01 or later)
666 666  
667 667  **Reporting Period with explicit start day**
676 +
668 668  Query Parameter: Greater than or equal to 2009-Q3, reporting year start day = "-07-01"
678 +
669 669  Literal Interpretation: Any data where the start period occurs on or after 2010-01-01T00:00:00 (Note that in this case 2009-Q3 is converted to the explicit date range of 2010-01-01/2010-03-31 because of the reporting year start day value). Example Matches: Same as previous example
670 670  
671 671  **Reporting Period with "Any" start day**
682 +
672 672  Query Parameter: Greater than or equal to 2010-Q3, reporting year start day = "Any"
684 +
673 673  Literal Interpretation: Any data with a reporting period where the start period is on or after the start period of 2010-Q3 for the same reporting year start day, or any data where the start period is on or after 2010-07-01. Example Matches:
674 674  
675 675  * 2011 or later
... ... @@ -681,10 +681,13 @@
681 681  * 2010-T3 (any reporting year start day)
682 682  * 2010-Q3 or later (any reporting year start day)
683 683  * 2010-M07 or later (any reporting year start day)
684 -* 2010-W27 or later (reporting year start day ~-~-01-01){{footnote}}2010-Q3 (with a reporting year start day of --01-01) starts on 2010-07-01. This is day 4 of week 26, therefore the first week matched is week 27.{{/footnote}}
684 -* 2010-D182 or later (reporting year start day ~-~-01-01)
685 -* 2010-W28 or later (reporting year start day ~-~-07-01){{footnote}}2010-Q3 (with a reporting year start day of --07-01) starts on 2011-01-01. This is day 6 of week 27, therefore the first week matched is week 28.{{/footnote}}
686 -* 2010-D185 or later (reporting year start day ~-~-07-01)
696 +* 2010-W27 or later (reporting year start day ~-~-01-01)^^4^^
696 +* 2010-D182 or later (reporting year start day ~-~-01-01)
697 +* 2010-W28 or later (reporting year start day ~-~-07-01)^^5^^
697 +* 2010-D185 or later (reporting year start day ~-~-07-01)
687 687  
699 +^^4^^ 2010-Q3 (with a reporting year start day of ~-~-01-01) starts on 2010-07-01. This is day 4 of week 26, therefore the first week matched is week 27.
700 +
701 +^^5^^ 2010-Q3 (with a reporting year start day of ~-~-07-01) starts on 2011-01-01. This is day 6 of week 27, therefore the first week matched is week 28.
702 +
688 688  == 4.3 Structural Metadata Querying Best Practices ==
689 689  
690 690  When querying for structural metadata, the ability to state how references should be resolved is quite powerful. However, this mechanism is not always necessary and can create an undue burden on the systems processing the queries if it is not used properly.
... ... @@ -701,6 +701,8 @@
701 701  
702 702  This mechanism is an “early binding” one – everything with a versioned identity is a known quantity, and will not change. It is worth pointing out that in some cases relationships are essentially one-way references: an illustrative case is that of Categories. While a Category may be referenced by many dataflows and metadata flows, the addition of more references from flow objects does not version the Category. This is because the flows are not properties of the Categories – they merely make references to it. If the name of a Category changed, or its subCategories changed, then versioning would be necessary.
703 703  
704 704  Versioning operates at the level of versionable and maintainable objects in the SDMX information model. If any of the children of objects at these levels change, then the objects themselves are versioned.
705 705  
706 706  One area which is much impacted by this versioning scheme is the ability to reference external objects. With the many dependencies within the various structural objects in SDMX, it is useful to have a scheme for external referencing. This is done at the level of maintainable objects (DSDs, code lists, concept schemes, etc.) In an SDMX-ML Structure Message, whenever an “isExternalReference” attribute is set to true, then the application must resolve the address provided in the associated “uri” attribute and use the SDMX-ML Structure Message stored at that location for the full definition of the object in question. Alternately, if a registry “urn” attribute has been provided, the registry can be used to supply the full details of the object.
... ... @@ -723,13 +723,13 @@
723 723  
724 724  [[image:1747836776649-282.jpeg]]
725 725  
726 -**Figure 1: Schematic of the Metadata Structure Definition**
743 +**Figure 1: Schematic of the Metadata Structure Definition**
727 727  
728 728  The MSD comprises the specification of the object types to which metadata can be reported in a Metadata Set (Metadata Target(s)), and the Report Structure(s) comprising the Metadata Attributes that identify the Concept for which metadata may be reported in the Metadata Set. Importantly, one Report Structure references the Metadata Target for which it is relevant. One Report Structure can reference many Metadata Targets, i.e. the same Report Structure can be used for different target objects.
729 729  
730 730  [[image:1747836776655-364.jpeg]]
731 731  
732 -**Figure 2: Example MSD showing Metadata Targets**
749 +**Figure 2: Example MSD showing Metadata Targets**
733 733  
734 734  Note that the SDMX-ML schemas have explicit XML elements for each identifiable object type because identifying, for instance, a Maintainable Object has different properties from an Identifiable Object which must also include the agencyId, version, and id of the Maintainable Object in which it resides.
735 735  
... ... @@ -739,10 +739,8 @@
739 739  
740 740  [[image:1747836776658-510.jpeg]]
741 741  
742 -**Figure 3: Example MSD showing specification of three Metadata Attributes**
759 +**Figure 3: Example MSD showing specification of three Metadata Attributes**
759 +
759 +This example shows the following hierarchy of Metadata Attributes:
743 743  
744 -This example shows the following hierarchy of Metadata Attributes:
745 -
746 746  Source – this is presentational and no metadata is expected to be reported at this level
747 747  
748 748  * Source Type
... ... @@ -756,7 +756,10 @@
756 756  
757 757  **Figure 4: Example Metadata Set**
757 757  
757 757  This example shows:
758 758  
759 -1. The reference to the MSD, Metadata Report, and Metadata Target (MetadataTargetValue)
774 +1. The reference to the MSD, Metadata Report, and Metadata Target (MetadataTargetValue)
760 760  1. The reported metadata attributes (AttributeValueSet)
761 761  
762 762  = 6 Maintenance Agencies =
... ... @@ -813,9 +813,8 @@
813 813  
814 814  The Information Model for this is shown below:
815 815  
816 -[[image:1747855024745-946.png]]
817 817  
818 -**Figure 8: Information Model Extract for Concept Role**
835 + **Figure 8: Information Model Extract for Concept Role**
819 819  
820 820  It is possible to specify zero or more concept roles for a Dimension, Measure Dimension and Data Attribute (but not the ReportingYearStartDay). The Time Dimension, Primary Measure, and the  Attribute ReportingYearStartDay have explicitly defined roles and cannot be further specified with additional concept roles.
821 821  
... ... @@ -835,14 +835,13 @@
835 835  
836 836  The Cross-Domain Concept Scheme maintained by SDMX contains concept role concepts (FREQ chosen as an example).
837 837  
838 -[[image:1747855054559-410.png]]
855 +[[image:1747836776691-440.jpeg]]
839 839  
840 -
841 841  Whether this is a role or not depends upon the application understanding that FREQ in the Cross-Domain Concept Scheme is a role of Frequency.
842 842  
843 843  Using a Concept Scheme that is not the Cross-Domain Concept Scheme where it is required to assign a role using the Cross-Domain Concept Scheme. Again FREQ is chosen as the example.
844 844  
845 -[[image:1747855075263-887.png]]
861 +[[image:1747836776693-898.jpeg]]
846 846  
847 847  
848 848  This explicitly states that this Dimension is playing a role identified by the FREQ concept in the Cross-Domain Concept Scheme. Again the application needs to understand what FREQ in the Cross-Domain Concept Scheme implies in terms of a role.
... ... @@ -892,7 +892,7 @@
892 892  
893 893  == 8.3 Rules for a Content Constraint ==
894 894  
895 -=== 8.3.1 (% style="color:inherit; font-family:inherit; font-size:max(21px, min(23px, 17.4444px + 0.462963vw))" %)Scope of a Content Constraint(%%) ===
911 +=== 8.3.1 Scope of a Content Constraint ===
896 896  
897 897  A Content Constraint is used to specify the content of a data or metadata source in terms of the component values or the keys.
898 898  
... ... @@ -931,54 +931,54 @@
931 931  
932 932  In view of the flexibility of constraints attachment, clear rules on their usage are required. These are elaborated below.
933 933  
934 -=== 8.3.2 Multiple Content Constraints ===
950 +=== 8.3.2 Multiple Content Constraints ===
935 935  
936 936  There can be many Content Constraints for any Constrainable Artefact (e.g. DSD), subject to the following restrictions:
937 937  
938 -==== 8.3.2.1 Cube Region ====
954 +**8.3.2.1 Cube Region**
939 939  
940 940  1. The constraint can contain multiple Member Selections (e.g. Dimension) but:
941 941  1. A specific  Member Selection (e.g. Dimension FREQ)  can only be contained in one Content Constraint for any one attached object (e.g. a specific DSD or specific Dataflow)
942 942  
943 -==== 8.3.2.2 Key Set ====
959 +**8.3.2.2 Key Set**
944 944  
945 945  Key Sets will be processed in the order they appear in the Constraint and wildcards can be used (e.g. any key position not referenced explicitly is deemed to be “all values”). As the Key Sets can be “included” or “excluded” it is recommended that Key Sets with wildcards are declared before KeySets with specific series keys. This will minimize the risk that keys are inadvertently included or excluded.
946 946  
947 -=== 8.3.3 Inheritance of a Content Constraint ===
963 +=== 8.3.3 Inheritance of a Content Constraint ===
948 948  
949 -==== 8.3.3.1 Attachment levels of a Content Constraint ====
965 +**8.3.3.1 Attachment levels of a Content Constraint**
950 950  
951 951  There are three levels of constraint attachment for which these inheritance rules apply:
952 952  
953 -* DSD/MSD – top level
954 -** Dataflow/Metadataflow – second level
955 -*** Provision Agreement – third level
969 +* DSD/MSD – top level
969 +** Dataflow/Metadataflow – second level
971 +*** Provision Agreement – third level
972 +
957 957  Note that these rules do not apply to the Simple Datasource or Queryable Datasource: the Content Constraint(s) attached to these artefacts are resolved for this artefact only and do not take into account Constraints attached to other artefacts (e.g. Provision Agreement, Dataflow, DSD).
958 958  
959 959  It is not necessary for a Content Constraint to be attached to a higher level artefact, e.g. it is valid to have a Content Constraint for a Provision Agreement where there are no constraints attached to the relevant dataflow or DSD.
960 960  
961 -==== 8.3.3.2 Cascade rules for processing Constraints ====
977 +**8.3.3.2 Cascade rules for processing Constraints**
962 962  
963 963  The processing of the constraints on either Dataflow/Metadataflow or Provision Agreement must take into account the constraints declared at higher levels. The rules for the lower level constraints (attached to Dataflow/ Metadataflow and Provision Agreement) are detailed below.
964 964  
965 965  Note that there can be a situation where a constraint is specified at a lower level before a constraint is specified at a higher level. Therefore, it is possible that a higher level constraint makes a lower level constraint invalid. SDMX makes no rules on how such a conflict should be handled when processing the constraint for attachment. However, the cascade rules on evaluating constraints for usage are clear - the higher level constraint takes precedence in any conflicts that result in a less restrictive specification at the lower level.
966 966  
967 -==== 8.3.3.3 Cube Region ====
983 +**8.3.3.3 Cube Region**
968 968  
969 969  1. It is not necessary to have a constraint on the higher level artifact (e.g. DSD referenced by the Dataflow) but if there is such a constraint at the higher level(s) then:
970 -a. The lower level constraint cannot be less restrictive than the constraint specified for the same Member Selection (e.g. Dimension) at the next higher level which constrains that Member Selection (e.g. if the Dimension FREQ is constrained to A, Q in a DSD then the constraint at the Dataflow or Provision Agreement cannot be A, Q, M or even just M – it can only further constrain A,Q).
971 -b. The constraint at the lower level for any one Member Selection further constrains the content for the same Member Selection at the higher level(s).
986 +11. The lower level constraint cannot be less restrictive than the constraint specified for the same Member Selection (e.g. Dimension) at the next higher level which constrains that Member Selection (e.g. if the Dimension FREQ is constrained to A, Q in a DSD then the constraint at the Dataflow or Provision Agreement cannot be A, Q, M or even just M – it can only further constrain A,Q).
987 +11. The constraint at the lower level for any one Member Selection further constrains the content for the same Member Selection at the higher level(s).
972 972  1. Any Member Selection which is not referenced in a Content Constraint is deemed to be constrained according to the Content Constraint specified at the next higher level which constrains that Member Selection.
973 973  1. If there is a conflict when resolving the constraint in terms of a lower-level constraint being less restrictive than a higher-level constraint then the constraint at the higher-level is used.
974 974  
975 975  Note that it is possible for a Content Constraint at a higher level to constrain, say, four Dimensions in a single constraint, and a Content Constraint at a lower level to constrain the same four in two, three, or four Content Constraints.
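
As a rough illustration of this cascade (the function and the code values are purely illustrative, not taken from the standard), the effective code set for one Member Selection can be resolved top-down, with a lower level only ever narrowing what the level above allows:

{{code language="python"}}
# Resolve the effective code set for a single Member Selection across the
# levels DSD -> Dataflow -> Provision Agreement. "None" means the level
# declares nothing and simply inherits; codes that would widen the higher-level
# constraint are dropped, so the higher level always takes precedence.
def resolve_member_selection(levels):
    effective = None                      # None = unconstrained so far
    for allowed in levels:
        if allowed is None:
            continue                      # nothing declared at this level
        if effective is None:
            effective = set(allowed)
        else:
            effective &= set(allowed)     # only further restriction survives
    return effective

# Illustrative values: a DSD constraint, a narrower Dataflow constraint and no
# Provision Agreement constraint.
print(resolve_member_selection([{"A", "Q", "M"}, {"A", "Q"}, None]))   # {'A', 'Q'}
{{/code}}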
976 976  
977 -==== 8.3.3.4 Key Set ====
993 +**8.3.3.4 Key Set**
978 978  
979 979  1. It is not necessary to have a constraint on the higher level artefact (e.g. DSD referenced by the Dataflow) but if there is such a constraint at the higher level(s) then:
980 -a. The lower level constraint cannot be less restrictive than the constraint specified at the higher level.
981 -b. The constraint at the lower level for any one Member Selection further constrains the keys specified at the higher level(s).
996 +11. The lower level constraint cannot be less restrictive than the constraint specified at the higher level.
997 +11. The constraint at the lower level for any one Member Selection further constrains the keys specified at the higher level(s).
982 982  1. Any Member Selection which is not referenced in a Content Constraint is deemed to be constrained according to the Content Constraint specified at the next higher level which constrains that Member Selection.
983 983  1. If there is a conflict when resolving the keys in the constraint at two levels, in terms of a lower-level constraint being less restrictive than a higher-level constraint, then the offending keys specified at the lower level are not deemed part of the constraint.
984 984  
... ... @@ -992,12 +992,11 @@
992 992  1. At the lower level inherit all keys that match with the higher level constraint.
993 993  1. If there are keys in the lower level constraint that are not inherited then the key is invalid (i.e. it is less restrictive).
994 994  
995 -=== 8.3.4 Constraints Examples ===
1011 +**8.3.4 Constraints Examples**
996 996  
997 997  The following scenario is used.
998 998  
999 -(% class="wikigeneratedid" id="HDSD" %)
1000 -__DSD__
1015 +=== DSD ===
1001 1001  
1002 1002  This contains the following Dimensions:
1003 1003  
... ... @@ -1009,47 +1009,111 @@
1009 1009  In the DSD common code lists are used and the requirement is to restrict these at various levels to specify the actual codes that are valid for the object to which the Content Constraint is attached.
1010 1010  
1011 1011  
1012 -[[image:1747855493531-357.png]]
1013 1013  
1014 -**Figure 10: Example Scenario for Constraints**
1027 +**Figure 10: Example Scenario for Constraints**
1015 1015  
1016 1016  Constraints are declared as follows:
1017 1017  
1018 -[[image:1747855462293-368.png]]
1019 1019  
1020 -**Figure 11: Example Content Constraints**
1068 +**Figure 11: Example Content Constraints**
1021 1021  
1022 1022  **Notes:**
1023 1023  
1024 -1. AGE is constrained for the DSD and is further restricted for the Dataflow CENSUS_CUBE1.
1104 +1. AGE is constrained for the DSD and is further restricted for the Dataflow CENSUS_CUBE1.
1025 1025  1. The same Constraint applies to both Provision Agreements.
1026 1026  
1027 1027  The cascade rules elaborated above result as follows:
1028 1028  
1029 -__DSD__
1112 +DSD
1030 1030  
1031 1031  ~1. Constrained by eliminating code 001 from the code list for the AGE Dimension.
1032 1032  
1033 -(% class="wikigeneratedid" id="HDataflowCENSUS_CUBE1" %)
1034 -__Dataflow CENSUS_CUBE1__
1116 +=== Dataflow CENSUS_CUBE1 ===
1035 1035  
1036 1036  1. Constrained by restricting the code list for the AGE Dimension to codes 002 and 003 (note that this is a more restrictive constraint than that declared for the DSD which specifies all codes except code 001).
1037 1037  1. Restricts the CAS codes to 003 and 004.
1038 1038  
1039 -(% class="wikigeneratedid" id="HDataflowCENSUS_CUBE2" %)
1040 -__Dataflow CENSUS_CUBE2__
1121 +=== Dataflow CENSUS_CUBE2 ===
1041 1041  
1042 1042  1. Restricts the code list for the CAS Dimension to codes TOT and NAP.
1043 1043  1. Inherits the AGE constraint applied at the level of the DSD.
1044 1044  
1045 -(% class="wikigeneratedid" id="HProvisionAgreementsCENSUS_CUBE1_IT" %)
1046 -__Provision Agreements CENSUS_CUBE1_IT__
1126 +=== Provision Agreements CENSUS_CUBE1_IT ===
1047 1047  
1048 1048  1. Restricts the codes for the GEO Dimension to IT and its children.
1049 1049  1. Inherits the constraints from Dataflow CENSUS_CUBE1  for the AGE and CAS Dimensions.
1050 1050  
1051 -(% class="wikigeneratedid" id="HProvisionAgreementsCENSUS_CUBE2_IT" %)
1052 -__Provision Agreements CENSUS_CUBE2_IT__
1131 +=== Provision Agreements CENSUS_CUBE2_IT ===
1053 1053  
1054 1054  1. Restricts the codes for the GEO Dimension to IT and its children.
1055 1055  1. Inherits the constraints from Dataflow CENSUS_CUBE2 for the CAS Dimension.
... ... @@ -1057,20 +1057,17 @@
1057 1057  
1058 1058  The constraints are defined as follows:
1059 1059  
1060 -(% class="wikigeneratedid" id="HDSDConstraint" %)
1061 -__DSD Constraint__
1139 +=== DSD Constraint ===
1062 1062  
1063 1063  [[image:1747836776698-720.jpeg]]
1064 1064  
1065 -(% class="wikigeneratedid" id="HDataflowConstraints" %)
1066 -__Dataflow Constraints__
1143 +=== Dataflow Constraints ===
1067 1067  
1068 1068  [[image:1747836776701-360.jpeg]]
1069 1069  
1070 1070  [[image:1747836776707-834.jpeg]]
1071 1071  
1072 -(% class="wikigeneratedid" id="HProvisionAgreementConstraint" %)
1073 -__Provision Agreement Constraint__
1149 +=== Provision Agreement Constraint ===
1074 1074  
1075 1075  [[image:1747836776710-262.jpeg]]
1076 1076  
... ... @@ -1082,7 +1082,7 @@
1082 1082  
1083 1083  == 9.2 Groups and Dimension Groups ==
1084 1084  
1085 -=== 9.2.1 Issue ===
1161 +=== 9.2.1 Issue ===
1086 1086  
1087 1087  Version 2.1 introduces a more granular mechanism for specifying the relationship between a Data Attribute and the Dimensions to which the attribute applies. The technical construct for this is the Dimension Group. This Dimension Group has no direct equivalent in versions 2.0 and 1.0 and so the application transforming data from a version 2.1 data set to a version 2.0 or version 1.0 data set must decide to which construct the attribute value, whose Attribute is declared in a Dimension Group, should be attached. The closest construct is the “Series” attachment level and in many cases this is the correct construct to use.
1088 1088  
... ... @@ -1095,7 +1095,7 @@
1095 1095  
1096 1096  If the conditions defined in 9.2.1 are true then on conversion to a version 2.0 or 1.0 DSD (Key Family) the Component/Attribute.attachmentLevel must be set to “Group” and the “Component/Attribute/AttachmentGroup” is used to identify the Group. Note that under rule (1) in 9.2.1 this group will have been defined in the V 2.1 DSD and so will be present in the V 2.0 transformation.
1097 1097  
1098 -=== 9.2.3 Data ===
1174 +=== 9.2.3 Data ===
1099 1099  
1100 1100  If the conditions defined in 9.2.1 are true then, on conversion from a 2.1 data set to a 2.0 or 1.0 dataset, the attribute value will be placed in the relevant <Group>. If these conditions are not true then the attribute value will be placed in the <Series>.
1101 1101  
... ... @@ -1107,7 +1107,7 @@
1107 1107  
1108 1108  == 10.1 Introduction ==
1109 1109  
1110 -The Validation and Transformation Language (VTL) supports the definition of Transformations, which are algorithms to calculate new data starting from already existing ones[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[4~]^^>>path:#_ftn4]](%%). The purpose of the VTL in the SDMX context is to enable the:
1186 +The Validation and Transformation Language (VTL) supports the definition of Transformations, which are algorithms to calculate new data starting from already existing ones[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[4~]^^>>path:#_ftn4]](%%). The purpose of the VTL in the SDMX context is to enable the:
1111 1111  
1112 1112  * definition of validation and transformation algorithms, in order to specify how to calculate new data  from existing ones;
1113 1113  * exchange of the definition of VTL algorithms, also together with the definition of the data structures of the involved data (for example, exchange the data structures of a reporting framework together with the validation rules to be applied, exchange the input and output data structures of a calculation task together with the VTL Transformations describing the calculation algorithms);
... ... @@ -1129,8 +1129,10 @@
1129 1129  
1130 1130  The alias of a SDMX artefact can be its URN (Uniform Resource Name), an abbreviation of its URN or another user-defined name.
1131 1131  
1132 -In any case, the aliases used in the VTL transformations have to be mapped to the SDMX artefacts through the VtlMappingScheme and VtlMapping classes (see the section of the SDMX IM relevant to the VTL). A VtlMapping allows specifying the aliases to be used in the VTL transformations, rulesets[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[5~]^^>>path:#_ftn5]](%%) or user defined operators[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[6~]^^>>path:#_ftn6]](%%)  to reference SDMX artefacts. A VtlMappingScheme is a container for zero or more VtlMapping. 
1208 +In any case, the aliases used in the VTL transformations have to be mapped to the SDMX artefacts through the VtlMappingScheme and VtlMapping classes (see the section of the SDMX IM relevant to the VTL). A VtlMapping allows specifying the aliases to be used in the VTL transformations, rulesets[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[5~]^^>>path:#_ftn5]](%%) or user defined operators[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[6~]^^>>path:#_ftn6]](%%) to reference SDMX artefacts. A VtlMappingScheme is a container for zero or more VtlMapping.
1133 1133  
1134 1134  The correspondence between an alias and a SDMX artefact must be one-to-one, meaning that a generic alias  identifies one and just one SDMX artefact while a SDMX artefact is identified by one and just one alias. In other words, within a VtlMappingScheme an artefact can have just one alias and different artefacts cannot have the same alias.
1135 1135  
1136 1136  The references through the URN and the abbreviated URN are described in the following paragraphs.
... ... @@ -1139,7 +1139,7 @@
1139 1139  
1140 1140  This approach has the advantage that in the VTL code the URN of the referenced artefacts is directly intelligible by a human reader but has the drawback that the references are verbose.
1141 1141  
1142 -The SDMX URN[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[7~]^^>>path:#_ftn7]](%%) is the concatenation of the following parts, separated by special symbols like dot, equal, asterisk, comma, and parenthesis:^^ ^^
1220 +The SDMX URN[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[7~]^^>>path:#_ftn7]](%%) is the concatenation of the following parts, separated by special symbols like dot, equal, asterisk, comma, and parenthesis:^^ ^^
1143 1143  
1144 1144  * SDMXprefix                                                                                   
1145 1145  * SDMX-IM-package-name             
... ... @@ -1147,7 +1147,7 @@
1147 1147  * agency-id                                                                          
1148 1148  * maintainedobject-id
1149 1149  * maintainedobject-version
1150 -* container-object-id [[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[8~]^^>>path:#_ftn8]]
1228 +* container-object-id [[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[8~]^^>>path:#_ftn8]]
1151 1151  * object-id
1152 1152  
1153 1153  The generic structure of the URN is the following:
... ... @@ -1166,13 +1166,13 @@
1166 1166  
1167 1167  The **agency-id** is the acronym of the agency that owns the definition of the artefact (for example, for the Eurostat artefacts the agency-id is “ESTAT”). The agency-id can be composite (for example AgencyA.Dept1.Unit2).
1168 1168  
1169 -The **maintainedobject-id** is the name of the maintained object which the artefact belongs to, and in case the artefact itself is maintainable[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[9~]^^>>path:#_ftn9]](%%), coincides with the name of the artefact. Therefore the maintainedobject-id depends on the class of the artefact:
1247 +The **maintainedobject-id** is the name of the maintained object which the artefact belongs to, and in case the artefact itself is maintainable[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[9~]^^>>path:#_ftn9]](%%), coincides with the name of the artefact. Therefore the maintainedobject-id depends on the class of the artefact:
1170 1170  
1171 -* if the artefact is a Dataflow, which is a maintainable class,  the maintainedobject-id is the Dataflow name (dataflow-id);
1172 -* if the artefact is a Dimension, MeasureDimension, TimeDimension, PrimaryMeasure or DataAttribute, which are not maintainable and belong to the DataStructure maintainable class, the maintainedobject-id is the name of the DataStructure (dataStructure-id) which the artefact belongs to;
1173 -* if the artefact is a Concept, which is not maintainable and belongs to the ConceptScheme maintainable class, the maintainedobject-id is the name of the ConceptScheme (conceptScheme-id) which the artefact belongs to;
1174 -* if the artefact is a ConceptScheme, which is a maintainable class, the maintainedobject-id is the name of the ConceptScheme (conceptScheme-id);
1175 -* if the artefact is a Codelist, which is a maintainable class,  the maintainedobject-id is the Codelist name (codelist-id).
1249 +* if the artefact is a ,,Dataflow,,, which is a maintainable class,  the maintainedobject-id is the Dataflow name (dataflow-id);
1250 +* if the artefact is a Dimension, MeasureDimension, TimeDimension, PrimaryMeasure or DataAttribute, which are not maintainable and belong to the ,,DataStructure,, maintainable class, the maintainedobject-id is the name of the DataStructure (dataStructure-id) which the artefact belongs to;
1251 +* if the artefact is a ,,Concept,,, which is not maintainable and belongs to the ConceptScheme maintainable class, the maintainedobject-id is the name of the ConceptScheme (conceptScheme-id) which the artefact belongs to;
1252 +* if the artefact is a ,,ConceptScheme,,, which is a maintainable class, the maintainedobject-id is the name of the ConceptScheme (conceptScheme-id);
1253 +* if the artefact is a ,,Codelist,,, which is a maintainable class, the maintainedobject-id is the Codelist name (codelist-id).
1176 1176  
1177 1177  The **maintainedobject-version** is the version of the maintained object which the artefact belongs to (for example, possible versions are 1.0, 2.1, 3.1.2).
1178 1178  
... ... @@ -1186,7 +1186,7 @@
1186 1186  
1187 1187  * if the artefact is a ,,Concept ,,(the object-id is the name of the ,,Concept,,)
1188 1188  
1189 -For example, by using the URN, the VTL transformation that sums two SDMX dataflows DF1 and DF2 and assigns the result to a third persistent dataflow DFR, assuming that DF1, DF2  and  DFR are the maintainedobject-id of the three dataflows, that their version is 1.0 and their Agency is AG, would be written as[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[10~]^^>>path:#_ftn10]](%%):
1267 +For example, by using the URN, the VTL transformation that sums two SDMX dataflows DF1 and DF2 and assigns the result to a third persistent dataflow DFR, assuming that DF1, DF2  and  DFR are the maintainedobject-id of the three dataflows, that their version is 1.0 and their Agency is AG, would be written as[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[10~]^^>>path:#_ftn10]](%%):
1190 1190  
1191 1191  ‘urn:sdmx:org.sdmx.infomodel.datastructure.Dataflow=AG:DFR(1.0)’  <-
1192 1192  
... ... @@ -1204,14 +1204,14 @@
1204 1204  * The **SDMX-IM-package-name **can be omitted as well because it can be deduced from the class-name that follows it (the table of the SDMX-IM packages and classes that allows this deduction is in the SDMX 2.1 Standards - Section 5 -  Registry Specifications, paragraph 6.2.3). In particular, considering the object classes of the artefacts that VTL can reference, the package is: 
1205 1205  ** “datastructure” for the classes Dataflow, Dimension, MeasureDimension, TimeDimension, PrimaryMeasure, DataAttribute,  
1206 1206  ** “conceptscheme” for the classes Concept and ConceptScheme
1206 1206  ** “codelist” for the class Codelist.
1207 -* The **class-name** can be omitted as it can be deduced from the VTL invocation.  In particular, starting from the VTL class of the invoked artefact (e.g. dataset, component, identifier, measure, attribute, variable, valuedomain),  which is known given the syntax of the invoking VTL operator[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[11~]^^>>path:#_ftn11]](%%), the SDMX class can be deduced from the mapping rules between VTL and SDMX (see the section “Mapping between VTL and SDMX” hereinafter)[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[12~]^^>>path:#_ftn12]](%%).
1208 -* If the **agency-id** is not specified, it is assumed by default equal to the agency-id of the TransformationScheme, UserDefinedOperatorScheme or RulesetScheme from which the artefact is invoked. For example, the agency-id can be omitted if it is the same as the invoking T,,ransformationScheme,, and cannot be omitted if the artefact comes from another agency.[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[13~]^^>>path:#_ftn13]](%%)  Take also into account that, according to the VTL consistency rules, the agency of the result of a ,,Transformation,, must be the same as its ,,TransformationScheme,,, therefore the agency-id can be omitted for all the results (left part of ,,Transformation,, statements).
1285 +* The **class-name** can be omitted as it can be deduced from the VTL invocation.  In particular, starting from the VTL class of the invoked artefact (e.g. dataset, component, identifier, measure, attribute, variable, valuedomain),  which is known given the syntax of the invoking VTL operator[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[11~]^^>>path:#_ftn11]](%%), the SDMX class can be deduced from the mapping rules between VTL and SDMX (see the section “Mapping between VTL and SDMX” hereinafter)[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[12~]^^>>path:#_ftn12]](%%).
1286 +* If the **agency-id** is not specified, it is assumed by default equal to the agency-id of the TransformationScheme, UserDefinedOperatorScheme or RulesetScheme from which the artefact is invoked. For example, the agency-id can be omitted if it is the same as the invoking T,,ransformationScheme,, and cannot be omitted if the artefact comes from another agency.[[(% class="wikiinternallink wikiinternallink wikiinternallink wikiinternallink wikiinternallink" %)^^~[13~]^^>>path:#_ftn13]](%%)  Take also into account that, according to the VTL consistency rules, the agency of the result of a ,,Transformation,, must be the same as its ,,TransformationScheme,,, therefore the agency-id can be omitted for all the results (left part of ,,Transformation,, statements).
1209 1209  * As for the **maintainedobject-id**, this is essential in some cases while in other cases it can be omitted:
1209 1209  ** if the referenced artefact is a ,,Dataflow,,, which is a maintainable class, the maintainedobject-id is the dataflow-id and obviously cannot be omitted;
1210 1210  ** if the referenced artefact is a Dimension, MeasureDimension, TimeDimension, PrimaryMeasure, DataAttribute, which are not maintainable and belong to the ,,DataStructure,, maintainable class, the maintainedobject-id is the dataStructure-id and can be omitted, given that these components are always invoked within the invocation of a ,,Dataflow,,, whose dataStructure-id can be deduced from the SDMX structural definitions;
1212 1212  ** if the referenced artefact is a ,,Concept,,, which is not maintainable and belongs to the ,,ConceptScheme,, maintainable class, the maintained object is the conceptScheme-id and cannot be omitted;
1213 1213  
1214 -*
1292 +*
1215 1215  ** if the referenced artefact is a ,,ConceptScheme,,, which is a maintainable class, the maintained object is the ,,conceptScheme-id,, and obviously cannot be omitted;
1216 1216  ** if the referenced artefact is a ,,Codelist,,, which is a maintainable class, the maintainedobject-id is the ,,codelist-id,, and obviously cannot be omitted.
1217 1217  * When the maintainedobject-id is omitted, the **maintainedobject-version** is omitted too. When the maintainedobject-id is not omitted and the maintainedobject-version is omitted, the version 1.0 is assumed by default.
... ... @@ -1232,11 +1232,11 @@
1232 1232  

Thanks to these conventions, for example, a ,,Transformation,, that sums two ,,Dataflows,, of the same agency can be written in abbreviated form simply as:

DFR := DF1 + DF2

The references to the ,,Codelists,, can be simplified similarly. For example, given the non-abbreviated reference to the ,,Codelist,, AG:CL_FREQ(1.0), which is[[(% class="wikiinternallink" %)^^~[14~]^^>>path:#_ftn14]](%%):

‘urn:sdmx:org.sdmx.infomodel.codelist.Codelist=AG:CL_FREQ(1.0)’

if the ,,Codelist,, is referenced from a ruleset scheme belonging to the agency AG, omitting all the optional parts, the abbreviated reference would become simply[[(% class="wikiinternallink" %)^^~[15~]^^>>path:#_ftn15]](%%):

CL_FREQ

The references to the ,,Components,, can likewise be abbreviated to the component-id only, e.g.:

SECTOR

For example, the transformation for renaming the component SECTOR of the dataflow DF1 into SEC can be written as[[(% class="wikiinternallink" %)^^~[16~]^^>>path:#_ftn16]](%%):

‘DFR(1.0)’ := ‘DF1(1.0)’ [rename SECTOR to SEC]

The VTL Rulesets have a signature, in which the Value Domains or the Variables on which the Ruleset is defined are declared, and a body, which contains the rules.

In the signature, given the mapping between VTL and SDMX better described in the following paragraphs, a reference to a VTL Value Domain becomes a reference to a SDMX Codelist or to a SDMX ConceptScheme (for SDMX measure dimensions), while a reference to a VTL Represented Variable becomes a reference to a SDMX Concept, assuming for it a definite representation[[(% class="wikiinternallink" %)^^~[17~]^^>>path:#_ftn17]](%%).

In general, for referencing SDMX Codelists and Concepts, the conventions described in the previous paragraphs apply. In the Ruleset syntax, the elements that reference SDMX artefacts are called “valueDomain” and “variable” for the Datapoint Rulesets and “ruleValueDomain”, “ruleVariable”, “condValueDomain” and “condVariable” for the Hierarchical Rulesets. The syntax of the Ruleset signature also allows aliases of the elements above to be defined; these aliases are valid only within the specific ruleset definition statement and cannot be mapped to SDMX.[[(% class="wikiinternallink" %)^^~[18~]^^>>path:#_ftn18]](%%)

In the body of the Rulesets, the Codes and in general all the Values can be written without any other specification, because the artefact to which the Values refer (Codelist, ConceptScheme, Concept) can be deduced from the Ruleset signature.
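
As a purely illustrative sketch (the ruleset name, the Codelists CL_SECTOR and CL_OBS_STATUS and the codes are invented; the exact grammar is the one given in the VTL 2.0 Reference Manual), a Datapoint Ruleset whose signature references SDMX Codelists through the valueDomain elements could look like:

define datapoint ruleset DPR_SECTOR (valuedomain CL_SECTOR as sec, CL_OBS_STATUS as status) is

when sec = "S13" then status <> "M" errorcode "status missing for government sector"

end datapoint ruleset

Here CL_SECTOR and CL_OBS_STATUS would be resolved as references to SDMX Codelists according to the conventions described above, while the codes "S13" and "M" in the body need no further qualification because the signature already identifies the Codelists they belong to.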

Every time a SDMX object is referenced in a VTL Transformation as an input operand, there is the need to generate a VTL definition of the object, so that the VTL operations can take place. This can be done starting from the SDMX definition and applying a SDMX-VTL mapping method in the direction from SDMX to VTL. The possible mapping methods from SDMX to VTL are described in the following paragraphs and are conceived to allow the automatic deduction of the VTL definition of the object from the knowledge of the SDMX definition.

In the opposite direction, every time an object calculated by means of VTL must be treated as a SDMX object (for example for exchanging it through SDMX), there is the need for a SDMX definition of the object, so that the SDMX operations can take place. The SDMX definition is needed for the VTL objects for which a SDMX use is envisaged[[(% class="wikiinternallink" %)^^~[19~]^^>>path:#_ftn19]](%%).

The mapping methods from VTL to SDMX are described in the following paragraphs as well; however, they do not allow the complete SDMX definition to be deduced automatically from the VTL definition, mainly because the former typically contains additional information with respect to the latter. For example, the definition of a SDMX DSD also includes some mandatory information not available in VTL (like the concept scheme to which the SDMX components refer, the assignmentStatus and attributeRelationship for the DataAttributes and so on). Therefore the mapping methods from VTL to SDMX provide only general guidance for generating SDMX definitions properly starting from the information available in VTL, independently of how the SDMX definition is actually generated (manually, automatically or a combination of the two).

=== 10.3.2 General mapping of VTL and SDMX data structures ===

This section makes reference to the VTL “Model for data and their structure”[[(% class="wikiinternallink" %)^^~[20~]^^>>path:#_ftn20]](%%) and the corresponding SDMX “Data Structure Definition”[[(% class="wikiinternallink" %)^^~[21~]^^>>path:#_ftn21]](%%).

The main type of artefact that the VTL can manipulate is the VTL Data Set, which in general is mapped to the SDMX Dataflow. This means that a VTL Transformation, in the SDMX context, expresses the algorithm for calculating a derived Dataflow starting from some already existing Dataflows (either collected or derived).[[(% class="wikiinternallink" %)^^~[22~]^^>>path:#_ftn22]](%%)

While the VTL Transformations are defined in terms of Dataflow definitions, they are assumed to be executed on instances of such Dataflows, provided at runtime to the VTL engine (the mechanisms for identifying the instances to be processed are not part of the VTL specifications and depend on the implementation of the VTL-based systems). As already said, the SDMX Datasets are instances of SDMX Dataflows, therefore a VTL Transformation defined on some SDMX Dataflows can be applied to the corresponding SDMX Datasets.

SDMX DimensionComponent can be a Dimension, a TimeDimension or a MeasureDimension. Correspondingly, in the SDMX implementation of the VTL, the VTL Identifiers can be (optionally) distinguished into three sub-classes (Simple Identifier, Time Identifier, Measure Identifier), even if such a distinction is not made explicit in the VTL IM.

However, a VTL Data Structure can have any number of Identifiers, Measures and Attributes, while a SDMX 2.1 DataStructureDefinition can have any number of Dimensions and DataAttributes but just one PrimaryMeasure[[(% class="wikiinternallink" %)^^~[23~]^^>>path:#_ftn23]](%%). This is due to a difference between SDMX 2.1 and VTL in the possible representation methods of the data that contain more measures.

As for SDMX, because the data structure cannot contain more than one measure component (i.e. the PrimaryMeasure), the representation of data having more measures is possible only by means of a particular dimension, called MeasureDimension, which is aimed at containing the names of the measure concepts, so that for each observation the value contained in the PrimaryMeasure component is the value of the measure concept reported in the MeasureDimension component.

In particular, with the Pivot mapping method (from SDMX to VTL):

* The set of SDMX observations having the same values for all the Dimensions except the MeasureDimension becomes one multi-measure VTL Data Point, having one Measure for each Concept Cj of the SDMX MeasureDimension (see the illustrative sketch after this list);
** The values of the SDMX simple Dimensions, TimeDimension and DataAttributes not depending on the MeasureDimension (these components by definition have always the same values for all the observations of the set above) become the values of the corresponding VTL (simple) Identifiers, (time) Identifier and Attributes.
** The value of the PrimaryMeasure of the SDMX observation belonging to the set above and having MeasureDimension=Cj becomes the value of the VTL Measure Cj.
** For the SDMX DataAttributes depending on the MeasureDimension, the value of the DataAttribute DA of the SDMX observation belonging to the set above and having MeasureDimension=Cj becomes the value of the VTL Attribute DA_Cj.
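
As a purely illustrative sketch (the dimension, concept and code names and the figures are invented), consider two SDMX observations that share the values of all the Dimensions except the MeasureDimension MEASURE, whose Concepts are POP and GDP:

|FREQ|REF_AREA|MEASURE|OBS_VALUE
|A|IT|POP|59.0
|A|IT|GDP|1785.4

With the mapping just described they become a single multi-measure VTL Data Point:

|FREQ|REF_AREA|POP|GDP
|A|IT|59.0|1785.4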

**10.3.3.3 From SDMX DataAttributes to VTL Measures**

In some cases it may happen that the DataAttributes of the SDMX DataStructure need to be managed as Measures in VTL. Therefore, a variant of both the methods above consists in transforming all the SDMX DataAttributes into VTL Measures. When DataAttributes are converted to Measures, the two methods above are called Basic_A2M and Pivot_A2M (the suffix “A2M” stands for Attributes to Measures). Obviously, the resulting VTL data structure is, in general, multi-measure and does not contain Attributes.

The Basic_A2M and Pivot_A2M methods behave respectively like the Basic and Pivot methods, except that the final VTL components, which according to the Basic and Pivot methods would have had the role of Attribute, assume instead the role of Measure.
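
For instance, with Basic_A2M a SDMX DataAttribute such as OBS_STATUS (the attribute name is just an example) would appear in the resulting VTL data structure as a Measure named OBS_STATUS rather than as an Attribute.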

This mapping method cannot be applied for SDMX 2.1 if the VTL data structure has more than one measure component, given that the SDMX 2.1 DataStructureDefinition allows just one measure component (the PrimaryMeasure). In this case it becomes mandatory to specify a different mapping method through the VtlMappingScheme and VtlDataflowMapping classes.[[(% class="wikiinternallink" %)^^~[24~]^^>>path:#_ftn24]](%%)

Please note that the VTL measures can have any name, while in SDMX 2.1 the MeasureComponent has the mandatory name “obs_value”; therefore the name of the VTL measure must become “obs_value” in SDMX 2.1.
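
For instance (purely illustrative component names), a VTL data set having the Identifiers TIME_PERIOD and REF_AREA and the single Measure POP could be mapped with this method to a DataStructureDefinition having the Dimensions TIME_PERIOD and REF_AREA and the PrimaryMeasure, the VTL measure POP being renamed to “obs_value” as noted above.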

* Each multi-measure VTL Data Point becomes a set of SDMX observations, one for each VTL Measure Cj; for this set:
** the values of the VTL Identifiers become the values of the corresponding SDMX Dimensions, for all the observations of the set above;
** the name of the j^^th^^ VTL measure (e.g. “Cj”) becomes the value of the SDMX MeasureDimension of the j^^th^^ observation of the set (i.e. the Concept Cj);
** the value of the j^^th^^ VTL measure becomes the value of the SDMX PrimaryMeasure of the j^^th^^ observation of the set;
** the values of the VTL Attributes become the values of the corresponding SDMX DataAttributes (in principle for all the observations of the set above); an illustrative reading of this mapping is given after this list.
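
Reading the illustrative sketch given for the Pivot method in the opposite direction shows this mapping: the single VTL Data Point with the two Measures POP and GDP would become two SDMX observations, one with MeasureDimension = POP and PrimaryMeasure = 59.0 and one with MeasureDimension = GDP and PrimaryMeasure = 1785.4 (the names and figures are the same invented ones used there).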

The VtlMappingScheme is a container for zero or more VtlDataflowMappings (besides possible mappings to artefacts other than dataflows).

=== 10.3.6 Mapping dataflow subsets to distinct VTL data sets[[(% class="wikiinternallink" %)^^**~[25~]**^^>>path:#_ftn25]](%%) ===

Until now it has been assumed that one SDMX Dataflow is mapped to one VTL data set and vice-versa. This one-to-one mapping is not mandatory according to VTL, because a VTL data set is meant to be a set of observations (data points) on a logical plane, having the same logical data structure and the same general meaning, independently of the possible physical representation or storage (see VTL 2.0 User Manual, page 24); therefore a SDMX Dataflow can be seen either as a unique set of data observations (corresponding to one VTL data set) or as the union of many sets of data observations (each one corresponding to a distinct VTL data set).

As a matter of fact, in some cases it can be useful to define VTL operations involving definite parts of a SDMX Dataflow instead of the whole.[[(% class="wikiinternallink" %)^^~[26~]^^>>path:#_ftn26]](%%)

Therefore, in order to make the coding of VTL operations simpler when applied to parts of SDMX Dataflows, it is allowed to map distinct parts of a SDMX Dataflow to distinct VTL data sets according to the following rules and conventions. This kind of mapping is possible both from SDMX to VTL and from VTL to SDMX, as better explained below.[[(% class="wikiinternallink" %)^^~[27~]^^>>path:#_ftn27]](%%)

Given a SDMX Dataflow and some predefined Dimensions of its structure, each combination of values of those Dimensions can be mapped to a distinct VTL data set, containing the observations of the Dataflow that have that combination of values.

In practice, this kind of mapping is obtained as follows:

* For a given SDMX dataflow, the user (VTL definer) declares the dimension components on which the mapping will be based, in a given order.[[(% class="wikiinternallink" %)^^~[28~]^^>>path:#_ftn28]](%%) Following the example above, imagine that the user declares the dimensions INDICATOR and COUNTRY.
* The VTL dataset is given a name using a special notation, also called “ordered concatenation”, composed of the following parts:
** the reference to the SDMX dataflow (expressed according to the rules described in the previous paragraphs, i.e. URN, abbreviated URN or another alias), for example DF(1.0);
** a slash (“/”) as a separator;[[(% class="wikiinternallink" %)^^~[29~]^^>>path:#_ftn29]](%%)
** the reference to a specific part of the SDMX dataflow above, expressed as the concatenation of the values that the SDMX dimensions declared above must have, separated by dots (“.”) and written in the order in which these dimensions are defined[[(% class="wikiinternallink" %)^^~[30~]^^>>path:#_ftn30]](%%). For example, POPULATION.USA would mean that such a VTL dataset is mapped to the SDMX observations for which the dimension //INDICATOR// is equal to POPULATION and the dimension //COUNTRY// is equal to USA.

In the VTL transformations, this kind of dataset name must be referenced between single quotes because the slash (“/”) is not a regular character according to the VTL rules.
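
Purely as an illustration (the dataflow, dimension and value names are invented), such names can then be referenced like any other data set in a Transformation, for example:

DS_r := ‘DF1(1.0)/POPULATION.USA’ + ‘DF1(1.0)/POPULATION.CAN’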

Let us now analyse the different meaning of this kind of mapping in the two mapping directions, i.e. from SDMX to VTL and from VTL to SDMX.

As already said, the mapping from SDMX to VTL happens when the VTL datasets are operands of VTL transformations, while the mapping from VTL to SDMX happens when the VTL datasets are results of VTL transformations[[(% class="wikiinternallink" %)^^~[31~]^^>>path:#_ftn31]](%%) and need to be treated as SDMX objects. This kind of mapping can be applied independently in the two directions, and the Dimensions on which the mapping is based can be different in the two directions: these Dimensions are defined in the ToVtlSpaceKey and in the FromVtlSpaceKey classes respectively.

First, let us see what happens in the mapping direction from SDMX to VTL, i.e. when parts of a SDMX dataflow (e.g. DF1(1.0)) need to be mapped to distinct VTL datasets that are operands of some VTL transformations.

Each dataset name of the kind ‘DF1(1.0)///INDICATORvalue//.//COUNTRYvalue//’ identifies the part of DF1(1.0) whose observations have INDICATOR = //INDICATORvalue// and COUNTRY = //COUNTRYvalue//. For example, the VTL dataset ‘DF1(1.0)/POPULATION.USA’ would contain all the observations of DF1(1.0) having INDICATOR = POPULATION and COUNTRY = USA.

In order to obtain the data structure of these VTL datasets from the SDMX one, it is assumed that the SDMX dimensions on which the mapping is based are dropped, i.e. not maintained in the VTL data structure; this is possible because their values are fixed for each one of the invoked VTL datasets[[(% class="wikiinternallink" %)^^~[32~]^^>>path:#_ftn32]](%%). After that, the mapping method from SDMX to VTL specified for the dataflow DF1(1.0) is applied (i.e. basic, pivot, …).

In the example above, for all the datasets of the kind ‘DF1(1.0)///INDICATORvalue//.//COUNTRYvalue//’ …

In fact the VTL operator “sub” has exactly the same behaviour. Therefore, mapping different parts of a SDMX dataflow to different VTL datasets in the direction from SDMX to VTL through the ordered concatenation notation is equivalent to a proper use of the operator “**sub**” on such a dataflow.[[(% class="wikiinternallink" %)^^~[33~]^^>>path:#_ftn33]](%%)
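
As an illustration of this equivalence (operand names as in the example above; the exact syntax of “sub” is the one given in the VTL 2.0 Reference Manual), invoking ‘DF1(1.0)/POPULATION.USA’ as an operand corresponds to writing something like:

DF1 [sub INDICATOR = "POPULATION", COUNTRY = "USA"]

which also drops the INDICATOR and COUNTRY Identifiers from the resulting data structure.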

In the direction from SDMX to VTL it is allowed to omit the value of one or more Dimensions on which the mapping is based, but maintaining all the separating dots (therefore it may happen to find two or more consecutive dots, and dots at the beginning or at the end). The absence of a value means that for the corresponding Dimension all the values are kept and the Dimension is not dropped.
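
For example, the name ‘DF1(1.0)/POPULATION.’ (note the trailing dot) would identify the VTL dataset containing the observations of DF1(1.0) having INDICATOR = POPULATION for all the countries; in this case only INDICATOR is dropped from the VTL data structure, while COUNTRY is kept.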

For example, let us assume that the VTL programmer wants to calculate the SDMX dataflow DF2(1.0) having the Dimensions TIME_PERIOD, INDICATOR and COUNTRY, and that such a programmer finds it convenient to calculate separately the parts of DF2(1.0) that have different combinations of values for INDICATOR and COUNTRY:

* each part is calculated as a VTL derived dataset, result of a dedicated VTL transformation;[[(% class="wikiinternallink" %)^^~[34~]^^>>path:#_ftn34]](%%)
* the data structure of all these VTL datasets has the TIME_PERIOD identifier and does not have the INDICATOR and COUNTRY identifiers.[[(% class="wikiinternallink" %)^^~[35~]^^>>path:#_ftn35]](%%)

Under these hypotheses, such derived VTL datasets can be mapped to DF2(1.0) by declaring the Dimensions INDICATOR and COUNTRY as mapping dimensions[[(% class="wikiinternallink" %)^^~[36~]^^>>path:#_ftn36]](%%).

The corresponding VTL transformations, assuming that the result needs to be persistent, would be of this kind:[[(% class="wikiinternallink" %)^^~[37~]^^>>path:#_ftn37]](%%)

‘DF2(1.0)///INDICATORvalue//.//COUNTRYvalue//’  <-  expression

In other words, starting from the datasets explicitly calculated through VTL (in the example ‘DF2(1.0)/GDPPERCAPITA.USA’ and so on), the first step consists in calculating other (non-persistent) VTL datasets (in the example DF2bis_GDPPERCAPITA_USA and so on) by adding the identifiers INDICATOR and COUNTRY with the desired values (//INDICATORvalue// and //COUNTRYvalue//). Finally, all these non-persistent data sets are united and give the final result DF2(1.0)[[(% class="wikiinternallink" %)^^~[38~]^^>>path:#_ftn38]](%%), which can be mapped one-to-one to the homonymous SDMX dataflow having the dimension components TIME_PERIOD, INDICATOR and COUNTRY.

Therefore, mapping different VTL datasets having the same data structure to different parts of a SDMX dataflow, i.e. in the direction from VTL to SDMX, through the ordered concatenation notation is equivalent to a proper use of the operators “calc” and “union” on such datasets.[[(% class="wikiinternallink" %)^^~[39~]^^>>path:#_ftn39]](%%)[[(% class="wikiinternallink" %)^^~[40~]^^>>path:#_ftn40]](%%)

It is worth noting that in the direction from VTL to SDMX it is mandatory to specify the value for every Dimension on which the mapping is based (in other words, in the name of the calculated VTL dataset it is not possible to omit the value of any of the Dimensions).
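
For instance, ‘DF2(1.0)/GDPPERCAPITA.USA’ is an admissible name for a calculated VTL dataset, while ‘DF2(1.0)/.USA’ (with the INDICATOR value omitted) is not admissible in this direction.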

Domain) is not identifiable. As a consequence, the definition of the VTL rulesets, which in VTL can refer either to enumerated or to non-enumerated value domains, in SDMX can refer only to enumerated Value Domains (i.e. to SDMX Codelists).

As for the mapping between VTL variables and SDMX Concepts, it should be noted that these artefacts do not coincide perfectly. In fact, the VTL variables are represented variables, always defined on the same Value Domain (“Representation” in SDMX) independently of the data set / data structure in which they appear[[(% class="wikiinternallink" %)^^~[41~]^^>>path:#_ftn41]](%%), while the SDMX Concepts can have different Representations in different DataStructures.[[(% class="wikiinternallink" %)^^~[42~]^^>>path:#_ftn42]](%%) This means that one SDMX Concept can correspond to many VTL Variables, one for each representation the Concept has.

Therefore, it is important to be aware that some VTL operations (for example the binary operations at data set level) are consistent only if the components having the same names in the operated VTL data sets also have the same representation (i.e. the same Value Domain as for VTL). For example, it is possible to obtain correct results from the VTL expression

**Figure 14 – Mappings from SDMX data types to VTL Basic Scalar Types**

In case a different default conversion is desired, it can be achieved through the CustomTypeScheme and CustomType artefacts (see also the section Transformations and Expressions of the SDMX information model).

The custom output formats can be specified by means of the VTL formatting mask described in the section “Type Conversion and Formatting Mask” of the VTL Reference Manual. Such a section describes the masks for the VTL basic scalar types “number”, “integer”, “date”, “time”, “time_period” and “duration” and gives examples. As for the types “string” and “boolean”, the VTL conventions are extended with some other special characters, as described in the following table.

|(% colspan="2" %)**VTL special characters for the formatting masks**
|N|fixed number of digits used in the preceding textual representation of the month or the day

The default conversion, either standard or customized, can be used to deduce automatically the representation of the components of the result of a VTL transformation. Alternatively, the representation of the resulting SDMX Dataflow can be given explicitly by providing its DataStructureDefinition. In other words, the representation specified in the DSD, if available, overrides any default conversion[[(% class="wikiinternallink" %)^^~[43~]^^>>path:#_ftn43]](%%).

=== 10.4.5 Null Values ===

For implementing an SDMX compliant Web Service, the standardised WSDL file, which describes the expected request/response structure, should be used. The request message of the operation contains a wrapper element (e.g. “GetGenericData”) that wraps a tag called “GenericDataQuery”, which is the actual SDMX query XML message that contains the query to be processed by the Web Service. In the same way the response is formulated in a wrapper element “GetGenericDataResponse”.

As defined in the SOAP specification, the root element of a SOAP message is the Envelope, which contains an optional Header and a mandatory Body. These are illustrated below along with the Body contents according to the WSDL:

[[image:1747854006117-843.png]]

The problem that initiated the present analysis refers to the difference in the way SOAP requests are formed when trying to implement the aforementioned Web Service in the .NET framework.

Building such a Web Service using the .NET framework is done by exposing a method (i.e. the getGenericData in the example) with an XML document argument (let us name it “Query”). **The difference that appears in Microsoft .NET implementations is that there is a need for an extra XML container around the SDMX GenericDataQuery.** This is the expected behavior, since the framework is left to publish the Web Service automatically as a remote procedure call and thus wraps each parameter into an extra element. The .NET request is illustrated below:

[[image:1747854039499-443.png]]

[[image:1747854067769-691.png]]

Furthermore, this extra element is also inserted in the automatically generated WSDL from the framework. Therefore this particularity requires custom clients for the .NET Web Services, which is not an interoperable solution.

== 11.2 Solution ==

To understand how the **XmlAnyElement** attribute works, we present the following two web methods:

[[image:1747854096778-844.png]]

In this method the **input** parameter is decorated with the **XmlAnyElement** attribute. This is a hint that this parameter will be de-serialized from an **xsd:any** element. Since the attribute is not passed any parameters, it means that the entire XML element for this parameter in the SOAP message will be in the Infoset that is represented by this **XmlElement** parameter.

[[image:1747854127303-270.png]]

The difference between the two is that for the first method, **SubmitXml**, the XmlSerializer will expect an element named **input** to be an immediate child of the **SubmitXml** element in the SOAP body. The second method, **SubmitXmlAny**, will not care what the name of the child of the **SubmitXmlAny** element is. It will plug whatever XML is included into the input parameter. The message style from ASP.NET Help for the two methods is shown below. First we look at the message for the method without the **XmlAnyElement** attribute.

[[image:1747854163928-581.png]]

Now we look at the message for the method that uses the **XmlAnyElement** attribute.

[[image:1747854190641-364.png]]

[[image:1747854236732-512.png]]

The method decorated with the **XmlAnyElement** attribute has one fewer wrapping element. Only an element with the name of the method wraps what is passed to the **input** parameter.

For more information please consult: [[http:~~/~~/msdn.microsoft.com/en-us/library/aa480498.aspx>>http://msdn.microsoft.com/en-us/library/aa480498.aspx]]

Furthermore, at this point the problem with the different requests has been solved. However, there is still the difference in the produced WSDL that has to be taken care of. The automatically generated WSDL now doesn’t insert the extra element, but defines the content of the operation wrapper element as “xsd:any” type.

[[image:1747854286398-614.png]]

Without a common WSDL the solution still doesn’t enforce interoperability. In order to “fix” the WSDL, there are two approaches. The first is to intervene in the generation process. This is a complicated approach, compared to the second approach, which overrides the generation process and returns the envisioned WSDL for the SDMX Web Service.

In the context of the SDMX Web Service, applying the above solution translates into the following:

[[image:1747854385465-132.png]]

The SOAP request/response will then be as follows:

**GenericData Request**

[[image:1747854406014-782.png]]

**GenericData Response**

[[image:1747854424488-855.png]]

For overriding the automatically produced WSDL, in the Solution Explorer right-click the project and select “Add” -> “New item…”. Then select the “Global Application Class”. This will create an “.asax” class file in which the following code should replace the existing empty method:

[[image:1747854453895-524.png]]

[[image:1747854476631-125.png]]

The SDMX_WSDL.wsdl file should reside in the root directory of the application. After applying this solution the returned WSDL is the envisioned one. Thus the request message definition contains:

[[image:1747854493363-776.png]]

----

[[~[42~]>>path:#_ftnref42]] A Concept becomes a Component in a DataStructureDefinition, and Components can have different LocalRepresentations in different DataStructureDefinitions, also overriding the (possible) base representation of the Concept.

[[~[43~]>>path:#_ftnref43]] The representation given in the DSD should obviously be compatible with the VTL data type.

{{putFootnotes/}}