
NAP Delivery scenarios

Delivery scenario: NAP as registry only

In this scenario, data delivery takes place directly between each operator and each service provider. Each //TODO

Functional responsibilities and dataflows

|                    | CPO | DataPortal | ServiceProvider |
|--------------------|-----|------------|-----------------|
| Publish OCPI       | x   |            |                 |
| Convert DATEX II   | x   |            |                 |
| Aggregate datasets |     |            | x               |

(figure: image3)

System functions required in the Dataportal to support this scenario

Delivery scenarios with NAP as data portal

Operators providing DATEX II, merging done by service provider

Functional responsibilities and dataflows

|                    | CPO | DataPortal | ServiceProvider |
|--------------------|-----|------------|-----------------|
| Publish OCPI       | x   |            |                 |
| Convert DATEX II   | x   |            |                 |
| Aggregate datasets |     |            | x               |

(figure: image4)

Required Actions in the Dataportal to support this scenario

Action 0: preparations

  1. Take inventory of whether the available data meets the IDACS requirements.
  2. If applicable: assess whether the data fulfills national requirements.
  3. Create a DATEX II profile if national obligations require additional data fields in DATEX II.
  4. Define the mapping of incoming information elements to DATEX II where additional information is available.
  5. Define validation rules for consistency between static and dynamic dataflows.
  6. Develop the conversion tooling.
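The consistency rules in step 5 can be made concrete. A minimal sketch in Python, assuming flat dict records with illustrative field names (`id`, `location_id`, `status`) rather than full OCPI or DATEX II payloads:

```python
# Sketch of a consistency check between a static location dataset and a
# dynamic status feed (Action 0, step 5). Field names and status values
# are illustrative assumptions, not an OCPI or DATEX II schema.

ALLOWED_STATUSES = {"AVAILABLE", "OCCUPIED", "OUT_OF_ORDER", "UNKNOWN"}

def validate_consistency(static_locations, dynamic_statuses):
    """Return a list of human-readable validation errors (empty = consistent)."""
    errors = []
    known_ids = {loc["id"] for loc in static_locations}
    for record in dynamic_statuses:
        # Every dynamic status must reference a location in the static dataset.
        if record["location_id"] not in known_ids:
            errors.append(f"status for unknown location {record['location_id']}")
        # Status values must come from the agreed vocabulary.
        if record["status"] not in ALLOWED_STATUSES:
            errors.append(f"unexpected status value {record['status']!r}")
    return errors
```

A rule set like this runs on every delivery in Action 2, so defining it once up front keeps the static and dynamic chains aligned.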

Action 1: set up the datachain

  1. Register credentials for accessing the CPO data source, including:
    • datatype (location or status)
    • delivery type (push or pull)
    • update interval
  2. Define an endpoint per CPO.
  3. Provide the endpoint information per CPO to be registered in the metadata catalogue.
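The registration in step 1 could be captured in a record like the following sketch; all class and field names are illustrative assumptions, not a NAP schema:

```python
# Sketch of the per-CPO registration record the data portal could keep
# (Action 1). The catalogue view deliberately excludes the credential.
from dataclasses import dataclass

@dataclass
class CpoRegistration:
    cpo_id: str
    datatype: str           # "location" (static) or "status" (dynamic)
    delivery_type: str      # "push" or "pull"
    update_interval_s: int  # agreed update interval in seconds
    endpoint: str           # endpoint to be listed in the metadata catalogue
    api_key: str            # access credential (store securely in practice)

    def catalogue_entry(self) -> dict:
        """Metadata-catalogue view of this registration, without the credential."""
        return {
            "cpo_id": self.cpo_id,
            "datatype": self.datatype,
            "delivery_type": self.delivery_type,
            "update_interval_s": self.update_interval_s,
            "endpoint": self.endpoint,
        }
```

Separating the catalogue view from the stored record keeps credentials out of the public metadata while publishing everything a service provider needs.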

Action 2: run the datachain

  1. Receive data according to the set parameters.
  2. Validate consistency between the static and dynamic datasets in line with the consistency checks defined in Action 0, step 5.
  3. Convert data according to the mapping defined in Action 0, step 4.
  4. Provide access credentials to service providers.
  5. Monitor the timeliness of delivery.

Operators providing DATEX II, merging done by data portal

Functional responsibilities

|                    | CPO | DataPortal | ServiceProvider |
|--------------------|-----|------------|-----------------|
| Publish OCPI       | x   |            |                 |
| Convert DATEX II   | x   |            |                 |
| Aggregate datasets |     | x          |                 |

(figure: image5)

Required Actions in the Dataportal to support this scenario

Action 0: preparations

  1. Take inventory of whether the available data meets the IDACS requirements.
  2. If applicable: assess whether the data fulfills national requirements.
  3. Create a DATEX II profile if national obligations require additional data fields in DATEX II.
  4. Define the mapping of incoming information elements to DATEX II where additional information is available.
  5. Define validation rules for consistency between static and dynamic dataflows.
  6. Develop the aggregation tooling.

Action 1: set up the datachain

  1. Register credentials for accessing the CPO data source, including:
    • datatype (location or status)
    • delivery type (push or pull)
    • update interval
  2. Define an endpoint for the aggregated data sources at the NAP.
  3. Provide the endpoint information of the NAP data portal, including the data available per CPO, to be registered in the metadata catalogue.

Action 2: run the datachain

  1. Receive data according to the set parameters.
  2. Validate consistency between the static and dynamic datasets in line with the consistency checks defined in Action 0, step 5.
  3. Aggregate the data to one endpoint.
  4. Provide access credentials to service providers.
  5. Monitor the timeliness of delivery.
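The aggregation in step 3 can be sketched as a merge of the per-CPO datasets into one list served from a single endpoint; records are assumed to be dicts with a unique `id`, which is an illustrative simplification of the DATEX II payloads:

```python
# Sketch of the aggregation step (Action 2, step 3): merge per-CPO datasets
# into one dataset published from a single NAP endpoint. Deduplication by
# record id is an assumption; real feeds should not overlap, but tagging the
# source CPO keeps provenance visible either way.

def aggregate(per_cpo_datasets):
    """per_cpo_datasets: {cpo_id: [record, ...]} -> one merged record list.
    On a duplicate id, the record from the later cpo_id (sorted order) wins."""
    merged = {}
    for cpo_id, records in sorted(per_cpo_datasets.items()):
        for record in records:
            merged[record["id"]] = {**record, "source_cpo": cpo_id}
    return list(merged.values())
```

Keeping the `source_cpo` tag on every merged record makes the consistency and timeliness checks traceable back to the responsible operator.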

Operators providing OCPI, merging and conversion done by data portal

Functional responsibilities

|                    | CPO | DataPortal | ServiceProvider |
|--------------------|-----|------------|-----------------|
| Publish OCPI       | x   |            |                 |
| Convert DATEX II   |     | x          |                 |
| Aggregate datasets |     | x          |                 |

(figure: image6)

Required Actions in the Dataportal to support this scenario

Action 0: preparations

  1. Take inventory of whether the available data meets the IDACS requirements.
  2. If applicable: assess whether the data fulfills national requirements.
  3. Create a DATEX II profile if national obligations require additional data fields in DATEX II.
  4. Define the mapping of incoming information elements to DATEX II where additional information is available.
  5. Define validation rules for consistency between static and dynamic dataflows.
  6. Develop the conversion tooling.
  7. Develop the aggregation tooling.

Action 1: set up the datachain

  1. Register credentials for accessing the CPO data source, including:
    • datatype (location or status)
    • delivery type (push or pull)
    • update interval
  2. Define an endpoint for the aggregated data sources at the NAP.
  3. Provide the endpoint information of the NAP data portal, including the data available per CPO, to be registered in the metadata catalogue.

Action 2: run the datachain

  1. Receive data according to the set parameters.
  2. Validate consistency between the static and dynamic datasets in line with the consistency checks defined in Action 0, step 5.
  3. Convert data according to the mapping defined in Action 0, step 4.
  4. Aggregate the incoming data to one endpoint.
  5. Provide access credentials to service providers.
  6. Monitor the timeliness of delivery.
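The conversion in step 3 amounts to applying the mapping defined in Action 0, step 4, to every incoming record. A minimal sketch, assuming flat records and an illustrative mapping table (real DATEX II output is XML conforming to the defined profile, not flat keys):

```python
# Sketch of the conversion step: apply a declared OCPI-to-DATEX II field
# mapping to one incoming record. The mapping entries below are illustrative
# assumptions; the real mapping comes out of Action 0, step 4.

OCPI_TO_DATEX = {
    "id": "refillPointId",
    "name": "refillPointName",
    "coordinates": "pointCoordinates",
    "status": "refillPointStatus",
}

def convert_record(ocpi_record, mapping=OCPI_TO_DATEX):
    """Return the DATEX II-style view of one record. Unmapped fields are kept
    under an 'unmapped' key so gaps in the profile stay visible during
    validation instead of being silently dropped."""
    out, unmapped = {}, {}
    for key, value in ocpi_record.items():
        if key in mapping:
            out[mapping[key]] = value
        else:
            unmapped[key] = value
    if unmapped:
        out["unmapped"] = unmapped
    return out
```

Driving the converter from a declared mapping table, rather than hard-coded field logic, lets the profile from Action 0 evolve without touching the tooling.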

Hybrid data provision by operators, merging and conversion done by data portal

Functional responsibilities

|                    | CPO | DataPortal | ServiceProvider |
|--------------------|-----|------------|-----------------|
| Publish OCPI       | x   |            |                 |
| Convert DATEX II   | x   | x          |                 |
| Aggregate datasets |     | x          |                 |

(figure: image7)

Required Actions in the Dataportal to support this scenario

Action 0: preparations

  1. Take inventory of whether the available data meets the IDACS requirements.
  2. If applicable: assess whether the data fulfills national requirements.
  3. Create a DATEX II profile if national obligations require additional data fields in DATEX II.
  4. Define the mapping of incoming information elements to DATEX II where additional information is available.
  5. Define validation rules for consistency between static and dynamic dataflows.
  6. Develop the conversion tooling.
  7. Develop the aggregation tooling, capable of receiving both native DATEX II from CPOs and internally converted datasets.

Action 1: set up the datachain

  1. Register credentials for accessing the CPO data source, including:
    • datatype (location or status)
    • delivery type (push or pull)
    • update interval
  2. Define an endpoint for the aggregated data sources at the NAP.
  3. Provide the endpoint information of the NAP data portal, including the data available per CPO, to be registered in the metadata catalogue.

Action 2: run the datachain

  1. Receive data according to the set parameters.
  2. Validate consistency between the static and dynamic datasets in line with the consistency checks defined in Action 0, step 5.
  3. Convert OCPI data according to the mapping defined in Action 0, step 4.
  4. Aggregate natively provided DATEX II data with the converted datasets and publish.
  5. Provide access credentials to service providers.
  6. Monitor the timeliness of delivery.
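The hybrid chain above can be sketched end to end: native DATEX II feeds bypass conversion, OCPI feeds are converted first, and everything is aggregated for publication from one endpoint. The format tags and the converter hook are illustrative assumptions:

```python
# Sketch of the hybrid datachain (Action 2): feeds tagged as native DATEX II
# go straight to aggregation; feeds tagged as OCPI pass through the converter
# built in Action 0. All structure here is illustrative.

def run_hybrid_chain(feeds, convert):
    """feeds: [{'cpo_id': ..., 'format': 'datex2' or 'ocpi', 'records': [...]}].
    convert: callable turning one OCPI record into its DATEX II counterpart.
    Returns one aggregated record list ready to publish from a single endpoint."""
    aggregated = []
    for feed in feeds:
        records = feed["records"]
        if feed["format"] == "ocpi":
            records = [convert(r) for r in records]  # step 3: convert OCPI data
        aggregated.extend(                            # step 4: aggregate and tag
            {**r, "source_cpo": feed["cpo_id"]} for r in records
        )
    return aggregated
```

Because only the format tag decides the route, a CPO can switch from OCPI delivery to native DATEX II without changes to the rest of the chain.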