The Skills & Education Data Space needs to interconnect thousands of different databases and services. To achieve this, the data space proposes a trusted decentralised protocol and a series of technical building blocks that implement trusted data sharing.

The technical design of the Skills Data Space is based on key requirements identified in deliverable D3.1 – An interim report on requirements and design approaches, in desk research performed in spring 2023, and in feedback collected from three Professional Workshops held during summer and autumn 2023. The key outcomes were:

  1. The Skills Data Space should follow a human-centred approach, with a specific focus on consent, transparency and strong personal data protection.
  2. The Skills Data Space should enhance semantic and conceptual interoperability.
  3. The Skills Data Space should specifically address data quality during and between all the operations.
  4. The Skills Data Space should enable trusted, transparent and verified AI applications in all levels of the design.
  5. The Skills Data Space should be based on decentralized infrastructure.
  6. The Skills Data Space architecture and implementation should be open to new solutions and applications that meet criteria 1-5.

The human-centred approach of the Skills Data Space technical design is strongly related to the Personal Data Intermediary (PDI) Building Blocks (see section 6.4). PDIs are connected to the three technical pillars of the DSSC blueprint, 'Data Interoperability', 'Data Sovereignty & Trust' and 'Data Value Creation', which are discussed in more detail in the sections below. A key aspect of human centricity is users' trust in the ecosystem.
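To make the human-centred role of a PDI more concrete, the following minimal sketch models a consent record that a PDI might hold on behalf of a person. All class and field names are hypothetical and not defined by the blueprint; the sketch only illustrates the consent, transparency and revocation aspects named in requirement 1.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch: class and field names are illustrative, not part of the blueprint.
@dataclass
class ConsentRecord:
    """A minimal consent entry a Personal Data Intermediary (PDI) might keep."""
    data_subject: str           # the person granting consent
    data_consumer: str          # the organisation receiving the data
    purpose: str                # human-readable purpose of the sharing
    data_categories: list[str]  # e.g. ["skills", "credentials"]
    granted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    revoked_at: datetime | None = None

    def is_active(self) -> bool:
        """Consent remains valid only until the data subject revokes it."""
        return self.revoked_at is None

    def revoke(self) -> None:
        """The data subject can withdraw consent at any time (transparency and control)."""
        self.revoked_at = datetime.now(timezone.utc)
```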

The Skills Data Space connects organizational actors such as employers, education providers, staffing offices, digital platforms, public services, public administrations and entrepreneurs, as well as personal/identity actors such as employees, students and job seekers. Most likely, only a few of these actors share the same data structures, which creates the need for semantic and conceptual interoperability, as sketched below.
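The sketch below illustrates this interoperability need under invented assumptions: the two source record structures, their field names and the shared target structure are purely illustrative examples of how divergent actor-specific skill records could be normalised.

```python
# Illustrative sketch only: the source schemas and field names are hypothetical.
# Two actors describe the same skill with different data structures.
employer_record = {"skillName": "Python programming", "level": "advanced"}
education_record = {"competence": {"label": "Python programming"}, "eqf_level": 6}

def to_shared_skill(record: dict) -> dict:
    """Normalise divergent actor-specific records into one shared representation."""
    if "skillName" in record:   # employer-style record
        return {"label": record["skillName"], "proficiency": record["level"]}
    if "competence" in record:  # education-provider-style record
        return {"label": record["competence"]["label"], "proficiency": record["eqf_level"]}
    raise ValueError("Unknown skill record structure")

print(to_shared_skill(employer_record))
print(to_shared_skill(education_record))
```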

In the following sections, semantic interoperability refers to cases where machines can operate across different data structures and/or vocabularies, while conceptual interoperability refers to cases where the meaning itself is made interoperable. For example, a skill such as “ability to mount the containers” must be used in the right context, either logistics or computing, but never by mixing these contexts.
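The conceptual side of this distinction can be shown in code. The following hedged sketch resolves the ambiguous skill label from the example above to a different concept depending on the declared context; the concept identifiers and context labels are invented for illustration only.

```python
# Hypothetical sketch: concept identifiers and context labels are invented for illustration.
AMBIGUOUS_SKILLS = {
    "ability to mount the containers": {
        "logistics": "concept:load-shipping-containers",
        "computing": "concept:mount-container-images",
    }
}

def resolve_skill(label: str, context: str) -> str:
    """Map an ambiguous skill label to a context-specific concept, never mixing contexts."""
    concepts = AMBIGUOUS_SKILLS.get(label.lower())
    if concepts is None:
        raise KeyError(f"No concept mapping for skill label: {label!r}")
    if context not in concepts:
        raise ValueError(f"Skill {label!r} is not defined in the {context!r} context")
    return concepts[context]

print(resolve_skill("ability to mount the containers", "logistics"))  # logistics concept
print(resolve_skill("ability to mount the containers", "computing"))  # computing concept
```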

At the time of writing this blueprint (September 2023), advances in language models (large and small) show considerable potential for overcoming several semantic and conceptual interoperability challenges. Even though there is no dedicated Building Block for language models, they are addressed in this blueprint because they are one of the most promising areas for its further development.
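As a hedged illustration of how a language model could support semantic interoperability, the sketch below uses the open-source sentence-transformers library to match free-text skill phrases from one vocabulary against another by embedding similarity. The library, the model name and the example phrases are assumptions made for this sketch; the blueprint does not prescribe any particular model or tooling.

```python
# Illustrative sketch: library and model choice are example assumptions, not prescribed by the blueprint.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose embedding model

# Skill phrases from two actors using different vocabularies.
vocabulary_a = ["operate forklift trucks", "deploy containerised applications"]
vocabulary_b = ["fork-lift operation", "run software in containers", "load shipping containers"]

emb_a = model.encode(vocabulary_a, convert_to_tensor=True)
emb_b = model.encode(vocabulary_b, convert_to_tensor=True)

# For each phrase in vocabulary A, find the closest phrase in vocabulary B.
similarities = util.cos_sim(emb_a, emb_b)
for i, phrase in enumerate(vocabulary_a):
    best = int(similarities[i].argmax())
    print(f"{phrase!r} ~ {vocabulary_b[best]!r} (score {float(similarities[i][best]):.2f})")
```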
