Learning Records are available in many formats, either standardized (xAPI, SCORM, IMS Caliper, cmi5) or proprietary (Google Classroom, MS Teams, CSV, etc.). This wide variety of formats is a barrier to many use cases of learning records, as it prevents datasets of learning records from multiple sources or organizations from being easily combined and shared.
As a result, Inokufu was tasked, within the Prometheus-X ecosystem, with developing the Learning Records Converter (LRC), a parser that translates datasets of learning traces according to common xAPI profiles.
The LRC implements a streamlined two-phase conversion process, which ensures that the input data is correctly interpreted, transformed, augmented, and validated to produce compliant JSON outputs. The first phase converts a Learning Record from various input formats into a single xAPI format. The second phase transforms the xAPI learning records to ensure that they comply with the xAPI DASES profiles.
Here is an architecture diagram illustrating how the LRC parses an input Learning Record in various standard, custom or unknown formats into an output Learning Record conforming to a DASES xAPI profile.
The aim of this first phase is to convert a Learning Record to xAPI. To do this, we set up two consecutive processes: Input Data Validation and Data Transformation.
This component's role is to identify the format of the input Learning Records and to validate that they are well-formed.
Each dataset of Learning Records has a metadata attribute stating the `input-format` of the learning records. If the `input-format` is known, the corresponding data descriptor is loaded to validate that the Learning Records are compliant. Otherwise, every data descriptor is loaded and used in turn to try to interpret the learning records.
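As a rough illustration of this flow (all names here are hypothetical, not the repository's actual API), the validation step can be thought of as:

```python
from typing import Protocol

class DataDescriptor(Protocol):
    """Hypothetical descriptor interface; the real descriptors live in the repository."""
    def matches(self, record: dict) -> bool: ...
    def validate(self, record: dict) -> None: ...

def detect_and_validate(
    record: dict,
    input_format: str | None,
    descriptors: dict[str, DataDescriptor],
) -> str:
    """Return the confirmed input format of a learning record."""
    if input_format and input_format in descriptors:
        # Known format: validate against the matching descriptor only.
        descriptors[input_format].validate(record)
        return input_format
    # Unknown format: try every descriptor until one accepts the record.
    for name, descriptor in descriptors.items():
        if descriptor.matches(record):
            return name
    raise ValueError("No data descriptor could interpret this learning record")
```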
This component's role is to convert the validated input data into the xAPI format.
Depending on the `input-format` of the learning records dataset, the processing differs as follows:
- If the `input-format` is `xapi` (cmi5 is considered an xAPI profile), the conversion is skipped.
- If the `input-format` is a standard (`scorm`, `ims_caliper`), the corresponding mapping is used by the component to process the learning record. For each standard, there is a dedicated mapper that formats the learning record into xAPI.
- If the `input-format` is unknown, the Data Transformation module does its best to automatically map each item of the learning record into the xAPI format (a schematic sketch follows below).

The first phase of the LRC is built with community collaboration in mind. It allows for easy contributions and extensions to both the input and output formats. The community can develop and share their own data descriptors and converters, which can be seamlessly integrated into the LRC's ecosystem, enhancing the application's versatility in handling various input and output formats.
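Schematically, the format dispatch described above might look like the following sketch (function and variable names are hypothetical):

```python
from collections.abc import Callable

Mapper = Callable[[dict], dict]

def to_xapi(record: dict, input_format: str, mappers: dict[str, Mapper]) -> dict:
    """Route a validated learning record to the right conversion path."""
    if input_format == "xapi":
        # cmi5 is treated as an xAPI profile, so no conversion is needed.
        return record
    if input_format in mappers:
        # Standard format (e.g. "scorm", "ims_caliper"): use its dedicated mapper.
        return mappers[input_format](record)
    # Unknown format: fall back to best-effort automatic field mapping.
    return auto_map(record)

def auto_map(record: dict) -> dict:
    """Placeholder for the automatic mapping logic described above."""
    raise NotImplementedError
```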
Here is a detailed architecture diagram illustrating the first phase of the LRC, to parse an input Learning Record in various standard, custom or unknown formats into an output Learning Record according to the xAPI standard.
These two consecutive processes can be summarized by this flow chart.
The aim of this second phase is to transform the xAPI Learning Record according to DASES xAPI profiles.
Each profile is defined in a JSON-LD file and includes specific concepts, extensions, and statement templates that guide the transformation and validation of xAPI statements.
The DASES profiles in JSON-LD format are automatically downloaded and updated from their respective GitHub repositories, as defined in the `.env` file.
The LRC enriches xAPI statements with profile-specific data, validates statements against profile rules, and gives recommendations for improving compliance with the profiles.
This ensures that the converted learning records are not just in xAPI format, but also adhere to the specific DASES profile standards, enhancing interoperability and consistency across different learning systems.
In particular, the enrichment step sets or checks the following fields:

- `verb.id`: set to the appropriate verb URI (e.g., `https://w3id.org/xapi/netc/verbs/accessed` for accessing a page)
- `verb.display.en-US`: human-readable description of the verb
- `object.definition.type`: set to the appropriate activity type URI
- `context.contextActivities.category`: includes a reference to the associated profile

The LRC currently supports these main profiles:

- LMS
- Forum
- Assessment
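For illustration, an xAPI statement enriched along these lines might look as follows. Only the verb URI comes from the example above; the actor, object, and profile URIs are invented placeholders:

```json
{
  "actor": { "mbox": "mailto:learner@example.com" },
  "verb": {
    "id": "https://w3id.org/xapi/netc/verbs/accessed",
    "display": { "en-US": "accessed" }
  },
  "object": {
    "id": "https://lms.example.com/course/42/page/7",
    "definition": { "type": "http://activitystrea.ms/schema/1.0/page" }
  },
  "context": {
    "contextActivities": {
      "category": [ { "id": "https://w3id.org/xapi/lms" } ]
    }
  }
}
```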
Clone the repository:

```bash
git clone [repository_url]
cd [project_directory]
```

Install pipenv if you haven't already:

```bash
pip install pipenv
```

Install the project dependencies:

```bash
pipenv install
```

Set up environment variables: create a `.env` file in the project root by copying `.env.default`:

```bash
cp .env.default .env
```

You can then modify the variables in `.env` as needed.

Start the FastAPI server using the script defined in the Pipfile:

```bash
pipenv run start
```

The API will be available at `http://localhost:8000`.
To convert a trace, send a POST request to the `/convert` endpoint:

```http
POST /convert
Content-Type: application/json

{
  "input_trace": {
    // Your input trace data here
  },
  "input_format": "<input_format>"
}
```
Supported input formats:

- `xapi` (cmi5 traces are handled as an xAPI profile)
- `scorm`
- `ims_caliper`
Response format:

```json
{
  "output_trace": {
    // Converted xAPI trace data
  },
  "recommendations": [
    {
      "rule": "presence",
      "path": "$.result.completion",
      "expected": "included",
      "actual": "missing"
    }
  ],
  "meta": {
    "input_format": "<input_format>",
    "output_format": "<output_format>",
    "profile": "<DASES profile found>" // Optional, present when a DASES profile is detected
  }
}
```
The meta object contains essential information about the conversion process.
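For example, once the server is running, the endpoint can be called from Python (the trace content below is a placeholder, not a valid record):

```python
import requests

response = requests.post(
    "http://localhost:8000/convert",
    json={
        "input_trace": {"actor": "...", "verb": "...", "object": "..."},
        "input_format": "scorm",
    },
    timeout=10,
)
response.raise_for_status()
result = response.json()
print(result["output_trace"])       # the converted xAPI statement
print(result["recommendations"])    # profile compliance recommendations, if any
```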
The `/convert_custom` endpoint allows for flexible conversion of custom data formats using mapping files:

```http
POST /convert_custom
Content-Type: multipart/form-data

data_file: <your_data_file>
mapping_file: <your_mapping_file>
config: { // Optional
  "encoding": "utf-8",
  "delimiter": ",",
  "quotechar": "\"",
  "escapechar": "\\",
  "doublequote": true,
  "skipinitialspace": true,
  "lineterminator": "\r\n",
  "quoting": "QUOTE_MINIMAL"
}
output_format: "xAPI" (default)
```
The endpoint supports:
Example mapping file structure:
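The structure below is purely a hypothetical sketch, not the actual schema (see the mapping documentation linked further down); conceptually, a mapping file pairs input fields with xAPI output paths:

```json
{
  "default_values": {
    "verb.id": "https://w3id.org/xapi/netc/verbs/accessed"
  },
  "mappings": [
    { "input_field": "user_email", "output_field": "actor.mbox" },
    { "input_field": "course_url", "output_field": "object.id" },
    { "input_field": "event_time", "output_field": "timestamp" }
  ]
}
```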
The endpoint will:
Send a POST request to the `/validate` endpoint:

```http
POST /validate
Content-Type: application/json

{
  "input_trace": {
    // Your input trace data here
  },
  "input_format": "<input_format>" // Optional
}
```
Response format:

```json
{
  "input_format": "<detected_or_confirmed_format>"
}
```
Once the server is running, you can access the interactive API documentation:

- Swagger UI: `/docs`
- ReDoc: `/redoc`
These interfaces provide detailed information about all available endpoints, request/response schemas, and allow you to test the API directly from your browser.
The project uses Ruff for linting and formatting. Ruff is configured in `pyproject.toml` with strict settings.
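The exact rule selection lives in the repository's `pyproject.toml`; as a hypothetical illustration, a strict Ruff setup might look like:

```toml
[tool.ruff]
line-length = 88
target-version = "py312"

[tool.ruff.lint]
select = ["ALL"]            # enable every rule group...
ignore = ["D203", "D213"]   # ...then resolve the incompatible docstring pair
```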
To understand how mapping works or to create your own mapping, a document is available here.
An explanation of how the project is organised is available here.
The following table details the environment variables used in the project:
| Variable | Description | Required | Default Value | Possible Values |
|---|---|---|---|---|
| ENVIRONMENT | Affects error handling and logging throughout the application | No | development | development, production |
| LOG_LEVEL | Minimum logging level for the application | No | info | debug, info, warning, error, critical |
| DOWNLOAD_TIMEOUT | Timeout for downloading profiles (in seconds) | No | 10 | Any positive integer |
| CORS_ALLOWED_ORIGINS | Allowed origins for CORS | No | * | Comma-separated list of origins, or * for all |
| PROFILES_BASE_PATH | Base path for storing profile files | Yes | data/dases_profiles | Any valid directory path |
| PROFILES_NAMES | Names of the profiles to be used | Yes | lms,forum,assessment | Comma-separated list of profile names |
| PROFILE_LMS_URL | URL of the LMS profile JSON-LD file | Yes | https://raw.githubusercontent.com/gaia-x-dases/xapi-lms/master/profile/profile.jsonld | Any valid URL |
| PROFILE_FORUM_URL | URL of the Forum profile JSON-LD file | Yes | https://raw.githubusercontent.com/gaia-x-dases/xapi-forum/master/profile/base.jsonld | Any valid URL |
| PROFILE_ASSESSMENT_URL | URL of the Assessment profile JSON-LD file | Yes | https://raw.githubusercontent.com/gaia-x-dases/xapi-assessment/add-mandatory-statements/profile/profile.jsonld | Any valid URL |
Note: The URLs for the profiles are examples and may change. Always use the most up-to-date URLs for your project.
Refer to `.env.default` for a complete list of configurable environment variables and their default values.
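For instance, a development `.env` assembled from the defaults in the table above:

```
ENVIRONMENT=development
LOG_LEVEL=info
DOWNLOAD_TIMEOUT=10
CORS_ALLOWED_ORIGINS=*
PROFILES_BASE_PATH=data/dases_profiles
PROFILES_NAMES=lms,forum,assessment
PROFILE_LMS_URL=https://raw.githubusercontent.com/gaia-x-dases/xapi-lms/master/profile/profile.jsonld
PROFILE_FORUM_URL=https://raw.githubusercontent.com/gaia-x-dases/xapi-forum/master/profile/base.jsonld
PROFILE_ASSESSMENT_URL=https://raw.githubusercontent.com/gaia-x-dases/xapi-assessment/add-mandatory-statements/profile/profile.jsonld
```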
The API uses standard HTTP status codes:
| Status Code | Description | Possible Causes |
|---|---|---|
| 400 | Bad Request | Invalid input format, malformed JSON |
| 404 | Not Found | Invalid endpoint, resource doesn't exist (profile file) |
| 422 | Validation Error | Format validation failed |
| 500 | Internal Server Error | Server-side processing error |
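Error responses follow FastAPI's standard shape: a JSON body whose `detail` field carries the message. For example (the message text is illustrative):

```json
{
  "detail": "Invalid input format: 'unknown_format'"
}
```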
Notes:

- Error responses include a `detail` field with a human-readable message.

We welcome and appreciate contributions from the community! There are two ways to contribute to this project:

- Submit a pull request to the `master` branch.

Before submitting your pull request, please ensure that your code follows our coding and documentation standards. Don't forget to include tests for your changes!
Please note this project is work in progress.
As preparatory work for the development of the Learning Records Converter, Inokufu conducted an exhaustive state-of-the-art and quantitative study on the interoperability of learning records.
This study is available here.

- https://gaia-x.eu/gaia-x-framework/
- https://dataspace.prometheus-x.org/building-blocks/interoperability/learning-records