This page is part of the Data Access Framework (v1.6.0: STU 2 Ballot 1) based on FHIR v1.6.0. For a full list of available versions, see the Directory of published versions.
The DAF-Research IG describes four capabilities, C1 through C4, each of which is intended to help improve the data infrastructure for PCORnet and, in the larger context, a Learning Health System. This part of the IG provides additional guidance for the implementation of each of the four capabilities. This section is not normative and is only intended to provide guidance to implementers.
Implementing the C1 capability involves four steps.
Note: The Data Source actor can be implemented natively by a system such as an EMR, or alternately as a layer (an additional software module) on top of an EMR system. In either case the implementation has to meet the Data Source conformance requirements. The next few paragraphs provide details for each step above.
The Data Source actor has to support the creation of a DAF-Task resource instance. This can be achieved through the FHIR API using the POST operation, or through a graphical user interface that allows an end user to create the Task instance. The Task instance has to have the following data:
This Task instance would then be persisted for execution. The actual execution of the task can be controlled by a scheduled timer or a manual kick-off. Note: if the task is set up to repeat at a regular frequency, this step can be skipped after the first time.
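As a rough sketch, the Task creation step above might look as follows. The base URL and the schedule extension URL are illustrative assumptions, not defined by this IG; element names follow the core FHIR Task resource.

```python
import json

# Illustrative server base URL -- substitute your Data Source's endpoint.
FHIR_BASE = "https://datasource.example.org/fhir"

def build_extraction_task(description, schedule_expr=None):
    """Build a minimal Task payload for a data-extraction request.

    The extension URL used for the repeat schedule is hypothetical;
    a real implementation would use the URL defined by its profile."""
    task = {
        "resourceType": "Task",
        "status": "draft",  # advances through the Task state machine once persisted
        "description": description,
    }
    if schedule_expr:
        task["extension"] = [{
            "url": "http://example.org/fhir/StructureDefinition/task-schedule",
            "valueString": schedule_expr,  # e.g. a cron expression
        }]
    return task

task = build_extraction_task("Nightly extract for PCORnet CDM load", "0 2 * * *")
payload = json.dumps(task)
# The instance would then be persisted with an HTTP POST:
#   POST {FHIR_BASE}/Task   (Content-Type: application/fhir+json)
```

Whether execution is triggered by a timer or a manual kick-off, the persisted Task is the single record that tracks the extraction.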
The Task created in Step 1 is executed at some point in time, automatically or manually, and the following actions are expected to happen.
The PCORnet CDM is a consensus artifact that has been adopted by PCORnet as the model for Data Marts, which can then be queried by Researchers. Since this is a different data model from FHIR, the following guidance can be used to extract data so that the PCORnet CDM can be appropriately populated. However, data extraction programs have to be aware that vendors may support just DAF-Core, or a subset of DAF-Core, in their initial implementations and hence may not have all the PCORnet CDM data elements available.
As the tables show, a few new resources would need to be proposed and created for an effective mapping of the PCORnet CDM to FHIR and vice versa. These new resources will be proposed to the appropriate HL7 WGs based on pilot implementations and feedback. Similarly, required extensions will be proposed and added to the profiles after pilot implementations are completed. Profiles which are not DAF-Core are annotated accordingly in the table below.
PCORnet CDM Table Name | Recommended Profile for Data Extraction |
---|---|
DIAGNOSIS, CONDITION | Condition* |
LAB_RESULT_CM | DiagnosticReport-Results |
ENCOUNTER | Encounter* |
PRESCRIBING | MedicationOrder |
DISPENSING | MedicationDispense* |
LAB_RESULT_CM | Observation |
VITALS | Observation-Vitalsigns |
DEMOGRAPHIC | Patient* |
PROCEDURES | Procedure |
PRO_CM | Questionnaire Profile - TBD |
ENROLLMENT | Potential New Resource - TBD |
PCORNET_TRIAL | Potential New Resource - TBD |
DEATH | Potential New Resource/Profile - TBD |
DEATH_CAUSE | Potential New Resource/Profile - TBD |
HARVEST | New Resource - TBD |
Some PCORnet sites are using the OMOP model as a source or destination, and hence a mapping between FHIR and OMOP would be useful for these sites. The following mapping was developed by the DAF pilot sites and can be a starting point for the implementation of the C1 capability. Profiles which are not DAF-Core are annotated accordingly in the table below.
OMOP Table Name | Recommended Profile for Data Extraction |
---|---|
Concept,Vocabulary,Domain,Concept_Synonym,Concept_Ancestor | ValueSet ** |
Concept_Class | Concept ** |
Concept_Relationship, Relationship | ConceptMap ** |
Cohort_Definition, Attribute_Definition | Group ** |
Specimen | Specimen ** |
Drug_Strength | Medication |
Procedure_Occurrence | Procedure |
Drug_Exposure | MedicationOrder,MedicationStatement,Immunization |
Device_Exposure | Procedure,Device |
Measurement,Note,Observation | Observation |
Person | Patient* |
Observation_Period, Visit_Occurrence | Encounter* |
Condition_Occurrence | Condition* |
** Base FHIR resources without any specific profiles; * DAF-Research specific profiles
The Data Mart actor has to support the creation of a DAF-Task resource instance. This can be achieved through the FHIR API using the POST operation, or through a graphical user interface that allows an end user to create the Task instance. The Task instance has to have the following data:
This Task instance would then be persisted for execution. The actual execution of the task can be controlled by a scheduled timer or a manual kick-off. Note: if the task is set up to repeat at a regular frequency, this step can be skipped after the first time.
A Bundle returned from Step 2 will conform to FHIR and DAF-Core or other specific IG requirements. This Bundle may have to go through additional transformations, mappings, and other processing before it is loaded into a destination Data Mart. One of these processing steps is de-identifying the data.
It is expected that most vendors supporting the ONC 2015 Edition CCDS APIs or the Patient/$everything operation will return identifiable patient information as part of the API. Since PCORnet requires de-identified data, de-identification has to be performed subsequently. Implementations can choose internally approved mechanisms for de-identifying the data and populating the PCORnet CDM.
One of the value propositions of standardizing the data extract is eliminating custom mappings from each Data Source. As long as a Data Source has performed the right mapping to its FHIR resources and profiles, the extracted data can be directly mapped to a destination model of choice, such as the PCORnet CDM. The following mapping of FHIR to the PCORnet CDM was developed by DAF working with the PCORnet community and data experts. This mapping can be followed to load the appropriate tables within the PCORnet CDM.
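A loader following the mapping table above might route extracted resources to CDM tables roughly as sketched below. The routing dictionary reflects only the resource-to-table pairs listed in that table; the column-level transformation and the helper names are illustrative, not part of the IG.

```python
# Illustrative routing of extracted FHIR resources to PCORnet CDM tables,
# based on the mapping table above. Column-level transformation is
# site-specific and omitted.
PROFILE_TO_CDM_TABLE = {
    "Patient": "DEMOGRAPHIC",
    "Encounter": "ENCOUNTER",
    "Condition": "DIAGNOSIS",        # also CONDITION, depending on source
    "MedicationOrder": "PRESCRIBING",
    "MedicationDispense": "DISPENSING",
    "Procedure": "PROCEDURES",
    "Observation": "LAB_RESULT_CM",  # vital-signs observations route to VITALS
}

def route_entry(resource):
    """Return the target CDM table for one Bundle entry, or None."""
    rtype = resource.get("resourceType")
    if rtype == "Observation":
        # Vital-signs observations are distinguished by their category code.
        for coding in resource.get("category", {}).get("coding", []):
            if coding.get("code") == "vital-signs":
                return "VITALS"
    return PROFILE_TO_CDM_TABLE.get(rtype)
```

The point of the sketch is the design choice the paragraph describes: the per-source mapping work happens once, into FHIR profiles, and the FHIR-to-CDM routing is then shared across all Data Sources.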
For systems loading data between OMOP and FHIR, the following mapping developed by the DAF pilots can be used.
Using the above mapping, the task to load the data would be executed as follows.
Implementing the C2 capability involves three steps.
The next few paragraphs will provide details for each step above.
The Data Mart has to instantiate a Conformance resource instance to declare the characteristics that help a Researcher compose queries. In addition, the Conformance resource should also help a Data Mart administrator manage the data within the Data Mart. The Conformance resource declares the various profiles, operations, and other specifics about the implementation. For the DAF Data Mart actor, the following data is expected to be present within the DAF-Conformance resource instance.
Conformance.rest.mode - Populate with “server”
For each operation that is supported by the server, a DAF-OperationDefinition instance should be created with the appropriate data, and Conformance.rest.operation.definition should then point to the instance that has been created.
The following extensions should be populated for the Conformance resource instance:

* PCORnet Data Mart Active Flag - This indicates if the Data Mart is still active and is accepting queries.
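A minimal sketch of such a Conformance instance is shown below. The extension URL for the Data Mart active flag is hypothetical (the DAF-Conformance profile defines the real one), and the operation reference is illustrative.

```python
# Minimal sketch of a Data Mart Conformance instance: rest.mode set to
# "server", one declared operation, and a hypothetical extension URL
# standing in for the PCORnet Data Mart Active Flag.
conformance = {
    "resourceType": "Conformance",
    "rest": [{
        "mode": "server",  # Conformance.rest.mode
        "operation": [{
            "name": "execute-query",
            # Reference to a DAF-OperationDefinition instance
            "definition": {"reference": "OperationDefinition/daf-execute-query"},
        }],
    }],
    "extension": [{
        # Hypothetical URL -- use the one defined by the DAF-Conformance profile.
        "url": "http://example.org/fhir/StructureDefinition/datamart-active",
        "valueBoolean": True,
    }],
}
```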
The DAF-OperationDefinition profile has been created to help servers declare conformance to the various DAF-Research operations. To declare support for an operation, an implementation would create an instance of DAF-OperationDefinition and then point to it from the Conformance.rest.operation element of the Conformance resource.
The following data elements are expected to be populated for each DAF-OperationDefinition that is instantiated.
The following extensions have to be populated as part of the OperationDefinition:
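For illustration, a DAF-OperationDefinition instance might start from the core elements below; only base OperationDefinition elements are shown, and the DAF-specific extensions mentioned above would be added per the profile. The operation code shown is an assumption based on the daf-execute-query operation discussed later in this IG.

```python
# Sketch of an OperationDefinition instance declaring support for a
# DAF-Research operation. Core elements only; DAF-specific extensions
# would be appended per the DAF-OperationDefinition profile.
operation_def = {
    "resourceType": "OperationDefinition",
    "name": "DAF Execute Query",
    "status": "active",
    "kind": "operation",         # as opposed to a named query
    "code": "execute-query",     # the name used on the wire: $execute-query
    "system": True,              # invoked at the system (server) level
    "instance": False,           # not invoked on a single resource instance
}
```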
Once published, the Conformance resource is updated less frequently than clinical resources. However, updates to the Conformance resource will be performed due to changes in the following data elements:
The Conformance resource, just like other FHIR resources, can be queried by researchers.
Conformance resources should be available for querying without requiring additional authorization.
The Conformance resource will be published at the well-known FHIR URL
Implementing the C3 capability involves two steps.
In PCORnet and most research environments, queries submitted to access data are asynchronous in nature, are repeated frequently, and may involve humans in the workflow performing approvals, rejections, etc. To support these requirements, a Task is instantiated. In order to track Tasks across multiple Data Marts and states, the following Task hierarchy is implemented.
A Task (known as the Root Task) would be created based on the query composed by the Researcher. For each Data Mart that the query will be sent to, a new Task instance (the Data Mart specific Task) would be created using the data from the Root Task; the parent of the Data Mart specific Task would be the Root Task. When a Data Mart executes its Task, it would create an instance of the Task for that execution (the Execution specific Task) from the Data Mart specific Task and then populate it with the results of the execution. This hierarchy lets the Researcher retrieve data specific to a single execution within a Data Mart, across all executions to date in a Data Mart, or across all the Data Marts.
The Root Task instance created will have the following data:
The following extensions need to be populated on the Task
The following is the list of inputs to the daf-execute-query operation that would be populated on the Task.input data element.
Optionally, the query can indicate the type of data expected in the results via the queryResultsPhiDisclosureLevel parameter.
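Putting the pieces above together, a Root Task might be populated as sketched below. The example query string, the disclosure-level value, and the use of `type.text` for the input names are illustrative assumptions; only daf-execute-query and queryResultsPhiDisclosureLevel come from the text above.

```python
# Sketch of a Root Task carrying daf-execute-query inputs on Task.input.
# Each input pairs a parameter name (type) with a value.
root_task = {
    "resourceType": "Task",
    "status": "requested",
    "description": "Count of patients with type 2 diabetes, by sex",
    "input": [
        # The query itself -- an illustrative FHIR search expression.
        {"type": {"text": "query"},
         "valueString": "Patient?condition.code=44054006"},
        # Optional disclosure level for the results; value is illustrative.
        {"type": {"text": "queryResultsPhiDisclosureLevel"},
         "valueString": "DeIdentified"},
    ],
}
```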
In order for the Researcher to execute the query against multiple Data Marts, the Research Query Composer system has to create an instance of the Root Task created in Step 1 for each Data Mart. To make a Task specific to a Data Mart, the following Task data elements would be set:
All the other data elements would be replicated from the Root Task. Once the Data Mart specific Tasks are created, Research Query Responders can access these Tasks via a search on the Task.owner data element.
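The fan-out step above can be sketched as follows. The owner and parent element choices mirror the hierarchy described in the text; the references and the helper name are illustrative.

```python
import copy

# Derive one Data Mart specific Task per target Data Mart from the Root
# Task: replicate all elements, then set owner (the executing Data Mart)
# and parent (the Root Task).
def create_datamart_tasks(root_task, root_task_id, datamart_refs):
    tasks = []
    for ref in datamart_refs:
        t = copy.deepcopy(root_task)        # replicate all other elements
        t["owner"] = {"reference": ref}     # Data Mart that must execute it
        t["parent"] = {"reference": f"Task/{root_task_id}"}
        tasks.append(t)
    return tasks

root_task = {"resourceType": "Task", "status": "requested",
             "description": "Population count query"}
marts = ["Organization/mart-a", "Organization/mart-b"]  # illustrative refs
tasks = create_datamart_tasks(root_task, "root-1", marts)
```

A Responder can then find its work with a search such as `GET [base]/Task?owner=Organization/mart-a`.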
Implementing the C4 capability involves the following three steps.
Each Research Query Responder can access the queries it needs to execute by performing a GET on Task resources where Task.owner is itself. This GET operation may cross firewall boundaries and might require appropriate authorization before the resources can be accessed.
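The polling step can be sketched as below. The base URL and owner reference are illustrative; a real deployment would also attach an Authorization header obtained through its access-control flow.

```python
from urllib.parse import urlencode

# Build the search URL a Research Query Responder uses to find the
# Tasks it owns (Task.owner = itself).
def task_search_url(base, owner_ref):
    return base.rstrip("/") + "/Task?" + urlencode({"owner": owner_ref})

url = task_search_url("https://composer.example.org/fhir",
                      "Organization/mart-a")
# The GET on this URL returns a searchset Bundle of pending Tasks.
```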
The Research Query Responder would then duplicate the Task with all the data for the specific execution. This new Task instance would have the Data Mart specific Task as its parent. The Research Query Responder would set Task.status to “received”, “accepted”, and “ready” as appropriate.
The Research Query Responder would start the execution specific Task instance by updating Task.status to “in-progress”. The Research Query Responder would then translate the incoming query into its native execution language based on the following parameters:
The query would then be executed, and the results would be created using the DAF-QueryResults Observation profile. The data would be represented as follows:
Observation.code - Set this to the type of entity being aggregated, e.g., Patient, Encounter, Observation.
For each measurement, create an Observation.component entry with the following:
One Observation.component should be created for each stratified data element.
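The result structure above can be sketched as follows: Observation.code names what is being aggregated, and each stratum becomes a component. The component coding (plain `text`) and the use of `valueQuantity` for counts are illustrative assumptions, not mandated by the profile text quoted here.

```python
# Sketch of aggregate query results using the DAF-QueryResults
# Observation pattern: one component per stratified data element.
def build_query_results(aggregated_type, strata_counts):
    """strata_counts: mapping of stratum label -> count."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {"text": aggregated_type},   # e.g. Patient, Encounter
        "component": [
            {"code": {"text": label},
             "valueQuantity": {"value": count}}
            for label, count in strata_counts.items()
        ],
    }

# Illustrative counts for a query stratified by sex.
results = build_query_results("Patient", {"female": 1240, "male": 1108})
```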
Once these results are created, the Research Query Responder should create the Bundle and set the execution specific Task.output to point to the Bundle instance. Task.status should then be set to “completed”. In case of failure, the error details are returned as part of an OperationOutcome resource.
These execution specific Task instances are now available for retrieval by the Researcher.
In order for a Researcher to get a complete picture of the population based on the submitted query, the query results from multiple Data Marts have to be retrieved. For this purpose, the Research Query Composer has to query each of the Data Marts for execution specific Task instances whose parent is the Data Mart specific Task created during the initiation of the query. Once these Task instances are retrieved, Task.output will contain the result of each query execution for each Data Mart. These results would then be made available to the Researcher for further analysis.
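The Researcher-side collection step above might look as follows, assuming the execution specific Tasks have already been fetched from each Data Mart and parsed into dictionaries; the status filter and output shape are illustrative.

```python
# Collect Task.output values from the execution specific Tasks retrieved
# across Data Marts, skipping runs that did not complete.
def collect_outputs(execution_tasks):
    """Return the list of output entries from completed execution Tasks."""
    outputs = []
    for task in execution_tasks:
        if task.get("status") != "completed":
            continue  # failed or still-running executions carry no results
        outputs.extend(task.get("output", []))
    return outputs

# Illustrative Tasks as returned by two Data Marts.
fetched = [
    {"status": "completed",
     "output": [{"valueReference": {"reference": "Bundle/results-1"}}]},
    {"status": "failed", "output": []},
]
bundles = collect_outputs(fetched)  # only the completed execution contributes
```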