Up to now, my company has performed terminology management using translate tables in each application. This sort of works, but it is cumbersome and prone to error. DocSite has a translate table for each practice, so the code mapping has to be redone for every practice. Mirth, our CDR, has translate tables for each data source, so the crossmaps have to be set up separately for each data source. Also, Mirth will not perform standard-to-standard code mapping: if a code comes in as CPT, Mirth will not translate it to LOINC, for example.
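To make the duplication concrete, here is a minimal sketch of the per-source translate-table approach. The table contents are hypothetical examples, not our actual mappings; the point is that the same concept has to be mapped once per data source:

```python
# Hypothetical per-source translate tables: each data source gets its own
# table, so the same local-code -> LOINC mapping is maintained repeatedly.
translate_tables = {
    "hospital_a_lab": {"GLU": "2345-7"},   # glucose, serum/plasma
    "hospital_b_lab": {"GLUC": "2345-7"},  # the same concept, mapped again
}

def translate(source: str, local_code: str):
    """Look up a local code in the table belonging to one data source."""
    return translate_tables.get(source, {}).get(local_code)
```

Add a tenth data source and you add a tenth table, even when the underlying lab concepts are identical.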
I have a new project coming up in which we will have to perform code translation as messages pass through our interface engine, because we will be receiving messages from trading partners (hospitals and clinics) and delivering them to a statewide Health Information Exchange (HIE).
So, we will use Apelon's Distributed Terminology Service (DTS) and will work with Apelon to create crossmaps from local lab codes to LOINC. We will then call DTS from within our interface engine to translate the codes before sending the messages on to the HIE's component systems.
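The engine-side translation step might look something like the sketch below. To be clear, `lookup_crossmap` and the `FakeDts` stand-in are inventions for illustration, not Apelon's actual DTS API; the sketch only shows the shape of the call the engine would make:

```python
class FakeDts:
    """In-memory stand-in for the terminology service (illustration only)."""
    def __init__(self, crossmaps):
        # keyed by (source_system, local_code) -> LOINC code
        self.crossmaps = crossmaps

    def lookup_crossmap(self, source_system, local_code):
        return self.crossmaps.get((source_system, local_code))

def translate_lab_code(local_code, source_system, dts):
    """Translate one local lab code to LOINC before forwarding a message."""
    mapped = dts.lookup_crossmap(source_system, local_code)
    if mapped is None:
        # Unmapped codes should be flagged, not silently passed through.
        raise ValueError(f"No LOINC crossmap for {source_system}:{local_code}")
    return mapped
```

The key design point is that the crossmap lives in one central service rather than in per-source tables, so a mapping is created once and every interface reuses it.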
We will also use DTS to perform code set validation. For example, we will have the engine call DTS to verify that the field where we expect a LOINC code actually contains one.
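A rough sketch of what that validation amounts to, assuming a simple two-part check: the value has to look like a LOINC code (digits, a hyphen, and a check digit, e.g. `2345-7`), and it has to exist in the code set. The hard-coded set here stands in for the lookup that would really go to DTS:

```python
import re

# LOINC codes are digits, a hyphen, and a single check digit (e.g. "2345-7").
LOINC_PATTERN = re.compile(r"^\d{1,5}-\d$")

# Illustrative stand-in; in practice this membership check is a DTS query.
KNOWN_LOINC = {"2345-7", "2823-3"}

def is_valid_loinc(code: str) -> bool:
    """Return True if the value is well-formed and in the known code set."""
    return bool(LOINC_PATTERN.match(code)) and code in KNOWN_LOINC
```

A message whose OBX code fails this check would be rejected or routed for review instead of being forwarded to the HIE.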
I've done code mapping at a couple of previous jobs, so this is a process that I am familiar with. This will be Covisint's first foray into this technology, so it should be fun.
The other thing that should be fascinating is that we will be on-boarding approximately one hundred hospitals and approximately one thousand physician offices for this project. I should be very busy.