Thursday, December 23, 2010

HL7/IHE Health Story Implementation Guide Consolidation Project

This crossed my inbox yesterday. I will be participating in my role as Publishing Facilitator for the HL7 Structured Documents Work Group.

-----

On behalf of the Health Story Project and HL7 Structured Documents Work Group I am pleased to announce and invite your participation in the launch of the HL7/IHE Health Story Implementation Guide Consolidation Project.

The Project will take place under the auspices of the Office of the National Coordinator (ONC) Standards & Interoperability Framework and is a collaborative effort among the Health Story Project, Health Level Seven (HL7), Integrating the Healthcare Enterprise (IHE) and interested Healthcare industry stakeholders - collectively, the Consolidation Project Community.

ONC has launched a wiki page to support the project community which you can access here:

http://jira.niemhealth.org/wiki/display/SIF/CDA+Template+Consolidation+Project

NOTE: When you first click the link, you will need to Register for a JIRA wiki account and request a login before you can access the page.

FIRST MEETING: January 4, 2011

# Recurring meeting: Tuesday @ 11:00 AM EST - 12:30 PM EST
# Conference line: 1-770-657-9270, Participant Passcode: 310940
# Webmeeting: tbs

In short, the Project will:

1. Republish all eight HL7/Health Story Implementation Guides plus the related/referenced templates from CCD in a single source.

2. Update the templates to meet the requirements of Meaningful Use, in other words, augment the base CCD requirements to meet the requirements of HITSP C32/C83.

3. Reconcile discrepancies between overlapping templates published by HL7, IHE and HITSP.

The Project will not develop new content. (There is a parallel project starting up under ONC to address new content requirements – this is not that project.)

The Project wiki includes links to source material including the HL7 Project Scope Statement.

Please let me know if you have any questions. I look forward to a successful collaboration.

Friday, December 17, 2010

Radiology Orders and Results Interface

Here is another interesting interface project.

We want to send Radiology Orders from our EHR system to our trading partner and want the Radiology Results document to be returned electronically and placed into the patient's chart in the EHR.

Simple. Right.

Except, the scheduling system at our trading partner will not receive the order. I suggested that we send them an order message (ORM), a scheduled appointment (SIU), and a registration message for a future encounter (ADT). They told us that they could not process any of those HL7 messages into their scheduling system.

So, we settled on this kluge: I will convert the order message into an email that I will send securely to the scheduling department, and they will manually key the appointment into their system. Hopefully, they will remember to key our Order Number so that I can match the result back to the patient. Securely? Well, yes. The appointment will contain PHI, so I will use the secure email function of our email system. They also wanted some data that the EHR could not place into the order, so I had to look that data up in the database and add it to the email.
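For the curious, here is a rough sketch of the conversion in Python. This is not our actual engine route; the HL7 field positions are standard, but the message handling and the "extra" data are simplified for illustration.

def orm_to_email_body(orm_message, extra_data):
    # Index the segments of a pipe-delimited HL7 message by segment ID.
    segments = {}
    for seg in orm_message.strip().split("\r"):
        fields = seg.split("|")
        segments[fields[0]] = fields
    pid = segments["PID"]
    obr = segments["OBR"]
    patient_name = pid[5].replace("^", " ")        # PID-5 Patient Name
    order_number = obr[2]                          # OBR-2 Placer Order Number
    exam_parts = obr[4].split("^")                 # OBR-4 Universal Service ID
    exam = exam_parts[1] if len(exam_parts) > 1 else exam_parts[0]
    lines = ["Radiology order %s for %s" % (order_number, patient_name),
             "Requested exam: %s" % exam]
    # Items the EHR could not place into the order, looked up from the database.
    for label, value in extra_data.items():
        lines.append("%s: %s" % (label, value))
    return "\n".join(lines)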

So much for the outbound "order." Now, for the returned results.

A Radiology Result is just a special type of Clinical Document.

We have an existing document interface which we use to receive clinical documents from this trading partner as ORU^R01 messages. They will place these radiology documents into that stream and I will have to select those messages, process them a little bit differently and send them to the EHR. I will need to look up our patient identifier and put it into the message in place of their patient identifier. I will also look up the NPI for each of the incoming providers and place that into the message. The sending system gives us their internal identifier for each provider, and the EHR would like to see the NPI instead.
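The identifier swap is the interesting part. Here is a sketch of the idea in Python; it is illustrative only, the dictionaries stand in for the real database lookups, and I am using PID-3 and OBR-16 as the fields involved.

def fix_identifiers(oru_message, patient_id_map, npi_map):
    # Replace the sender's patient ID with our MRN, and their internal
    # provider ID with the NPI, before handing the message to the EHR.
    out = []
    for seg in oru_message.strip().split("\r"):
        fields = seg.split("|")
        if fields[0] == "PID" and len(fields) > 3:
            their_id = fields[3].split("^")[0]              # PID-3 Patient Identifier List
            fields[3] = patient_id_map.get(their_id, their_id)
        elif fields[0] == "OBR" and len(fields) > 16:
            provider = fields[16].split("^")                # OBR-16 Ordering Provider
            provider[0] = npi_map.get(provider[0], provider[0])
            fields[16] = "^".join(provider)
        out.append("|".join(fields))
    return "\r".join(out)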

We are still developing this interface. The outbound orders seem to be working fine. I have built the results processing route and am waiting to get a test result to see if I have built that correctly.

Saturday, November 20, 2010

Filtered ADT Feed to an Ultrasound System

Here is some more fun that an integration analyst can have.

I was asked to put together an interface to an ultrasound system that is used in two of our OBGYN clinics. They wanted patient demographics for the patients that would be arriving for an ultrasound at these clinics.

So far, so good. Typically, one would send a filtered version of the ADT stream to this system. Except that the ADT feed that comes out of our practice management system does not contain enough information for me to do so. However, the scheduling messages that come out of the PM system do. Kind of.

They send an SIU^S12 message with a patient status of "ARRIVED" when the patient arrives for their scheduled appointment. The first thing is to find the correct clinic code in the message. The message codes the location as "OBG^Oxxx" indicating that the patient belongs to the OBGYN department and is to be seen in location Oxxx. That's non-standard usage, but I was able to figure it out.

OK. So, I can pick that message out and translate it into an ADT^A04 (Register Patient) message and send that to the ancillary.

The other trick is to build a filter that will only allow the ultrasound patients to pass. Usually, there is information in the AIG segment (General Resource) that will tell you that the ultrasound machine needs to be part of the appointment. However, this PM system doesn't schedule that way. They send a bogus provider code for the ultrasound machine (actually, two of them, one for each clinic) in the SCH-12 (Placer Contact Person) field in the scheduling message. I would not have expected to find this data in this field.
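In rough Python terms, the filter amounts to something like this. The codes below are placeholders, not the real values, and I am assuming the appointment status shows up in SCH-25; the real logic lives in the engine route.

ULTRASOUND_CODES = set(["USCLINIC1", "USCLINIC2"])   # the two "bogus" provider codes, one per clinic

def is_ultrasound_arrival(siu_message):
    for seg in siu_message.strip().split("\r"):
        fields = seg.split("|")
        if fields[0] == "SCH":
            status = fields[25] if len(fields) > 25 else ""                  # SCH-25 Filler Status Code
            contact = fields[12].split("^")[0] if len(fields) > 12 else ""   # SCH-12 Placer Contact Person
            return status.upper() == "ARRIVED" and contact in ULTRASOUND_CODES
    return False

# Messages that pass the filter get re-labeled as ADT^A04 in MSH-9, the
# scheduling segments are dropped, and the PID/PV1 content is carried over.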

Of course, none of this was documented. I needed to look at the live message stream and the values in the messages and deduce the values to filter on. I think that I got it right, but I won't know for certain until we test.

This message won't have all of the demographic information that the ancillary wants, but it will be better than nothing.

This illustrates part of the fun that interfacing can be. The business owners know what they want to do (usually), but they don't know how to do it. My job is to understand what they want to accomplish and then figure out how to construct an interface that will accomplish those objectives.

Thursday, October 14, 2010

Patient List Interface

Here is a cute little interface project that I knocked out in a couple of days.

One of our business partners is sending us what they call "patient list" data. This is information on patient groupings. For example, a patient could be on the "Gastroenterology Consults" list. This information does not come across in the organization's ADT stream, and the likelihood of getting it added to that feed was pretty low. Instead, they send us a patient list file that contains a record for each patient and the patient list that they are on. They generate this as a Comma Separated Value (CSV) file and send it to us as a flat file. I built a route and a mapping in the engine to take the flat file, generate an HL7 A08 (patient update) message for each patient, and place the Patient List information into the PV2-23 field (Clinic Organization Name). The route then places each ADT message into the ADT stream for the appropriate system.

The A08 carries the basic patient demographic information and adds the patient list value in the PV2-23 field (Clinic Organization Name). They can then select all patients that are currently on the Neurology list, for example. The receiving system gets the Patient List information with no changes required.

We already had a mechanism in place to exchange flat files, so they added this file to the stream. On the receiving end I have a script that looks at each file, identifies which one it is and routes it for processing. I added the new file to the script and built a new route to process the patient list file.

I first had to build a data structure for the CSV, which was pretty straightforward. Next, I built a special ADT structure that contained multiple A08 messages, and a Map (that is the Rhapsody term; in Cloverleaf, this would be a Translate) to move the data from the CSV file to the ADT message. Finally, I added a DeBatch filter to the route to split the batch into individual A08 messages.
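Outside of the engine, the same transformation can be sketched in a few lines of Python. The column names and the MSH values below are made up for illustration; in production the Map and DeBatch filter do this work.

import csv

def csv_to_a08_messages(csv_path):
    messages = []
    with open(csv_path, newline="") as handle:
        for i, row in enumerate(csv.DictReader(handle), start=1):
            msh = ("MSH|^~\\&|PTLIST|CLINIC|EHR|CLINIC|20101014120000||"
                   "ADT^A08|MSG%04d|P|2.3" % i)
            pid = "PID|1||%s||%s^%s" % (row["MRN"], row["LastName"], row["FirstName"])
            pv2 = "PV2" + "|" * 23 + row["PatientList"]   # the value lands in PV2-23 Clinic Organization Name
            messages.append("\r".join([msh, pid, pv2]))
    return messages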

I had not built this sort of interface before, so I learned a little bit doing it. The trickiest part was getting the file structure definitions set up correctly. The actual mapping took about an hour to build and test.

Friday, August 20, 2010

NextGen System Configuration Training

We are replacing the practice management system at work with the NextGen Enterprise Practice Management (EPM) system. WSUPG installed the NextGen EMR several years ago. The normal practice is to install EPM first and then EMR, so we are going about it backwards.

Next week, I will be in System Configuration Training for three days. In preparation for this, I took web-based training that walked me through most of the functionality. Earlier this week I actually got access to a test environment for EPM and have been poking around in it.

The online training was ok, and I learned a lot. I feel prepared for the coursework next week.

At several points in the training, I found myself thinking "Oh, that's how we'll do that."

I did have one issue with the web-based training: it would only run in Internet Explorer. I gave up on using IE several years ago and use Firefox as my primary browser, after a brief experiment with Chrome earlier this year. It is appalling that ANY web-based application requires a particular browser to run in 2010! I sent feedback to NextGen on this when I started the online training.

We're upgrading the EMR next month, and once that is done I'll get cracking on building interfaces to EPM. We'll have several ADT feeds, several charge interfaces, a few lab results interfaces, and a Master File interface to load updated provider information. I also have several "flat files" that I will have to parse, using their content to build HL7 messages that will be fed into EPM. I have started coding those, and we should be in good shape.

This should keep me busy most of the fall and winter.

Thursday, August 12, 2010

Camping Trip in August 2010


I just returned from a camping trip to Michigan's Upper Peninsula (aka "the UP") with my dog Sonny. We spent time in rustic campgrounds at Tahquamenon Falls (the River-mouth Unit) and on Brevort Lake.

Sonny and I have enjoyed our summer camping trips for ten years, now. Every year, I think that he might be too old for hiking in the woods. And, every year, I have to remind him to slow down once we hit the hiking trails.

We traveled to the Tahquamenon River-mouth Unit campground on Saturday. It was an easy drive, and we arrived at the campground shortly after 1pm. I set up camp and we walked a bit around the campground. The campground is on the south bank of the Tahquamenon River, very close to where the river empties into Lake Superior. The campground has two loops, and we stayed in the "Semi Modern" loop, which had been called the "rustic loop" when I first camped here over twenty years ago. The sites are nicely wooded. There is no electricity. There are pit toilets. There are modern facilities, including a shower, in the Modern loop, and we could use those if we chose to.

On Sunday morning, it rained heavily while I was making breakfast. We waited out the deluge in the car. That was the first time we have done that in all the years that we have been camping. A curious thing about the campground is that the soil is quite porous; following the heavy rain, the standing puddles of water disappeared in fifteen minutes or so.

I foolishly tried to continue making breakfast, but bacon that has been waterlogged is not very tasty. Sonny enjoyed it, however. After breakfast, we drove to Whitefish Point and walked a bit up there.

The main event for Sunday was a hike through the back-country at the Upper Tahquamenon Falls. We hiked the 8.5 mile Wilderness Loop during the afternoon. It was much warmer than it had been that morning. The trail was quite overgrown, and we had to crawl over and under and around many fallen trees. At one point, the trail, which had been next to the Beaver Pond, had become part of the pond, and I walked through water that was up to my knees. We returned to the car late in the afternoon, and Sonny was asleep before we left the parking lot.

We moved camp to Brevort on Monday morning. We have been to this campground for eight of the last ten years and have had this particular campsite three times. I like this campground because it is over a mile from US-2, but it is close enough that we can drive to the beach on Lake Michigan and walk there. We walked that beach every day that we were there. A portion of the North Country Trail passes nearby. We have hiked this trail on previous trips, but did not do so this year.

The routine for Tuesday and Wednesday began with breakfast, and then a walk along the beach on Lake Michigan. We returned to the campsite for the afternoon, took it easy, and I spent the time reading Churchill's "The World Crisis" and practicing my Chapman Stick.

We packed up on Thursday morning and were back home at 2pm.

It does my soul good to leave the technology home for a week. I did take my cellphone with me, but I had it turned off for most of the trip.

Sonny is getting older. Here are some signs:

1. If nature calls during the middle of the night, Sonny will stay curled up and sleeping in the tent. Years ago he would always go outside with me.

2. At the Brevort campsite, he lay there and watched ducks swim up to the shore. Years ago, he would have charged after them.

3. He was so tired after our Sunday hike that when I went to put him in the car, he stopped and looked as if to say "that looks awfully high." I asked him if he wanted a ramp. Normally, he leaps right into the car whenever I open the door.

All in all, it was a wonderful trip. I'll be back to work on Monday.

Thursday, July 29, 2010

HL7 SDWG Completes Review of CDA R3 Formal Proposals

The final formal proposal for changes to the HL7 Clinical Document Architecture standard was reviewed today. The HL7 Structured Documents Work Group (SDWG) has been reviewing formal proposals for the next release of CDA for over a year. CDA R2 is the standard that is used for most of the exchange of clinical information in the Meaningful Use requirements that were published in the Federal Register this week.

I am the publishing facilitator for the SDWG, so this means that I get to work on creating the new version of the standard. I worked on CDA R2 back when I was working for HL7, converting the Word document into "publishing XML." The new version of the standard will be developed using the HL7 Publishing database. I have not used the publishing db in several years, so it will be interesting to see how the tool has changed.

The enhancements in CDA R3 are based on the experience of implementers. CDA will also use HL7 Version 3 Datatypes Release 2, which will give us new functionality.

Here is a link to the list of suggested enhancements for CDA R3:

http://wiki.hl7.org/index.php?title=CDA_Suggested_Enhancements

I'll post updates as we work through the process. I expect that we will ballot the new document early next year, and, hopefully, CDA R3 will become an approved standard sometime in 2011.

Tuesday, July 13, 2010

Meaningful Use Final Rule

The Final Rule for Meaningful Use was published today.

I haven't finished reading the 1,000 pages yet, so here are my preliminary impressions.

The documents are available here:

http://www.ofr.gov/OFRUpload/OFRData/2010-17210_PI.pdf

http://www.ofr.gov/OFRUpload/OFRData/2010-17207_PI.pdf

They relaxed a couple of requirements, with thresholds being reduced from 80% to 50%. I don't see this as significant: if your organization is really committed to meeting the requirements, CPOE for example, you will be well over 80% anyway. If reducing the threshold means that a few more organizations qualify for MU funds, that's fine with me.

They also removed the requirement for either SOAP or REST to be used as a transport protocol, which means that those organizations that use MLLP through encrypted tunnels will not have to re-do those interfaces.

I am still a little troubled by the requirement to use CCR and CCD, but since that means that my organization can produce CCD and the other CDA based documents, we should be fine. We will have to be able to consume CCR, but that is not as onerous as it might be. HL7 and ASTM are beginning a project to translate from CCR to CCD, so that should help organizations that do not want to have to support both.

It could have been better. It could have been worse.

Sunday, June 20, 2010

Northwestern Commencement 2010


On Thursday, I drove to Evanston for Northwestern's Commencement. There were ceremonies on Friday and Saturday. I took my parents with me, because I felt that they should be part of the celebration.

Before the main ceremony, the School of Continuing Studies (SCS) held a reception that we attended. As a distance learning student, I had not actually met many of my classmates. It was nice to finally meet some of them. I would recognize a voice in the crowd and walk over and introduce myself. I told the associate dean that I still thought that they might come to their senses and pull my diploma. He thought that was funny.

The main ceremony was on Friday at 6pm. At approximately 4:30pm, a thunderstorm hit the city, and we were drenched while we were on our way to the stadium. The weather cleared, and they were able to hold the ceremony outside, as scheduled. It was quite a sight. I was glad that I went. Mike Wilbon from ESPN and ABC (and a Northwestern Alum) gave the commencement address.

We had a wonderful dinner at Morton's following the ceremony. Another thunderstorm hit while we were eating, and we drove back to the hotel past downed power lines and traffic signals that were not functioning.

The ceremony for SCS was on Saturday afternoon at the Millar Chapel. It was hot. I was hooded and received my diploma. The drive back to Ann Arbor took longer than I expected, and I did not make it home until after 9pm. My brother Rob and Leslie were dog sitting for me, and we had steaks after I got home.

All in all, it was a wonderful weekend.

Friday, June 11, 2010

Restarting Interfaces

We recently changed five existing interfaces to "pass through" the Rhapsody interface engine that I am working with. These interfaces run through a VPN tunnel between the two organizations.

That first day, one of the interfaces stopped working several times. I stopped and started the communication point on my end, and the connection was re-established and messages began to flow, again.

The next morning, one of the interfaces had not seen messages cross for seven hours. I again stopped and started the communication point, and messages began to flow.

I was told that "they do that all the time."

The next morning, three of the five new interfaces had stopped overnight. I restarted them, they connected and messages began to flow.

I have seen this sort of behavior before with interfaces that cross VPNs. When the interface is inactive, the VPN tunnel shuts down, but the receiving end still thinks that it has a connection. The sending end will try to re-establish a connection, but since the receiving side still thinks that it has one, the new connection is refused.

Being the cautious person that I am, I changed one of the three communication points to restart every hour. The interface would stop, reconnect, and messages would begin to flow, again. I wanted to see one of them work before I changed anything else.

Once I saw that this worked, I made similar changes to the other two interfaces that had exhibited similar issues. Two of the five original interfaces had not shown any issues. One was an ADT feed that constantly had traffic. The other received a "batch" of messages every morning. It would reconnect, deliver the messages, and then disconnect.

So far, the hourly restart seems to be working. For years, these interfaces had failed this way: a trouble ticket would be opened, and an operator would manually stop and start the interface to get things working, again. Based on previous experience, I was able to diagnose the problem and implement a simple fix. This will save our business partner from opening trouble tickets and our operators from manually restarting these interfaces.
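The fix itself is just a scheduled restart on the communication point, but the underlying idea can be sketched in plain Python. The host, port, and send callback below are placeholders.

import socket
import time

HOST, PORT = "trading-partner.example.com", 6661   # placeholder endpoint
RESTART_INTERVAL = 3600                            # recycle the connection hourly

def run_sender(send_pending):
    # Tear down and rebuild the outbound socket every hour so a silently
    # dropped VPN tunnel cannot leave a half-open connection behind.
    while True:
        sock = socket.create_connection((HOST, PORT), timeout=30)
        started = time.time()
        try:
            while time.time() - started < RESTART_INTERVAL:
                send_pending(sock)    # wrap each message in MLLP framing before sending
                time.sleep(1)
        finally:
            sock.close()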

One small step...

Thursday, June 3, 2010

Masters in Medical Informatics

I just completed the Capstone project for my Masters in Medical Informatics (MMI) program at Northwestern University. I have been working on this degree for two and a half years. This is a distance learning program, so I did all of the course work online. I found it difficult to say that I was "going back to school" while I sat in front of my computer.

I learned a lot in the program, and discovered that I knew a lot, too.

Here is my "portfolio" website that I created as part of the Capstone Project.

http://sites.google.com/site/peterngilbert/home

Sunday, April 25, 2010

First Evidence of Meaningful Use

We recently received an email from a laboratory system that sends us results. The laboratory tests that they currently send us are coded using "local vocabulary." They asked us if we could accept LOINC codes (http://loinc.org/) as alternate codes.

What they are proposing is adding LOINC codes to the OBX-3 Observation Identifier field. The OBX-3 data type is Coded With Exceptions (CWE). The structure of this field is:

<Identifier (ST)> ^ <Text (ST)> ^ <Name of Coding System (ID)> ^ <Alternate Identifier (ST)> ^ <Alternate Text (ST)> ^ <Name of Alternate Coding System (ID)> ^ <Coding System Version ID (ST)> ^ <Alternate Coding System Version ID (ST)> ^ <Original Text (ST)>

So, they are proposing to send their local code in the first "triplet" of the CWE and the LOINC code in the second "triplet". This would change:

OBX|1|CWE|xxxxx^Local Test Value||result goes here.....

to

OBX|1|CWE|xxxxx^LocalTestValue^L^yyyyyy-y^Loinc Test Name^LN||result goes here.....

The first three subfields of OBX-3 (aka the first triplet) are the local code (xxxxx), the description (LocalTestValue), and "L", which indicates a local code. The second set of subfields (aka the second triplet) holds the alternate identifiers: the LOINC code for the test (yyyyyy-y), the description of the test from LOINC, and "LN", which specifies that the alternate code comes from LOINC.
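On the receiving side, picking the LOINC code back out of OBX-3 is simple. Here is a sketch in Python, using the same placeholder values as above.

def loinc_from_obx3(obx3_field):
    # CWE components 4-6 are Alternate Identifier, Alternate Text, and
    # Name of Alternate Coding System.
    parts = obx3_field.split("^")
    if len(parts) >= 6 and parts[5] == "LN":
        return parts[3], parts[4]
    return None

print(loinc_from_obx3("xxxxx^LocalTestValue^L^yyyyyy-y^Loinc Test Name^LN"))
# -> ('yyyyyy-y', 'Loinc Test Name')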

This lab has been sending local codes for its tests for many years. The meaningful use requirements of ARRA/HITECH are nudging them to use LOINC.

Who says that we are not making progress :-)

Sunday, April 18, 2010

NHL Playoffs 2010

This is my favorite time of the year. I love the Stanley Cup Playoffs. I love that teams play every other night, unlike the endless delays in the NBA playoffs. It does not take long for teams to develop a strong dislike for each other during a playoff round. Because the coaching and film study have advanced in recent years, each team knows the other's tendencies and can develop counters for them.

I love playoff overtime. The overtime games tend to end either very quickly or go on and on.... As long as I don't have to get up for work the next morning, I will stick it out to the bitter end.

Here are a few things that I will be watching for during the first round (I refuse to call them "the conference quarter finals"):

  • Which of the goalies that are in the playoffs for the first time will shine? Jimmy Howard? Bryzgalov? Quick? Rask? Halak? Niemi? Wow. That's a lot of teams that are relying on untested goaltenders.
  • Upsets. The first round always seems to give us an upset.
  • San Jose. Can the Sharks make it out of the first round?
  • Officiating. Which games will be decided by a crucial call or non-call?
  • Who will step up and who will disappear? The playoffs usually give us an unlikely hero as well as some disappointing performances by "stars".
Let's enjoy the next two months.