The Cloud in Healthcare – Top 10 Takeaways from iHT2 San Francisco

In the spirit of David Letterman’s top 10 lists, here are our takeaways from the San Francisco iHT2 event this past week: http://ihealthtran.com/sanfranciscohome.html

1. IHPs (large integrated health providers, such as university systems) are by and large going with Epic for EHR solutions, thereby automatically forgoing a degree of flexibility and any chance of real near-term interoperability.

2. The historical problems of security, reliability, and control with Cloud-based solutions are being rapidly overcome, and the cost savings from hosting data and applications in the Cloud are becoming so compelling that increasingly complex medical organizations and systems will require the Cloud in order to be effective and efficient, or risk becoming extinct.

3. The healthcare system, as usual, is the last great industrial complex to accept collaboration and efficiencies based on advances in information technology.

4. There has been a dramatic shift in the last two years toward the use of mobile and portable devices in all aspects of care, and this will only increase.

5. Nobody can really define the phrase “HIPAA compliance.” It is best approached and understood as a process; it is not just about security.

6. 30% of the sponsoring vendors were cloud-oriented.

7. HIPAA-compliant edge (last-mile) connectivity was a common pain point, from small patient practices to large integrated health providers.

8. Cloud-based service platforms, with their ability to be more nimble and in turn handle the growing complexity of connectivity, interfacing, and interoperability, are available and should be considered.

9. The cloud mitigates the need for traditional software upgrades and release cycles.

10. CIOs and CMIOs are opting to outsource for best-of-breed services and applications, driven by the growing scarcity of skilled healthcare IT resources and the availability of more robust SaaS/cloud-based options.

One Person’s Experience with Healthcare Interoperability or, Who Suffers When the Dots Cannot be Connected?

I have had the unfortunate experience of having my wife of over 30 years pass away recently from pancreatic cancer. She lived for 18 months after her initial diagnosis. Prior to that she had been a very healthy 62-year-old. During the course of her illness, she was treated in five different hospitals, was under the care of over 40 physicians, and had numerous surgical and diagnostic procedures. One might say, “Well, this was certainly an edge case.” But hasn’t experience shown that it is the edge cases that bring out the flaws in the system? As I participated in her long and painful journey, I came to realize that in spite of all the assertions made about information exchange and interoperability in healthcare, both are almost nonexistent once you go outside the four walls of a hospital.

The fact is, unless the patient or their family takes responsibility for the information that different hospitals and doctors will require when they come on board, they will have no reasonable way to access that data. During my wife’s illness, on numerous occasions, I had to hand-carry DVDs, CDs, or memory sticks so that other physicians could see the results of CT scans and radiology reports. I had to manually maintain a spreadsheet of her medications since there was no centralized system that was kept up to date, even where she was being treated. Obviously, the more manual recording, the greater the chance for error, not to mention lost time.

I am writing this blog as a call to action. While many are wringing their hands over healthcare costs, in my opinion hospital administrators are doing a great disservice to patients and medical personnel by not forcing their IT vendors to make improved interoperability and information exchange a high priority. As we all know, there are a number of high-level committees and organizations working on this problem. However, their progress is slow and the need is now.

Many of them have not even thought through how the Cloud can be a game changer.

The reality is that if Apple can provide iCloud so that users can upload all their content of different types to a single user ID and then deliver it to multiple devices, it is not so far-fetched that the same capability could be applied to patient records. Patients typically have single identifiers. The notion that information stored in the Cloud is neither secure nor easily accessible has been shown to be a myth. View this video link from a researcher at Johns Hopkins.

http://www.dreamsimplicity.com/community/saas-video/communityteam/video/1277-cloud-computing-in-medicine?utm_source=twitterfeed&utm_medium=twitter

In addition, there are companies that provide low-cost, HIPAA-compliant secure messaging solutions, implementable in minutes, that will securely transfer data to and from the Cloud as well as between applications hosted in the Cloud.

It is my belief that if as much attention and investment is focused on medical information exchange as has been placed on making billing systems interoperable, we will have not only improved patient care but also a more efficient use of our medical resources.

One Way to Avoid Becoming the Next Kodak

As a boy growing up, my first camera was a Kodak Brownie—easy to use and, for that era, it took excellent pictures. It was somewhat expensive for a 10-year-old because of the costs of film purchase and photograph development. Overall, however, I was a happy customer who would eagerly look forward to picking up my photos at the drug store.

Of course, there are no simple answers for why one of the most prestigious companies in the world has found itself having to file for bankruptcy. We do know that one of the first order effects was their inability to shift the center of gravity of their business. New technologies were adopted by their customers that in fact eliminated the cost of film while significantly reducing the cost of development. All the while, the company continued to be in denial about how big the impact would be on their business.

We can only imagine the internal discussions that must have occurred relative to any endorsement by Kodak that digital photography was the future and that they themselves would develop and sell the world’s best digital cameras and printers. The Kodak film people would obviously and immediately do everything possible to prevent that from happening. Just imagine Kodak’s large investment in film manufacturing plants, equipment, and distribution! As a result, they continued with their former core strength of promoting film while developing mediocre digital cameras. Furthermore, their strategy missed the shift of photography and photo software into smartphones, with an eventual even bigger impact on their core market.

Today, there is a technology shift that I believe will be even more profound than the changes brought by digital photography. That is the advent of Cloud Computing. We are all watching advancements occur at a breakneck pace. Initially it was all about virtualization, but we are now seeing very powerful software development tools as well as applications being hosted in the cloud. The result is a new generation of functionality at costs that are in some cases 10 to 100 times lower than those hosted on traditional servers. In addition, there is unparalleled user access—laptops, tablets, smartphones everywhere! Consequently, all these new cloud-based applications come with an entirely new user experience.

The next generation of Kodaks are today convincing themselves that the Cloud will have limited applicability and therefore they can take a “wait and see” attitude, moving to endorse and adopt when they are sure it is real. What I can say with certainty is that by the time they come to that realization and have to analyze the business impact of dismantling infrastructure and a large IT organization, it will be too late. Their competitors who have moved quickly to adopt the Cloud will roll over them with not only significantly better IT cost structures and associated efficiencies, but with a better ability to focus on their businesses and a stronger and growing connection to their customers.


When Did Lawyers Become Technologists?

When did lawyers become technologists?

The current cloud computing debate centers on whether the Public Cloud can be trusted.  Can IT infrastructure start with a private cloud and migrate later?  The private cloud advocates cite concerns such as security, control and adherence to compliance requirements as their primary reasons for not utilizing the public cloud. Clearly, cloud security is in question. But who should make the decision within your organization?

I was amazed when an attendee at a major industry conference told me: “His lawyers would never let him use the public cloud.” My question was: “When did lawyers become technologists?”

Cloud Paradigm Shift

It is universally recognized that a major paradigm shift is occurring, driven by the new usage-based pricing model of Cloud Computing. Just five years ago SaaS was perceived as not being financially viable. Today, the Public Cloud has become one of the primary approaches to running mission-critical applications.

CIOs around the world are now including Cloud Computing in their future planning. They are trying to determine which Cloud environments should be adopted that make the most sense for their infrastructure requirements. Leading CIOs are allocating resources to determine the most cost effective and scalable cloud investments.

Cloud Phobias and Facts

The fact is that a lot of the fears regarding public clouds are coming from those who do not understand technology.  It’s important to know the facts.

The largest companies in the industry are investing billions of dollars in creating cloud platforms that include state-of-the-art hardware, networking, and security. These companies include IBM, Microsoft, HP, Rackspace, and Amazon.

Private clouds cannot possibly invest enough money to remain competitive with the capabilities and security available in public clouds. In addition, as a result of economies of scale, public clouds are the leaders in establishing and implementing compliance standards.

This is also an industry where it’s really all about the applications and solutions. There will be a far more extensive SaaS application catalogue available for the public cloud than for a portfolio of private clouds, each of which has implemented its own custom stack.

Let’s face it,  application developers have always followed the money…

Thank You IBM Impact!

A couple of weeks ago, we were given the opportunity to present CloudPrime at the IBM Impact Cloud Zone in Las Vegas. It was a great week, filled with a lot of interesting conversations.

We want to send a big thank you to Ann Saydah at IBM and her team for organizing the event and giving us the chance to showcase CloudPrime’s business messaging service in the Cloud Zone. Along with exhibiting in the Cloud Zone, we were given the opportunity to talk about how CloudPrime is transforming business messaging during the Rapid Fire session. (If we get a chance to share the video of our 4-minutes of fame, we will post it here on our blog.)

It was also exciting for CloudPrime to officially announce its multi-year partnership at the conference, which enables MQ customers to extend their messaging footprint. We are looking forward to working with IBM, and it goes without saying that this is a really exciting time at CloudPrime!

How CloudPrime Transforms Enterprise Messaging

Our customers need resilient, secure, and easy-to-deploy application messaging solutions that meet their changing demands. As more and more buzz around the cloud, application migration, security, and compliance percolates to the surface, CloudPrime has piqued the interest of more and more CIOs and Infrastructure Managers.

Transform: CloudPrime enables enterprises of all sizes and industries to transform how they connect applications. Our cloud-based infrastructure provides scalability, embedded security, resiliency, and economies of scale. Because of the Cloud, our customers can rely on a service-based messaging infrastructure, allowing managers to focus on mission-critical tasks instead of deploying and managing costly VPNs and hardware.

Manage and Serve: As a service, CloudPrime provides a robust network that guarantees the delivery of every message as well as providing “military grade” security and encryption.

Build: With CloudPrime you can build application interfaces in minutes regardless of the application or transport protocol.

Consume: CloudPrime’s application messaging services are easy to consume and do not require any hardware installation or IT training.

To learn more about CloudPrime and the advantages of our Cloud-based application messaging service, visit https://cloudprime.net/cloudprime-about.php

Response to NY Times’ Steve Lohr — Healthcare Connectivity

Mr. Lohr, great article and thank you for covering this topic.


Any story about hospitals taking steps toward connectivity is great, but I fear that most think connectivity is a little easier than it really is, that it’s just a matter of getting everyone together. Integrating hospital systems is challenging enough, but it’s “everyone else” that will pose the greatest challenges, making connectivity a fragile vision if it cannot be streamlined for smaller practices, independent physicians, clinical labs, etc.
Mr. Lohr points out that only 25% of physician practices today are computerized, and that should improve given incentive payments and consequences for non-compliance. However, the real issue is that we are putting the cart before the horse; physicians will adopt patient management systems, EHRs, and EMRs, but the connectivity piece will still be unanswered. Furthermore, smaller practices and physician groups most likely will not understand why further steps for compliance need to be taken, as most of them are being educated (by the very software vendors selling them their wares) to believe that if they merely install software that lets them manage patient data digitally, they will be compensated.
Recently, on a call with a hospital CIO, we discussed how she had to put a connectivity project on hold: of the 200 physician practices outside her hospital that she had to bring onto the network, only 120 had EMRs, and none of them wanted to deal with deploying and managing a VPN. It seems trivial, but the reality is that doctors are doctors first, and anything not related to treating patients is a distraction and is perceived to decrease their bottom line. (In the docs’ defense, VPNs are a blunt instrument, and I don’t blame them.)
Healthcare connectivity is a long, winding road that needs better planning, better ideas, and better solutions. The Direct Project sponsored by NHIN is a great foundation for simplifying and standardizing connectivity, but this battle also needs to be won with hearts and minds.

Direct Project — Hooray!?

Fortunately, even NHIN will not tell you that the Direct Project is the end-all solution to making the ubiquitous exchange of health information a reality.

That being said, many interpret it as a simple evolutionary step to secure health information exchange and compliance. Let’s take a look at what the Direct Project is and specifies:

1. The Direct Project is a specification, or recommendation, describing how secure health information exchange can be achieved via the SMTP protocol.

2. In order to interface to the Direct Project network, you will still need to rely on a health information service provider.

3. Each participant in the Direct Project will have a published Health Domain Name (HDN), which is used for authentication; this will look like an email address or domain to users and is how participants will identify each other.

There are many other requirements that outline what is needed to adhere to the Direct Project specification, but above are the high-level concepts. In itself, it is a great and elegant approach to solving the problem of interoperability and health data exchange, but there are some items of concern that need to be addressed when determining how to gain widespread adoption:

1. While some vendors are supporting the Direct Project in software patches and new releases of software, what does this mean for healthcare professionals that do not have applications that will comply? How will they interface to the Direct Project?

2. What about large hospitals or groups that have many systems from multiple vendors? Will their router be able to interface to the Direct Project network without increasing the work load of already over-stretched IT staff?

3. While some vendors have updates and patches for adhering to the Direct Project specification, are there other requirements needed in order to comply, e.g. changes to a hospital’s SMTP server?

At CloudPrime we are very excited about the Direct Project and believe it provides a solid foundation for improving health information exchange. Our concern, however, is that it will require healthcare providers to allocate over-stretched resources to meet the requirements of Direct and get their health IT systems to integrate with the network, even if their EMR/EHR supports Direct.

We believe that as an added goal of the Direct Project, implementation and integration should not be difficult and that health IT folks need a solution that will minimize the impact to their workflows and current workload.

CloudPrime is defining a better way for health IT professionals to take advantage of everything the Direct Project has to offer while minimizing the impact to their IT infrastructure and workflows.

Healthcare Integration & Interoperability — Part 3

In the last blog we discussed 4 major file types/documents that are used in healthcare data exchange: HL7, DICOM, CCD and CCR. In order to exchange these data types, application interfaces need to be deployed to allow for the integration of disparate healthcare systems.

Today, we will cover how applications communicate or interface with each other.

Interfaces typically use what is called a transport protocol, which “provides end-to-end communication services for applications within a layered architecture of network components and protocols.”†

The most common transport protocol in use is TCP, or Transmission Control Protocol. Sometimes referred to as TCP/IP, it allows applications to stream data to each other. For example, if you have a Patient Management System that generates HL7 SIU messages (scheduling messages) and that application interfaces to an HL7 routing engine, it is likely that the two interfaces would communicate via TCP.

In healthcare, there is a thin framing protocol layered on top of TCP known as MLLP (Minimal Lower Layer Protocol), which adds specific delimiters to denote the beginning and end of each message. The receiving application needs to know where one message ends and another begins in order to deliver the correct information to the system. Over raw TCP, you would do this by specifying a length header, or more simply put, the details about where messages start and end. With MLLP, specifying length headers is not necessary, as the protocol inserts the delimiters that let applications know where messages begin and end. Confusing, I know!
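The framing described above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: the standard MLLP block characters are 0x0B (start) and 0x1C 0x0D (end), but real interface engines also handle partial reads, acknowledgments, and retries.

```python
# Minimal MLLP framing sketch. The block characters are the standard
# ones (0x0B start, 0x1C 0x0D end); everything else is simplified.
START_BLOCK = b"\x0b"
END_BLOCK = b"\x1c\x0d"

def mllp_wrap(message: bytes) -> bytes:
    """Frame a single HL7 message for transmission over TCP."""
    return START_BLOCK + message + END_BLOCK

def mllp_unwrap(stream: bytes) -> list:
    """Split a received byte stream into the complete messages it contains."""
    messages = []
    for chunk in stream.split(START_BLOCK)[1:]:
        end = chunk.find(END_BLOCK)
        if end != -1:  # ignore trailing partial messages
            messages.append(chunk[:end])
    return messages

# Two framed messages traveling over one TCP stream (payloads are illustrative):
wire = mllp_wrap(b"MSH|^~\\&|PMS|...") + mllp_wrap(b"MSH|^~\\&|LAB|...")
print(mllp_unwrap(wire))
```

Because the delimiters travel with the data, the receiver never has to negotiate message lengths with the sender, which is exactly what makes MLLP convenient for chaining disparate systems together.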

Sometimes it may not be necessary or preferred to use TCP/MLLP for streaming data, and applications will instead use simple file transfer, or file drop. This is accomplished by outputting the messages and storing them in a directory on the computer or server. Another application or interface checks the folder for new messages and consumes them when they are written to the directory.
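The file-drop pattern can be sketched as follows. The directory name and helper functions here are hypothetical; production systems add file locking, atomic renames, and error handling so that a half-written message is never consumed.

```python
# File-drop sketch: the sending application writes each message as a
# file in a shared directory; the consuming application polls that
# directory and removes files as it reads them. "hl7_outbox" is a
# hypothetical directory name.
import os

OUTBOX = "hl7_outbox"

def write_message(name, payload):
    """Drop one outbound message into the shared directory."""
    os.makedirs(OUTBOX, exist_ok=True)
    with open(os.path.join(OUTBOX, name), "w") as f:
        f.write(payload)

def consume_messages():
    """Read and delete every pending message, returning their contents."""
    consumed = []
    for name in sorted(os.listdir(OUTBOX)):
        path = os.path.join(OUTBOX, name)
        with open(path) as f:
            consumed.append(f.read())
        os.remove(path)  # the drop folder only ever holds unconsumed messages
    return consumed
```

The appeal of this approach is its simplicity: no sockets, no framing, just a shared folder; the cost is latency, since messages sit until the next poll.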

Interfaces need transport protocols to exchange information and it is common for systems to need to communicate over various connectivity services, making integration somewhat challenging. Talking with your health information service provider and/or your integration specialist can help you understand the best method for interfacing healthcare applications together.

† Wikipedia, http://en.wikipedia.org/wiki/Transport_Layer

Healthcare Integration & Interoperability — Part 2

Yesterday we briefly covered what healthcare integration and interoperability is and what it means to the healthcare industry. In today’s segment, we will be discussing some of the file protocols that are used in conjunction with continuity of care and interoperability.

The file protocols that we will focus on today are some of the more popular formats: HL7, DICOM, CCD & CCR.

HL7 File Protocol

Much like blood in the human body, HL7 messages are the lifeblood of healthcare data exchange. Established in 1987, Health Level Seven (HL7) is a non-profit organization whose mission is to “[provide] standards for interoperability that improve care delivery, optimize work flow, reduce ambiguity and enhance knowledge transfer among all of our stakeholders, including healthcare providers, government agencies, the vendor community, fellow SDOs and patients.”†


In simpler terms, HL7 is a file protocol through which care providers leverage a standard for sharing patient data. HL7 messages are broken into specific types that relate to a specific event within a patient record, also known as a trigger event [see list below]:

  • ACK — General acknowledgment
  • ADT — Admit discharge transfer
  • BAR — Add/change billing account
  • DFT — Detailed financial transaction
  • MDM — Medical document management
  • MFN — Master files notification
  • ORM — Order (pharmacy/treatment)
  • ORU — Observation result (Unsolicited)
  • QRY — Query, original mode
  • RAS — Pharmacy/treatment administration
  • RDE — Pharmacy/treatment encoded order
  • RGV — Pharmacy/treatment give
  • SIU — Scheduling information unsolicited ‡

Each one of these trigger events is created by a hospital system and will need to be shared not just across internal systems, but also with hospitals, HIEs, physician groups, clinical labs, etc. that may reside outside a healthcare provider’s network. Not every message type is relevant to all applications, and many hospitals that maintain dozens of systems will leverage HL7 routing engines to deliver messages to the appropriate destinations.

While the HL7 message protocol is a standard widely adopted by healthcare providers, it is sometimes seen, as Stephane Vigot of Caristix puts it, as a “non-standard standard.” What Mr. Vigot is saying is that even though the protocol specifies syntax and message headers for identifying pertinent information, different systems may use different templates. Take patient “sex,” for example: one hospital may register a patient as either male or female, and another may have up to six attributes relating to the patient’s sex. As a result, when systems are integrated, HL7 messages need to be normalized so that the systems know where to look for the information.
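To make the “non-standard standard” point concrete, here is a minimal sketch of splitting a v2.x message into segments and normalizing the patient-sex field (PID-8). The sample ADT message and the mapping table are illustrative only; real interfaces rely on an engine and site-specific translation tables.

```python
# Sketch of HL7 v2.x parsing and normalization. The sample message and
# SEX_MAP are illustrative; real translation tables are site-specific.
SEX_MAP = {"M": "male", "F": "female", "MALE": "male", "FEMALE": "female"}

def parse_segments(message):
    """Split a v2.x message into pipe-delimited segments keyed by segment ID."""
    segments = {}
    for line in message.strip().split("\r"):  # segments end with carriage returns
        fields = line.split("|")
        segments[fields[0]] = fields
    return segments

def normalized_sex(message):
    """Map whatever a sending system put in PID-8 onto one vocabulary."""
    pid = parse_segments(message)["PID"]
    return SEX_MAP.get(pid[8].upper(), "unknown")

sample = ("MSH|^~\\&|SENDER|FAC|RECEIVER|FAC|20120101||ADT^A01|123|P|2.3\r"
          "PID|1||12345||DOE^JANE||19500101|F")
print(normalized_sex(sample))  # → female
```

Two systems might both be “HL7 2.3 compliant” and still need this translation layer, which is exactly the normalization work an integration engine performs.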

Version 2.x vs Version 3

Probably the most important thing to know about HL7 version 2.x vs. version 3 is that the latter has not yet been embraced by the healthcare industry. Version 2.x is a textual, non-XML file format that uses delimiters to separate information.˚ Version 3, on the other hand, is an XML-based file format.

DICOM

DICOM stands for Digital Imaging and Communications in Medicine. Like HL7, DICOM is a file format for exchanging patient data, but is used in conjunction with systems that exchange medical images. DICOM messages are the file protocol of choice for PACS (Picture Archiving and Communication Systems).

For a list of all the Value Representation (VR) of a DICOM message, you can visit: ftp://medical.nema.org/medical/dicom/2009/09_05pu3.pdf
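As a small, concrete illustration of the DICOM file format: a DICOM Part 10 file begins with a 128-byte preamble followed by the four ASCII bytes “DICM.” The sketch below checks only that magic word; real applications use a full toolkit (such as pydicom or DCMTK) to read the data elements that follow.

```python
# Sketch of recognizing a DICOM Part 10 file by its magic word. This
# checks only the header; reading actual data elements requires a real
# DICOM toolkit.
def looks_like_dicom(data):
    """A Part 10 file has a 128-byte preamble followed by b'DICM'."""
    return len(data) >= 132 and data[128:132] == b"DICM"

# The preamble is commonly all zero bytes:
print(looks_like_dicom(b"\x00" * 128 + b"DICM"))  # → True
```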

Continuity of Care Document (CCD) & Continuity of Care Record (CCR)

These two documents perform very similar functions and are considered summary documents. Both CCD and CCR are XML-based documents that provide a summary of a patient’s healthcare history. Included in a CCD or CCR document is a human-readable section that covers the patient’s care history as well as pertinent patient information such as demographics, insurance information, and administrative data.^

The major difference between the two revolves around how closely each is tied to HL7 standards and how easily each fits into the current workflow of a particular health IT system. While some see CCD and CCR as competing standards, Vince Kuraitus of e-CareManagement argues that “the CCD and CCR standards are more complementary than competitive.” The basis of his opinion is the “right tool for the job” metaphor; in his view, HIEs’ adoption of CCD doesn’t say much on its own.

Summary

Integration and interoperability need file protocol standards, and as the healthcare IT industry keeps evolving, many of the ambiguities of the current standards will eventually (hopefully) be normalized and conformity will prevail. In the meantime, HL7 2.x, DICOM, and CCD/CCR are here to stay and will continue to be the lifeblood of integration and connectivity.

† http://www.hl7.org/about/index.cfm?ref=common

‡ http://www.corepointhealth.com/resource-center/hl7-resources/hl7-messages

˚ http://en.wikipedia.org/wiki/Health_Level_7#HL7_version_2.x

^ http://en.wikipedia.org/wiki/Continuity_of_Care_Document