Steve Alder
Editor-in-Chief, HIPAA Journal

Are you concerned about virtual assistants and HIPAA compliance? Can intelligent personal assistants be used in a healthcare setting without violating HIPAA Rules or are they a HIPAA violation waiting to happen? In this post we explore whether virtual assistants and HIPAA compliance are compatible and which tech companies are developing voice and AI technology that can be used in connection with patient health information.

 

Uses of an Intelligent Personal Assistant in Healthcare

An intelligent personal assistant makes life easier. Voice commands can be used to launch applications, play music, speed up internet searches, send text messages, and transcribe notes from speech to text.

 

They are extensively used in the home and many people have started using them at work. At CES 2019, Google reported that Google Assistant would be available on 1 billion devices by the end of January 2019, and Amazon announced that 100 million Amazon Alexa devices had already been sold.

 

In a healthcare setting, virtual assistants have many potential uses. They could be used to transcribe medical notes, enter information into electronic health records (EHRs), retrieve patient data from EHRs, and perform a range of different functions in hospitals, clinics, and physicians’ offices. Patients could use an intelligent personal assistant to get medical advice or monitor conditions such as diabetes. They are also valuable tools that can help to engage patients in their healthcare.

 

However, there is the issue of virtual assistants and HIPAA compliance. The developers of the software that runs on smart speakers and the AI that powers virtual assistants need to ensure that safeguards are incorporated to protect the privacy of patients and make sure any health information sent through virtual assistants is kept secure and protected from unauthorized access.

 

Virtual Assistants and HIPAA Compliance Requirements

AI-powered virtual assistants such as Amazon Alexa, Siri, and Cortana certainly have great potential to improve efficiency and productivity in healthcare. Doctors may even consider using their Google Home or Apple HomePod in the office. However, while convenient, virtual assistants make HIPAA compliance violations a very real possibility. HIPAA compliance requirements must be addressed before a device such as an Amazon Echo, Google Home, or Apple HomePod could be used in a healthcare setting.

 

Before a virtual assistant could be used in connection with any protected health information (PHI) – health information that includes any of 18 identifiers that would allow a patient to be identified – controls must be put in place to keep that information secure. That means controls on the device itself, on any cloud services that are accessed, and on any software or systems that interact with the virtual assistant.

 

The HIPAA Security Rule requires safeguards to be implemented to prevent PHI from being accessed by unauthorized individuals. Any PHI must be protected at rest and in transit, so the device would need end-to-end encryption to prevent the interception of data.
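As an illustration of the "in transit" requirement, any connection carrying PHI should be restricted to modern, verified TLS. The Python sketch below is illustrative only – the article does not describe any particular implementation – and shows how a client could refuse legacy protocol versions and unverified certificates before transmitting anything:

```python
import ssl

# Build a TLS context suitable for transmitting PHI: modern protocol
# versions only, with hostname checking and certificate verification.
def phi_transport_context() -> ssl.SSLContext:
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse SSL/early TLS
    ctx.check_hostname = True                     # verify server identity
    ctx.verify_mode = ssl.CERT_REQUIRED          # reject unverified certs
    return ctx
```

Protecting PHI at rest would additionally require disk- or field-level encryption on the device itself and in any cloud storage it uses.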

 

A device must only accept commands from authorized individuals. A physician could dictate notes, but it should not be possible for just anyone to input notes into an EHR or send an order to a pharmacy via the device. Virtual assistants would need to be programmed to prevent abuse, and controls implemented to ensure mistakes are not made, such as recording a different drug from the one mentioned by a physician.
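The control described above amounts to an authorization gate in front of any PHI-touching command. The following Python sketch is purely hypothetical – the function, command names, and voiceprint IDs are illustrative, not part of any vendor's API:

```python
# Hypothetical sketch: only enrolled clinicians may run PHI commands.
AUTHORIZED_SPEAKERS = {"dr_smith_voiceprint"}   # enrolled voiceprint IDs
RESTRICTED_COMMANDS = {"add_ehr_note", "send_pharmacy_order"}

def handle_command(speaker_id: str, command: str, payload: str) -> str:
    """Execute a voice command only if the speaker is authorized for it."""
    if command in RESTRICTED_COMMANDS and speaker_id not in AUTHORIZED_SPEAKERS:
        return "denied: speaker not authorized for PHI commands"
    # Non-restricted commands (e.g. playing music) need no PHI check.
    return f"executed {command}"
```

In a real deployment the speaker ID would come from a voice-biometric enrollment step, and every allowed or denied command would also be written to an audit log, as the Security Rule's audit-control requirement expects.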

 

HIPAA also requires any individual or company that needs access to PHI to enter into a contract – a business associate agreement – with a HIPAA covered entity prior to being provided with PHI. The business associate agreement provides reasonable assurances that the service provider is aware of its responsibilities with respect to PHI. Without a signed business associate agreement, a virtual assistant is not HIPAA compliant.

 

The race is certainly on to make virtual assistants HIPAA compliant and Apple, Google, Amazon, and Microsoft are all working furiously to become the first major tech firm to develop an intelligent personal assistant or device that can be used in healthcare.

 

Amazon has been working with the Mayo Clinic, which has already developed an Alexa first-aid skill to provide consumers with advice on various medical problems, although the solution is not yet suitable for use in hospitals.

 

Orbita is also working with the Mayo Clinic and has developed a voice platform that it claims is HIPAA compliant and suitable for use in healthcare enterprises. Sopris Health has likewise developed an intelligent digital assistant that can transcribe voice notes and send them to an EHR system, but the tech giants Amazon, Microsoft, Google, and Apple have yet to launch HIPAA compliant virtual assistants or devices.

 

Amazon Alexa and HIPAA Compliance

Amazon is working on making Amazon Alexa HIPAA compliant. In 2018, Amazon set up a health and wellness unit within the Alexa team that was tasked with developing the platform for use in healthcare. Amazon also launched the Alexa Diabetes Challenge in 2018 to encourage developers to build Alexa-based apps that help patients manage type 2 diabetes.

 

Alexa is already being used to improve patient experiences in hospitals, such as allowing patients to call nurses to their rooms. New York’s Northwell Health is planning on using Amazon Alexa in private rooms to allow patients to access their medical records, and Boston Children’s Hospital is trialing Alexa and various other voice technologies.

 

Amazon already signs a business associate agreement covering cloud services such as AWS and EC2, although Amazon Alexa and Amazon Echo are not currently covered by Amazon’s business associate agreement. That could change soon. There are rumors that Amazon is on the cusp of launching a HIPAA compliant Amazon Echo device.

 

Google Assistant and HIPAA Compliance

Google Assistant and Google Home devices have certainly taken the consumer market by storm. Google Assistant is the most widely used virtual assistant and tests have shown it to be the best performing voice assistant on the market, although the breakthrough into healthcare has not yet come.

 

The voice-powered care assistant provided by Aiva Health has received considerable investment from both the Alexa Fund and the Google Assistant Investment Program. This investment is in line with Google's effort to expand its healthcare offerings, and Google has also absorbed DeepMind's healthcare unit, which developed a virtual assistant for doctors and nurses. That system is currently being trialed in the UK.

 

The technology behind Google Assistant certainly has tremendous potential for use in healthcare, although it is not yet covered by Google’s business associate agreement so cannot be used in connection with PHI.

 

Siri and HIPAA Compliance

While the dictation feature on Apple devices is convenient and has the potential to be used in healthcare, its speech-to-text processing takes place in Apple's data centers rather than on the device itself. That means any PHI dictated to the device will be transferred to Apple's data centers. Regardless of the controls in place to ensure data cannot be intercepted, without a business associate agreement from Apple, use of the dictation feature with PHI would be considered a HIPAA violation.

 

The Apple HomePod is fine for use in the home, but it cannot yet be used in a healthcare environment in connection with PHI. Apple will not enter into a business associate agreement with a healthcare provider that includes either Siri or the Apple HomePod, so neither can be considered HIPAA compliant.

 

Cortana and HIPAA Compliance

Cortana is not HIPAA compliant and is not included in Microsoft's business associate agreement, so it cannot be used with healthcare data. However, Microsoft has been working on developing voice-activated, AI-powered virtual health assistants and chatbots.

 

The Microsoft Healthcare Bot is now generally available for healthcare providers to develop their own HIPAA-compliant virtual assistants for use in a variety of healthcare settings. Several healthcare organizations have already started using the technology. Premera Blue Cross has developed Premera Scout, a Microsoft Healthcare Bot-based service that provides members with claims information, and Quest Diagnostics has developed a website solution based on the platform.

 

Summary

There is certainly great potential for intelligent personal assistants in healthcare, but today it is not feasible to use virtual assistants without HIPAA compliance guarantees. While the major virtual assistants are not covered by business associate agreements, that is likely to change soon. Virtual assistants and HIPAA compliance will soon be a reality, and before long they are likely to be extensively used in healthcare facilities to improve efficiency and productivity.

That said, even when developers certify that their devices meet HIPAA requirements, there will still be considerable potential for HIPAA violations. HIPAA compliance is not only about incorporating privacy protections and technical safeguards into virtual assistants. Users will also need to ensure that they use their virtual assistants in a HIPAA compliant manner.