
AI and Business Associates

johnehaskell • Feb 26 • 3 min read



As the use of artificial intelligence ("AI") in health care grows, it's important for the providers who deploy AI, and the vendors that supply it, to understand how HIPAA constrains AI vendors' use of protected health information ("PHI").


Tech vendors are classified as "business associates," and their use of PHI is very limited under HIPAA. To understand this limitation, it's important to understand the role business associates play in the world of HIPAA. The business associate provision was drafted, more or less, to allow covered entities to disclose PHI to entities that provide them with certain support services, or that perform certain functions or activities on their behalf. Importantly, business associates are narrowly defined to include only those persons or entities that perform certain functions, activities, or services to, for, or on behalf of a covered entity. Those specific functions, activities, and services are found in the definition of "business associate." A tech vendor, for example, that uses PHI for a function, activity, or service not found in the definition of "business associate" is not a business associate. Finally, covered entities are permitted to share PHI with their business associates without an authorization signed by the individual whose PHI is shared.


To ensure the privacy of the PHI shared with business associates, HIPAA puts strict limits on how business associates can use and disclose that information. First, the Privacy Rule permits a business associate to "use or disclose [PHI] only as permitted or required by its business associate contract . . . or as required by law." Second, the Privacy Rule requires a business associate to obtain permission in its business associate agreement ("BAA") with a covered entity for certain uses and disclosures that fall outside of those required to provide the functions, activities, or services. For example, a business associate must receive affirmative permission in the BAA to de-identify PHI, to use PHI for its own proper management and administration, and to carry out its legal obligations.

Collectively, these permissions and restrictions highlight that business associates are generally, with the aforementioned very narrow exceptions, not permitted to use PHI for their own purposes. The U.S. Department of Health and Human Services ("HHS") addresses the question of whether a business associate may use PHI for its own marketing purposes in an FAQ. In response to the question, HHS writes:


While covered entities may share protected health information with their contractors who meet the definition of “business associates” under the HIPAA Privacy Rule, that definition is limited to contractors that obtain protected health information to perform or assist in the performance of certain health care operations on behalf of covered entities. Thus, business associates, with limited exceptions, cannot use protected health information for their own purposes.

Those are the bounds within which business associates can use and disclose PHI. Next, we look at the operational issues tech vendors need to overcome to comply with HIPAA.


AI needs data. It needs data both to provide an output (what providers would use, in the case of health care) and to train the underlying large language model ("LLM") or algorithm. Providing AI to support a health care provider's operations is in the same vein as providing it with an EMR, a transcription service, or other technology. The problem with AI lies in the use of PHI to train the underlying LLM or algorithm.


Again, training an LLM or algorithm is not a problem, per se. The problem comes when the LLM or algorithm trained on PHI underlies AI that is marketed to other users. For example, health care provider A uses an AI tool to help organize medical charts. The underlying LLM or algorithm is trained on the PHI fed into it. The tech vendor then markets the AI, supported by that underlying LLM or algorithm, to other health care providers. The tech vendor in this case is using PHI for its own purposes: to train its LLM or algorithm.


One solution is to use a local LLM, where the LLM is hosted on the user's own hardware. This approach has multiple advantages (and drawbacks), including, crucially for our conversation here, control over the data and assurance that it is not used to train a model supporting AI marketed to other users.
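To make the idea concrete, here is a minimal sketch of querying a locally hosted model using the open-source llama-cpp-python library. The model file path and prompt are hypothetical placeholders; the point is that both the model weights and the PHI-bearing prompt stay on the provider's own hardware.

# A minimal sketch, assuming the open-source llama-cpp-python library
# and a model file already downloaded to local disk. The model path
# and prompt below are hypothetical placeholders.
from llama_cpp import Llama

# Load the model from local storage; inference runs entirely on the
# user's own hardware, so prompts containing PHI are never sent to a
# vendor's servers or used to train a shared model.
llm = Llama(model_path="./models/local-model.gguf")

# Example query; in practice the prompt might contain chart text (PHI).
output = llm("Summarize this chart note: ...", max_tokens=256)
print(output["choices"][0]["text"])

Because nothing in this setup calls out to an external service, the secondary-use problem described above does not arise; the tradeoff is that the provider bears the hosting and maintenance burden itself.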


These data flows and (potential) secondary uses are the primary concern for covered entities using AI. Yes, signing a BAA and ensuring the tech vendor meets the requirements of the Security Rule are important, but they are not the only aspects of compliantly using a tech vendor.