Among all aspects of regulatory compliance as an addiction treatment provider, the privacy and security of Protected Health Information (“PHI”) and the protection of Patient Identifying Information (“PII”) appear to be the most widely misunderstood. The requirements for the privacy and security of patient information stem in part from HIPAA (the Health Insurance Portability and Accountability Act of 1996) and the HITECH Act of 2009 (the Health Information Technology for Economic and Clinical Health Act), the latter of which was created to encourage the adoption of electronic health records (“EHR”) and supporting technologies, so that a patient’s “whole health” picture could be analyzed and placed before any given physician. [Note: EHRs are aggregations of a patient’s health data; EMRs, or Electronic Medical Records, are the electronic equivalent of the charting kept within a single health care provider’s office.]
A patient’s PHI is increasingly kept as an electronic medical record – a single practice’s digital version of a patient’s chart. An EMR contains the patient’s medical history, diagnoses and treatments by a particular physician, nurse practitioner, specialist, dentist, surgeon or clinic.
In contrast, electronic health records are larger data sets, which are aggregations of data from EMRs, designed and intended to be shared with other providers, so authorized users may instantly access a patient’s EHR from across different healthcare providers and platforms.
Within this space of Health Information Management, the governing laws have essentially two main parts: the Privacy Rule, and the Security Rule.
The Privacy Rule addresses what PHI and PII must be protected from unauthorized disclosure. The Security Rule essentially addresses the steps that a provider or holder of PHI and PII must take to secure such data, including in its transmission to another authorized party (often referred to as a “Business Associate”). “Patient Identifying Information” includes “the name, address, social security number, fingerprints, photograph, or similar information by which the identity of a patient can be determined with reasonable accuracy and speed either directly or by reference to other publicly available information.” Both PHI and PII must be secured and protected.
The Privacy Rule describes the ways in which covered entities can use or disclose PHI, including for research purposes. In general, the Rule allows covered entities to use and disclose PHI for research if authorized to do so by the subject in accordance with the Privacy Rule. In addition, in certain circumstances, the Rule permits covered entities to use and disclose PHI without authorization for certain types of research activities. For example, PHI can be used or disclosed for research if a covered entity obtains documentation that an Institutional Review Board (IRB) or Privacy Board has waived the requirement for authorization or allowed an alteration. The Rule also allows a covered entity to enter into a data use agreement for sharing a limited data set. There are also separate provisions for how PHI can be used or disclosed for activities preparatory to research and for research on decedents’ information.
However, if the data is de-identified, then the data itself is not PHI as a matter of law, and therefore is not protected by HIPAA’s Privacy Rule. PHI excludes health information that is de-identified according to specific standards. Health information that is de-identified can be used and disclosed by a covered entity without authorization or any other permission specified in the Privacy Rule.
De-identified patient data is health information from a medical record that has been stripped of all “direct identifiers”—that is, all information that can be used to identify the patient from whose medical record the health information was derived. Under the Health Insurance Portability and Accountability Act (HIPAA), there are 18 direct identifiers that are typically present in patient medical records:

- Names;
- Geographic subdivisions smaller than a state (e.g., street address, city and ZIP code);
- All dates related to an individual (e.g., date of birth, admission date);
- Telephone numbers;
- Fax numbers;
- Email addresses;
- Social Security numbers;
- Medical record numbers;
- Health plan beneficiary numbers;
- Account numbers;
- Certificate/license numbers;
- Vehicle identifiers and serial numbers, including license plate numbers;
- Device identifiers and serial numbers;
- Web universal resource locators (URLs);
- IP addresses;
- Biometric identifiers such as fingerprints and voice prints;
- Full-face photographic images; and
- Other unique identifying numbers, characteristics or codes.
Under HIPAA, there are two recognized methods to de-identify patient data. The first is the “safe harbor” method, in which all 18 identifiers are removed. The second is the “expert determination” (or “statistical”) method, in which a retained expert determines which of the 18 identifiers can be retained without creating more than a “very small” risk that the data could be re-identified. A third, related option is the “limited data set,” in which the organization removes 16 of the identifiers; the result remains PHI, but it may be shared for research and certain other purposes under a data use agreement and special security precautions.
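For illustration only, the “safe harbor” method can be sketched as a simple field-stripping step. The field names below are hypothetical and greatly simplified; a real de-identification effort must also scrub free-text notes, generalize dates and ZIP codes, and address the catch-all “other unique identifiers” category:

```python
# Hypothetical sketch of safe-harbor de-identification: drop every field
# that falls into one of HIPAA's 18 direct-identifier categories.
DIRECT_IDENTIFIER_FIELDS = {
    "name", "street_address", "city", "zip_code",
    "birth_date", "admission_date", "phone", "fax", "email",
    "ssn", "medical_record_number", "health_plan_id", "account_number",
    "license_number", "vehicle_id", "device_serial", "url", "ip_address",
    "fingerprint", "voiceprint", "photo",
}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with direct-identifier fields removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIER_FIELDS}

patient = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "zip_code": "33301",
    "diagnosis": "F11.20",    # clinical data is retained
    "lab_result": "negative",
}

print(deidentify(patient))  # {'diagnosis': 'F11.20', 'lab_result': 'negative'}
```

Once the identifying fields are gone, what remains—diagnoses, lab results, treatment histories—is exactly the kind of clinical data that, as discussed below, has become a valuable commodity.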
As a result, both PHI and de-identified health data have been lawfully used for years to help find medical breakthroughs, improve care, estimate the costs of care, and support public health initiatives.
More recently, “big data” companies like Google, Amazon and Apple have been getting into the health care data space. Back in 2016, Google partnered with England’s NHS through Google’s “DeepMind” subsidiary, which was granted access to the medical records of 1.6 million patients who had been treated at some time by three major hospitals. This data mining was a joint effort between Google and the Royal Free NHS Trust to develop a diagnostic app for detecting acute kidney injury. (See www.royalfree.nhs.uk/news-media/news/google-deepmind-qu/.)
However, if one can de-identify the information, meaning remove those aspects of a chart, file or data set that contain PII, then the rules governing privacy and security effectively go away.
So why does this matter?
This brings us to the subject of this post – the massive underground economy that has evolved around the brokering of de-identified PII and PHI to third parties.
“Why would anyone pay for that,” you ask?
The offspring of that evolution in medical data brokering and big number crunching is a not-so-new economy that is using your health care data, and that of your patients, for profit.
One of the most prolific writers and outspoken critics of this underground economy of medical data brokering is Adam Tanner, a fellow at Harvard University’s Institute for Quantitative Social Science and author of “What Stays in Vegas: The World of Personal Data — Lifeblood of Big Business — and the End of Privacy as We Know It” and “Our Bodies, Our Data: How Companies Make Billions Selling Our Medical Records.” According to an interview Tanner gave to Time Magazine in 2017, “prescription records, blood tests, doctors’ notes, hospital visits and insurance records are all sold to commercial companies, which gather years of health information on hundreds of millions of people.” That de-identified data is then sold to various entities, including insurance companies, researchers and, you guessed it, Big Pharma.
In a comprehensive article published in Scientific American, Tanner summarizes it best:
“Health researchers are not the only ones, however, who collect and analyze medical data over long periods. A growing number of companies specialize in gathering longitudinal information from hundreds of millions of hospitals’ and doctors’ records, as well as from prescription and insurance claims and laboratory tests. Pooling all these data turns them into a valuable commodity. Other businesses are willing to pay for the insights that they can glean from such collections to guide their investments in the pharmaceutical industry, for example, or more precisely tailor an advertising campaign promoting a new drug.”
How much are they willing to spend?
PharmaceuticalCommerce.com reported back on May 8, 2019, that the EMR market alone is growing at 6% annually and now tops $31 billion a year. Within the EMR economy is the recognition that Big Pharma has been willing to pay big bucks for EMR vendor data, such as prescriber habits, outcomes-related data, or connecting the e-prescribing function with copay and formulary data. Pfizer is on record as stating that, as of 2016, it was spending $12 million annually to acquire anonymized health data.
In addiction medicine, this sharing of information is very important among researchers working to find long-term cures to heal the addicted brain. For insurance carriers, it may become a source to help identify effective “evidence-based outcomes” and modalities. Big Pharma, for its part, wants to follow who is prescribing MAT (and who is not) to get everyone on its latest and greatest magic pills.
So the takeaway question is this – if you are a health care provider/addiction treatment center, do you know what’s going on with the patient health data you are uploading to your EMR? And equally important, is your EMR vendor legally selling your acquired data, and then cutting you out? What about the patients? Who are the “brokers” in this space?
On this point (and a topic of future blog posts on this subject), most states, including Florida, have adopted laws providing that such patient data is owned by the health care provider, not the patient (and thus, not the EMR vendor either).
So is your EMR provider controlling or even selling your patients’ data, without your knowledge or consent? And if so, to whom, and why?
Thanks again for reading, and please “Like” our Facebook page where we share daily news articles of interest. https://www.facebook.com/soberlawnews
Latest posts by Jeffrey Lynne:
- Travel Assistance, Patient Financial Responsibility, and Good Intentions - October 7, 2019
- Harm Reduction – Supervised Injection Sites - October 3, 2019
- DCF Updates Rules Regulating SUD Treatment Providers - August 14, 2019