Global trade in health data is nothing new. Adam Tanner, an Associate at the Institute for Quantitative Social Science at Harvard University and author of two books on personal data and privacy, including Our Bodies, Our Data: How Companies Make Billions Selling Our Medical Records (2017), wrote that back in 1956, one of the leading medical advertising agencies in New York sent staff to West Germany to survey the pharmaceutical market and the types of drugs that sold well.
These insights were compiled into reports that proved exceptionally popular among drug companies. As a result, the agency quickly expanded its data-mining business to other parts of the world and became the “heart of a for-profit global trade in anonymized patient data”. The $20 billion company still stands strong today under the name IQVIA, following its 2016 merger with Quintiles, a clinical research company.
A trade that concerns patients but keeps them in the dark
Primarily, the company collates and processes medical records, prescriptions, insurance claims, lab test results and so on, drawing on over half a billion patient records worldwide. In the US, such a business is not illegal, as there are no defined rules on who – physicians, hospitals, or patients – holds explicit rights to medical records.
The Health Insurance Portability and Accountability Act (HIPAA) prevents the disclosure of individually identifiable health information. So, if a person cannot be identified from given health information, those data are not subject to all of HIPAA’s requirements (the so-called HIPAA exception). However, de-identification is a complicated process that goes well beyond removing a person’s name. As technology advances, we cannot rule out the possibility of re-identifying an individual.
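To see why stripping names is not enough, consider a classic “linkage attack”: quasi-identifiers such as ZIP code, date of birth and sex, left in a de-identified record, can be joined against a public, identified dataset to recover a name. The sketch below uses entirely made-up records and a hypothetical public roll; it illustrates the technique, not any real dataset.

```python
# Illustrative linkage attack on made-up data: names were removed from the
# medical records, but the quasi-identifiers (zip, dob, sex) remain.
deidentified_records = [
    {"zip": "60637", "dob": "1984-03-12", "sex": "F", "diagnosis": "diabetes"},
    {"zip": "60615", "dob": "1990-07-01", "sex": "M", "diagnosis": "asthma"},
]

# A hypothetical public, identified dataset (e.g. a voter roll).
public_roll = [
    {"name": "Alice Smith", "zip": "60637", "dob": "1984-03-12", "sex": "F"},
    {"name": "Bob Jones",   "zip": "60614", "dob": "1975-11-30", "sex": "M"},
]

def link(records, roll):
    # Join the two datasets on the quasi-identifiers; a unique match
    # re-identifies the patient behind the "anonymous" record.
    matches = []
    for rec in records:
        hits = [p for p in roll
                if (p["zip"], p["dob"], p["sex"]) ==
                   (rec["zip"], rec["dob"], rec["sex"])]
        if len(hits) == 1:
            matches.append((hits[0]["name"], rec["diagnosis"]))
    return matches

print(link(deidentified_records, public_roll))
# → [('Alice Smith', 'diabetes')]
```

The attack succeeds whenever the combination of quasi-identifiers is unique in both datasets, which is why de-identification standards also constrain or generalize fields like dates and geographic codes rather than only deleting names.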
As AIMed reported last June, a class-action suit was filed in the District Court for the Northern District of Illinois against The University of Chicago Medical Center and Google, alleging that the information used in their collaboration was at risk of being re-identified. Nevertheless, many still regard data mining or trading of health information as a path to new discoveries and cures. Besides, the fractured nature of the American health system has both facilitated and complicated these issues.
Mayo Clinic is accused of becoming a recent participant in this secret global trade. Over the past two years, the medical institution has given 16 companies the green light to access its patient data through licensing deals. Mayo Clinic says it is embracing artificial intelligence (AI), believing in its potential to transform the way medicine and care are delivered, which makes data inescapable. Moreover, the institution says it does not sell data but partners with and invests in companies to co-develop new products.
Innovation or exploitation?
In most cases, patients are unaware of these deals. Nor do they have a say in withdrawing their participation. Mayo Clinic is certainly not alone: many other medical and healthcare entities are struggling to balance data privacy, data rights and innovation. This is what happens when technology moves ahead of regulation.
According to Professor Kayte Spector-Bagdady, a lawyer and bioethicist at the University of Michigan Medical School, patients fear that their healthcare data will be combined with other data obtained from smartphones and social media platforms to identify them. The result is known as a “shadow health record”: a collection of health data held outside the health system that can paint a clear picture of one’s health. Patients also worry that their health data will lead to financial gains or scientific advances that they will never know about, as in the case of Henrietta Lacks.
Spector-Bagdady and her team have suggested a new ethical framework for the use of patient data in academic centers, one that forbids sharing data with third parties. If the framework is adopted more widely, the development of AI and related technologies that depend on data will be affected in the long run. Mayo Clinic, on the other hand, is exploring federated learning, a machine learning technique in which an algorithm is trained across decentralized entities, each holding its own dataset, without any need to share the underlying information.
Still, as our attitudes towards privacy – what is considered acceptable and what is not – continue to change, it is hard to tell whether these measures will remain adequate in the years to come.