My medical data
I have a problem.
I’ve taken an increasingly active interest in health. My own health, specifically. I’m a pretty curious person and I also have an obvious vested interest in the topic of my health. I also admit to some near neurotic inclination to optimize and overoptimize and optimize once again. The richness of quantifiable health data interpreted via the lens of my own qualitative lived experience proves irresistible.
Sources of data abound. My Oura. Apple Watch. Apple Health (which is somehow different than Apple Watch? Confusing). Lab results from my Kaiser health portal. Before Kaiser I was at OneMedical. But specialists kept their stuff in MyChart - which was all connected somehow but also all different. My genetic code - which I acquired from an old 23andMe purchase and saved down in my files.
More services pop up all the time! Just recently my Twitter timeline was abuzz with a service called Function Health (turns out they just closed a funding round - of course). It’s all the labs you could possibly want for a dollar a day and some analysis from clinicians - I assume that means dashboards. Before Function Health it was Superpower. Friends have gotten full body DEXA scans, that’s a whole new vector of medical data I haven’t even begun digging into.
It all makes for a wonderland of data to be constantly tracked and terminology to be Googled and numerical scales to contextualize.
But something’s bugging me.
It’s not my data
All these interested parties are generating data about me. It’s all related to my health.
Conceptually, I think about this output as my data.
It’s my body.
My health.
This data is literally a representation of me. My attributes and my body. The meat carcass which carries my mind and my soul on this giant rock called Earth that is hurtling through space. Hopefully for many more years to come.
But actually this data isn’t mine. It’s their data. The wealth of medical providers and private enterprises generating and storing it away. They own the data which I think of as mine.
American law is pretty confusing on this point. When discussing the topic of medical data I hear “HIPAA” thrown back pretty much immediately, but it’s not always clear to me that my conversational partner knows what “HIPAA” really means and, even if they do, I certainly do not.
To the best of my understanding, American law does afford consumers certain rights over their medical data. We have the right to request our data, and requests for data transfer must be honored - under some conditions and exceptions. Accessing and taking custody of that data cannot be subject to an excessive fee - whatever that means. We can grant, specifically request, or deny third-party access to that medical data - again, with some exceptions. Legal jargon and disclaimers abound. By the way, this only covers the medical providers governed by HIPAA - everyone else opts in out of the goodness of their heart.
But it’s not our data. We don’t retain it and we don’t decide how it’s stored and handled and protected and we don’t ultimately decide who accesses it.
There’s a moral and principled argument to be made here. That medical data is among the most personal and privileged data which exists, and that as a result consumers ought to own and control and manage and delegate their own data because it’s the right way to treat this type of data. That’s not the gripe I’m expressing right now.
It’s really annoying.
The reality of my data
My go-to source for understanding my medical data is ChatGPT. I’m sure I’m hardly alone on this.
Using ChatGPT for medical help means downloading files and PDFs and copying and pasting numerical values and hunting for old records. These records are strewn across my medical providers’ portals and saved-down files, and sometimes it means searching through consumer apps like Oura and Apple Health for charts to screenshot and paste in. It’s a tedious process, it’s a questionable practice from a security standpoint, and it’s repetitive. It means going back and doing the same thing and attaching the same files time and time again. But it’s the best option I’ve got.
As a byproduct it also means that ChatGPT’s memory is the closest thing to a unified health record that I possess. The variety of sources of records and conversations and context I’ve input into the app is a more comprehensive representation of my health than any one source I’m pulling from to upload.
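To make the tedium concrete: Apple Health will export everything as an export.zip containing one giant export.xml of Record elements, and just pulling a single metric out of it takes a script. A minimal sketch (the file path and the chosen record type are my assumptions, not anything these apps provide for you):

```python
import xml.etree.ElementTree as ET
from collections import defaultdict

def summarize_health_export(xml_path, record_type="HKQuantityTypeIdentifierRestingHeartRate"):
    """Average one record type by date from an Apple Health export.xml."""
    by_date = defaultdict(list)
    # iterparse keeps memory flat; export.xml files can run to hundreds of MB
    for _, elem in ET.iterparse(xml_path, events=("end",)):
        if elem.tag == "Record" and elem.get("type") == record_type:
            date = elem.get("startDate", "")[:10]  # keep the YYYY-MM-DD prefix
            try:
                by_date[date].append(float(elem.get("value")))
            except (TypeError, ValueError):
                pass  # some Record types carry non-numeric values
        elem.clear()
    return {d: sum(vals) / len(vals) for d, vals in by_date.items()}
```

And that only covers one source, in one format - the lab PDFs and portal screenshots still get handled by hand.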
This bugs me for reasons I find difficult to put into words.
On the one hand, medical data is a mess - aggregating disparate sources, reconciling unmatched metrics, discerning quality and import according to source - and an intelligence layer like ChatGPT is uniquely capable of working through and making sense of the mess. In addition, the ability to interact and provide a qualitative interpretive layer on top of this data, as ChatGPT can - and to do it patiently and infinitely and compassionately - is genuinely novel and helpful.
On the other hand, letting ChatGPT become the new repository for my medical data simply continues the cycle of frustration with this question of “my medical data is not really mine - it’s a tool for private interests to lock me in.” In some ways ChatGPT is worse than all of them, because the data stored is unknown and unmanageable. The manner in which chatbots store and manage memory is opaque and insecure. The makers of the tools do not understand it themselves, as near as I can tell. And as of today it’s not protected in any legal or obligatory sense. My data is safe to the fullest extent of OpenAI’s competence and goodwill. Which isn’t much of a comfort.
My ask
I have a need.
What I really want is MyChart - but I want it to actually be mine.
I want to aggregate the representation of my medical data across all its sources, and I want it to live in one place, and I want that place to be mine. I guess all those other guys (the companies and the providers, that is) will own their respective shards of the data that is me. There’s nothing much I can do about that.
But I want my whole to at least be comprehensive and aggregated and, in some loose sense, organized to the extent that it is possible. Getting it actually well organized is likely impossible - for technical reasons there is a triangular tension between comprehensiveness, uniqueness, and accuracy - but surely some kind of internal scaffolding isn’t too much to ask.
I want it to be up to date.
I want it to be portable. I want to be able to share access to some or all of it to providers, machine and human, at will so as to give them context to get advice. I want to be able to revoke access.
I want it to be secure. Obviously.
And that’s really about it, I think. The dashboards and the intelligence and the chatbots and the reminders and the trackers and the push notifications and the options and the offers and the counseling are certainly needed, but not here. Not subsuming and living on top of my data. That can all go over there, where it should be.
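What “share at will, revoke at will” could mean in practice is not complicated. A toy sketch of the idea - every name here is hypothetical, and a real system would need cryptographic enforcement rather than an in-memory dictionary:

```python
from dataclasses import dataclass, field

@dataclass
class HealthVault:
    """Toy model: the owner holds the records; others get revocable, scoped grants."""
    records: dict = field(default_factory=dict)  # category -> data, e.g. "labs"
    grants: dict = field(default_factory=dict)   # grantee -> set of categories

    def share(self, grantee, categories):
        # Grant a provider (human or machine) access to specific categories only.
        self.grants.setdefault(grantee, set()).update(categories)

    def revoke(self, grantee):
        # Revocation is total and immediate: the grantee simply disappears.
        self.grants.pop(grantee, None)

    def read(self, grantee, category):
        if category not in self.grants.get(grantee, set()):
            raise PermissionError(f"{grantee} has no grant for {category}")
        return self.records[category]
```

The point of the sketch is the shape, not the code: the records sit with the owner, and everyone else holds a grant that can vanish.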
The problem is that I don’t think there’s the oxygen for something like this to exist in our politico-economic system.