Nov 20, 2024

Musk Asks For Something SHOCKINGLY Private From X Users. They HAPPILY Oblige.

Elon Musk, X’s owner, has asked its users to upload X-rays, MRIs, CT scans and other medical images to Grok, and many have obliged.
  • 10 minutes
I think AI is going to be incredible for medicine. Actually, as it is for Grok right now, you can upload a PET scan and ask Grok to analyze it for you. And then you can compare that to what a doctor tells you [00:00:17] and see what the difference is. You can literally upload your image to Grok and it will analyze your MRI, your PET scan, whatever the case may be, and tell you what it thinks the probable issue is. [00:00:32] Elon Musk is looking to improve his health care-related AI venture by essentially encouraging X users to upload their personal health records to his AI chatbot, which is called Grok. [00:00:49] He referenced it in that speech, but there are some real privacy concerns that his supporters might not have considered. And I think it's important to think about that, especially if you're tempted to upload this very personal information and data to this AI chatbot. [00:01:04] So over the past few weeks, X users have unfortunately started submitting their X-rays, their MRIs, their CT scans, and other medical photos to Grok, all because Elon Musk basically asked them to on X. He wrote this: [00:01:20] "Try submitting X-ray, PET, MRI, or other medical images to Grok for analysis. This is still early stage, but it is already quite accurate and will become extremely good. Let us know where Grok gets it right or needs work." [00:01:37] I mean, if you're fortunate enough to have health care and you've gotten X-rays or a PET scan or an MRI, it's probably because your doctor ordered it. Shouldn't your doctor tell you what's going on? I don't know, but anyway, it doesn't stop there. [00:01:54] Apparently you can also submit other medical results as well. I'll give you an example of an interaction on X, where a woman named Barbara asks, "Will we be able to submit other medical results, such as lab work, so that Grok could analyze trends and explain terminology?"
[00:02:10] And Elon Musk responds with yes. Now, obviously people can do whatever they want, but they should also know about some of the downsides and potential risks in doing so, because there's nothing more personal than your medical records, which is why there are federal regulations in place to protect you [00:02:27] and protect your privacy as it pertains to your medical history and medical records. So Musk's public call-out, and users' willingness to upload their medical information, has alarmed medical privacy experts, because personal medical information that's shared via social media [00:02:45] is not bound by HIPAA regulations. Those are the privacy laws I'm referring to, and HIPAA does protect your personal health care info from being shared without consent. This is very personal information, and you don't exactly know [00:03:01] what Grok is going to do with it. "Posting personal information to Grok is more like, 'Let's throw this data out there and hope the company is going to do what I want them to do.'" That was a professor of biomedical informatics at Vanderbilt University. [00:03:18] And according to Malin, that's the person we just heard a quote from, if a tech company partners with a hospital to get data, that's a completely different story. Those deals typically do include agreements on how the information [00:03:36] is stored, shared and used. But in its privacy policy, X, formerly Twitter, has said it will not sell user data to a third party, though it does share it with related companies. And despite Musk's invitation to share medical images, [00:03:51] the policy also says X does not aim to collect sensitive personal information, including health data. So look, I would be very careful. I personally would not upload my super personal information, [00:04:07] including medical records or test results, to an AI chatbot, especially knowing that that data is not protected under HIPAA regulations.
And if people are fully aware of that and still choose to upload these [00:04:23] personal details about themselves into this AI chatbot, that's fine. But I don't think people are necessarily thinking about that, or even know about that. What do you think, John? Yeah, I think this potentially could be really bad for a lot of people [00:04:38] in ways that are pretty obvious if you think about it literally at all. As you pointed out, there was nothing stopping you from uploading an image of an X-ray to ChatGPT six months ago. With him encouraging people to do it, maybe it's just a thing that he's throwing out, or maybe there's a reason [00:04:54] that he wants people to do that, and he's making promises about not sharing it or whatever. Look, I guess I'm giving up on trying to convince people that trusting Elon Musk is a really bad idea, as well as respecting or admiring Elon Musk. Since I guess he's co-president now, I think we should probably treat him [00:05:12] with some respect. He is richer than us, after all, which means he's better than us. I don't think there's any reason to trust him with literally any of your data, including your tweets. By the way, I certainly would not be uploading PET scans, X-rays, blood workups. I think that's crazy. [00:05:29] I wouldn't want people to do that on any of the other platforms. I mean, all of them are run by people who are kind of a variation on the Elon Musk model, but I don't trust him at all. And I think it's honestly playing on people's desperation. I mean, you talked about how you should be going to a doctor, [00:05:45] but some people might not have a doctor, or certainly not one that they can go to if they're just concerned about something.
And so I think a lot of people might be willing to roll the dice with an AI chatbot, not knowing that in all likelihood it will be used to train the model, [00:06:00] or be sold to someone else, or leaked accidentally, or hacked at some point. And I don't like it at all. Yeah, I definitely think that he is soliciting this data because he wants it to train the chatbot, the AI; that's the way AI functions. [00:06:19] And look, what I was referring to is that if you've gotten an X-ray or an MRI scan or whatever it is, usually you get that after you've been referred by your primary care physician to do it, so your primary care physician can see if there are any issues, [00:06:36] and that gets analyzed by your doctor. I think there is a point to what you're saying, though, about desperation, because there are a lot of people in this country who don't have health insurance, aren't covered, and they tend to rely on Google search results to try to figure out [00:06:54] why they're ill or why they're noticing certain symptoms. And I don't think that's a solution to our broken health care system. And I'm worried not only about the privacy concerns here, but also about the fact that this AI chatbot might not be accurate in analyzing what's wrong [00:07:14] with someone, health-wise. It might actually diagnose someone with an illness they don't have, or miss the appropriate diagnosis. So that's another thing to keep in mind: are people going to be sober-minded as they rely on the results of what the AI tells them about their [00:07:32] own bodies and their own illnesses? So I do see some significant downsides here. And again, if there was some clarity, and if there was a very clear disclosure in his solicitations about the privacy issue here, then fine, [00:07:48] go ahead and do what you're doing. If you still want to upload your personal data, that's your right.
But I don't think people are fully aware of what the downsides are here, especially because it has this facade of wanting to help people, and wanting to help them in an area where I think Americans have been [00:08:07] neglected and screwed over quite a bit, and that's the health care industry. Yeah, 100%. I guarantee that people are already doing stuff like this, and I'm sure it is causing endless headaches for doctors. I'm sure their diagnoses are being constantly questioned. [00:08:26] And look, maybe 1 in 100 times it's right and should be questioned. But we already know that people go on WebMD and become convinced that they've got sarcoidosis and stuff; people already do that. And honestly, that was oddly specific. It pops up, you could get it. But yeah, I assume that this is probably making [00:08:44] doctors' lives a living hell. Oh, for sure, for sure. And yeah, I agree with you on that. Anyway, there are going to be a lot of bumps in the road when it comes to AI, and I think this is just one of the examples. But you know, [00:08:59] the other thing that could happen is you could have a more effective Congress that passes legislation to expand HIPAA regulations to things like AI. But I don't even know if members of Congress understand what AI is. [00:09:15] So it's a series of tubes, I think. I mean, any congressional hearing with tech executives makes me die a little inside because of how unaware they are of this technology, how it works, [00:09:31] the impact it has on people's lives, the potential areas of exploitation that exist. They're just completely out to lunch on those issues. So I wouldn't rely on Congress, unfortunately, but that would be another possibility.
[00:09:48] You know, certain laws and regulations that protect consumers, protect Americans, from having their personal data shared with third parties, with other companies, or potentially with other health care companies. Just something to keep in mind and consider. [00:10:04] Thanks for watching. If you become a member, you get to watch all this ad-free. Except, of course, for this ad. Still, hit the join button below.