Amazon patents Alexa tech to tell if you’re sick, depressed and sell you meds


Deleted member 330960

Guest
Hey Bezos...

torvalds_bird.jpg
 
Upvote
83 (93 / -10)

helel ben shachar

Ars Legatus Legionis
13,549
Subscriptor++
... and if Alexa hears your partner moan, would it offer morning-after pills?

More likely when it hears "It's ok, it happens to lots of guys...." you get a free sample of blue pills in the mail.

Or it hears one of my kids' voices, mistakes it for mine, and asks if I'd like to be connected with a doctor so that my testicles might be reattached.
 
Upvote
59 (59 / 0)

trimeta

Ars Praefectus
5,618
Subscriptor++
Are people talking to their Alexas enough for it to get a meaningful baseline for the emotional component of this? I could sort of see the "if you sound like you have a sore throat, recommend cough medications" application, but anything beyond that seems like you'd need to interact with your Alexa a lot more (and do more than issuing it simple commands) for it to get meaningful results.
 
Upvote
15 (16 / -1)

Edgar Allan Esquire

Ars Praefectus
3,097
Subscriptor
Are people talking to their Alexas enough for it to get a meaningful baseline for the emotional component of this? I could sort of see the "if you sound like you have a sore throat, recommend cough medications" application, but anything beyond that seems like you'd need to interact with your Alexa a lot more (and do more than issuing it simple commands) for it to get meaningful results.
That's the future, no more "lonely cat lady" tropes. It's going to be an elderly person sitting in a room talking to their home-monitor and media center AI for companionship... I just bummed myself out...
 
Upvote
44 (44 / 0)

H2O Rip

Ars Tribunus Militum
2,130
Subscriptor++
Are people talking to their Alexas enough for it to get a meaningful baseline for the emotional component of this? I could sort of see the "if you sound like you have a sore throat, recommend cough medications" application, but anything beyond that seems like you'd need to interact with your Alexa a lot more (and do more than issuing it simple commands) for it to get meaningful results.

Probably, yeah - though I imagine it would apply more if you're using some of the skills that encourage more interaction like that.
 
Upvote
2 (2 / 0)
My mind would be blown to see the technology that allows for an accurate diagnosis of clinical depression based on the very minimal interaction that humans typically have with voice assistants. And that's not even getting into the legal framework that doesn't yet exist, but would be required to allow machines to prescribe anti-depressants.

ps. I'm a depression researcher.
 
Upvote
63 (63 / 0)

jdale

Ars Legatus Legionis
18,346
Subscriptor
It's worth noting that your personal health information as generated by the medical system is private and protected. But your health information as obtained by a system like this, which is not a medical professional (not to mention your purchases of over-the-counter medications), is not protected in any way and can be freely processed and sold to advertisers, insurers, and other interested parties.

It's a really big hole in our privacy laws.
 
Upvote
55 (55 / 0)
I'm all for this (other than the patent aspect).

Wait, hear me out.

Because then Alexa data becomes covered under HIPAA laws. Can't have it both ways!


What is bothering me here, though, is that if this gets used for any mental health aspects, versus just "do you have a sore throat?", then there are some deeper issues. At a minimum I'd want it reviewed and accredited by the APA even for basic analysis validity/reliability, much less for any recommendations it then provides.

What's also bothering me, tangentially, is that this patent sounds suspiciously like it has prior art in vocal analysis, and is instead just "doing it on a computer" - and thus should be thrown out under Alice.
 
Upvote
22 (22 / 0)

jdale

Ars Legatus Legionis
18,346
Subscriptor
My mind would be blown to see the technology that allows for an accurate diagnosis of clinical depression based on the very minimal interaction that humans typically have with voice assistants. And that's not even getting into the legal framework that doesn't yet exist, but would be required to allow machines to prescribe anti-depressants.

ps. I'm a depression researcher.

Prescribing anti-depressants is probably not on the table, but recommending over-the-counter drugs like cough suppressants is.

But this is a pretty narrow view of what this is about. Mood has significant effects on purchasing decisions. E.g. see https://www.theatlantic.com/business/ar ... ss/380986/ Amazon may not have an interest in whether you need anti-depressants, but they do want to know whether they should be serving you ads for televisions, vacation travel, or comfort food. Getting information on your health and mental state is an ideal way to take advantage of you.
 
Upvote
25 (25 / 0)

Dilbert

Ars Legatus Legionis
34,009
This would be just the same as talking to an AI entity like we've seen in scifi for decades, if the processing were local. But it isn't. The processing is remote. The local device is just a voice recorder sending your voice to the cloud; the cloud does speech-to-text, finds an answer, sends it back, and the device then does text-to-speech.

The backend keeps everything you've ever sent them and learns more about you as time goes on.

Anyone who's still fine with this after they know how it works, well, perhaps they deserve the consequences. But the vast majority don't know how it works.... :/
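
To make the split concrete, here's a toy sketch of that round trip. Every function name here is hypothetical, purely for illustration - none of this is Amazon's actual API. The point is where the work (and the data) lives:

```python
# Illustrative sketch of the device/cloud split described above.
# All names are made up; the hard-coded strings stand in for real audio/ML.

def device_capture() -> bytes:
    """Local device: only records audio. No understanding happens here."""
    return b"alexa, play some music"  # stand-in for raw microphone data

def cloud_speech_to_text(audio: bytes) -> str:
    """Remote: the provider, not the device, interprets your voice."""
    return audio.decode("utf-8")

def cloud_find_answer(text: str, history: list) -> str:
    """Remote: answers the request AND appends to a server-side history."""
    history.append(text)  # everything ever sent is retained in the cloud
    return "Playing music (request #%d)" % len(history)

def cloud_text_to_speech(answer: str) -> bytes:
    """Remote: synthesized audio is sent back for local playback."""
    return answer.encode("utf-8")

# One full interaction: record locally, process remotely, speak locally.
backend_history = []
audio = device_capture()
reply = cloud_text_to_speech(
    cloud_find_answer(cloud_speech_to_text(audio), backend_history)
)
print(reply.decode("utf-8"))
print(backend_history)  # the cloud's copy persists after the exchange ends
```

Nothing in the device-side function could run offline in this model, and the history list never leaves the backend - that's the part most users don't picture.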
 
Upvote
4 (5 / -1)
Also, Jon, I really hope that you're not conflating sadness with depression. They're not the same. Amazon's blurb only mentions "sadness", so I'm not sure why the article brings in depression:

Amazon has patented technology that could let Alexa analyze your voice to determine whether you are sick or depressed and sell you products based on your physical or emotional condition.

vs

"For example, physical conditions such as sore throats and coughs may be determined based at least in part on a voice input from the user, and emotional conditions such as an excited emotional state or a sad emotional state may be determined based at least in part on voice input from a user," the patent says. "A cough or sniffle, or crying, may indicate that the user has a specific physical or emotional abnormality."

These aren't equivalent (sadness and depression are completely different in a mental health context, which the article does bring up), so I'm curious how "depressed" came into this (if it's somewhere else in the patent, that's fine, but what's quoted here doesn't have it).
 
Upvote
18 (18 / 0)