Amazon Is Working on a Device That Can Read Human Emotions

Read more on Bloomberg

Contributions

  • Honestly, this is exciting! (Yes, there are definitely some potential negatives, it’s tech after all).

    “Eventually the technology could be able to advise the wearer how to interact more effectively with others...” This could be so helpful for people who are dealing with varying levels of PTSD, anxiety, autism spectrum conditions, and a variety of other disorders/conditions.

    It can also teach us more about specific emotional disorders, how to potentially better handle emotions (or the lack of them), and in general provide interesting data for consumer research.

    Where do I sign up for the beta!

  • According to a person familiar with the project, it will use “microphones paired with software that can discern the wearer’s emotional state from the sound of his or her voice.”

    It is a strange concept. What would be the benefit unless there are much more powerful algorithms that can do something with that data input? At the moment, as smart as Alexa is, could Amazon really do anything constructive with the tech? And who would want a device strapped to them telling them that they are “having a bad day”? (A rough sketch of what such voice-analysis software might look like follows below.)
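
For anyone wondering what “software that can discern the wearer’s emotional state from the sound of his or her voice” might involve mechanically, here is a minimal sketch of one common approach (acoustic features plus a simple classifier). It assumes the librosa and scikit-learn libraries; the clip file names and emotion labels are invented placeholders, and none of this is based on Amazon’s actual system.

```python
# Minimal sketch of voice-based emotion classification: summarize each clip
# with MFCC and pitch statistics, then fit a simple classifier.
# Assumes librosa and scikit-learn; the file names and labels below are
# placeholders, not anything Amazon has described.
import librosa
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def voice_features(path: str) -> np.ndarray:
    """Mean MFCCs plus rough pitch statistics for one audio clip."""
    y, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)
    f0 = librosa.yin(y, fmin=50, fmax=400, sr=sr)
    return np.concatenate([mfcc, [np.nanmean(f0), np.nanstd(f0)]])

# Hypothetical labelled clips: (file path, emotion label).
training = [("calm_01.wav", "calm"), ("tense_01.wav", "tense")]
X = np.stack([voice_features(p) for p, _ in training])
labels = [label for _, label in training]

clf = RandomForestClassifier(random_state=0).fit(X, labels)
print(clf.predict([voice_features("new_clip.wav")]))
```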

  • Some devices are capable of reading individuals’ minds, others are capable of giving artificial voices to non-verbal people... and now devices capable of reading emotions? Those tools will have to stay in good hands.

  • This has huge potential for good – listed in many comments here – and bad. We already know that people working for Amazon routinely listen to our commands, seemingly for the fun of it. Pair this with vocal sentiment analysis and the dystopian scenarios are not hard to paint.

    It's clear that we can't, and shouldn't, stop the advent of new technologies. But we must be prepared with the appropriate regulations that ensure people are protected from Big Tech behemoths which are happy to sell our souls if it means they make a profit.

  • There are a multitude of good applications and use cases that can come from monitoring human emotions, not only for academic or medical research but also for crisis management, criminology, and risk analysis. But the danger lies on the opposite side: data and privacy concerns now extend into covert understanding of behavior, even beyond the obvious personal data.

  • This is a logical step for voice (Alexa) technology. Amazon has a ton of data linking emotional state to purchase behavior, and the nudge to buy stuff to solve an emotional problem is a relatively simple step. And unlike many other emotional AI applications, the stakes aren’t that high in an emotional purchasing app.

  • I find the whole concept alarming. It also seems to me to be even more of an invasion of privacy than all sorts of systems that are already spying on us. I am quite capable of being aware of my own state of mind without needing my watch to tell me. What I would benefit from is a device that could alert me to my wife's mood.

  • Odd, this type of device was featured on The Big Bang Theory. Fiction follows art? Once upon a time every cutting-edge exec had a voice stress analysis device on their desk. Pseudoscience strikes again?

  • This also makes sense, because facial expressions can be read quite easily with tech. How would this be used against us?

  • This is invasive. It will be used to take advantage of particular emotional states to drive sales and particular interconnected services under the guise of making our lives better. If you’ve been following Amazon’s moves over the years you will see a company hellbent on being at the forefront of every single action of every single day of their customers. Why? For profit, that’s why. This isn’t altruistic in nature, it’s about pushing product, growing revenue, and increasing profits. Their earnings and value are indicative of their relentless pursuit of being the world’s most customer-centric company. Winning that game means knowing us better than we know ourselves.

    Can this be helpful? Sure. Can this be harmful? Sure. Buyer beware. I’m all for freedom of choice even if it results in wearers of this device giving away intimate, vulnerable data to be used to empty their wallets. But when should regulation and governance step in? How much do we allow ourselves to give away to giant corporations in the name of technology, innovation, and convenience? Who is qualified to govern such a thing? Can these devices collect intimate data on people near the wearer - what do we call that, second-hand espionage?

    Sometime in the near future:

    Human:

    “Alexa... How do I feel today?”

    Multi-National Corporation Sympathy Bot:

    “You feel sad, why not buy something on Amazon that brings you joy? You should consider switching your Prozac and Zoloft prescriptions to Amazon PillPack for direct to door delivery. Why don’t you grab some snacks and something hot to eat from Whole Foods Market on your way home? Then you could curl up on the couch with something yummy and catch up on your favorite Prime Video series - shall I set that all up for you?”

    My suggestion: Buy Amazon Stock...

  • The truth is a lot of AI software in use in call centres today detects tone and sentiment in your voice and routes customers to the agents best trained to deal with angry, confused, etc. customers. This feels like a build on that technology, but applied to the wrong use case it has the potential to be dangerously abused! (A toy sketch of that routing idea follows below.)
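
To make the routing idea above concrete, here is a toy sketch: a sentiment score inferred from the caller’s voice (by a model like the one sketched earlier) picks an agent queue. The thresholds and queue names are made up for illustration and do not reflect any particular vendor’s product.

```python
# Toy call-routing rule: a voice-derived sentiment score chooses the queue.
# The score, thresholds, and queue names are illustrative only.
from dataclasses import dataclass

@dataclass
class Call:
    caller_id: str
    sentiment: float  # -1.0 = very upset, +1.0 = calm/positive

def route(call: Call) -> str:
    """Pick an agent pool based on the inferred emotional state."""
    if call.sentiment < -0.5:
        return "escalation-team"   # agents trained for angry callers
    if call.sentiment < 0.0:
        return "retention-team"    # frustrated or confused callers
    return "general-queue"

print(route(Call(caller_id="c123", sentiment=-0.7)))  # -> escalation-team
```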

  • 'Eventually the technology could be able to advise the wearer how to interact more effectively with others, the documents show.' Seriously? Has it come to this? Then again, if Trump had one, he might be able to converse like a human being with Pelosi.

  • In other words, your Alexa has detected that you are Grouchy Smurf 28 out of 30 days of the month. Like we all need one more thing judging us.

    Amazon crosses lines in Facebook-esque fashion. I think you all will be up in arms about this within the year. This should not excite you, it should offend you.

  • And why would someone want to wear this? What's the pitch?

  • This is a horrific step towards greater monitoring. The only reason to have this would be to sell more products as a “quick fix” for someone who is upset. I’ve seen comments here saying “this better stay in good hands.” What happens when it doesn’t stay there? Who watches the watchman?

  • What a bunch of random, pseudo-knowledgeable opinions!

  • It is a good idea, until someone uses it with bad intentions...

  • It is interesting to see Amazon’s experiments and moves in different directions to learn more and more about us, from buying behaviour to our routines to emotions.

  • The first time I saw this kind of tech in a shopping application was 10 years ago. I'm surprised that it's taken this long for Amazon to develop their own!

    The demo I saw a decade ago used cameras for facial recognition and would modify suggestions based on how people reacted to various things around the screen. Eye-tracking technology has really evolved, so I can imagine a smartphone camera could even be enough. They were also tracking mouse movements on the screen. (A rough sketch of that re-ranking idea follows below.)
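
A rough sketch of the feedback loop that kind of demo implies: re-rank product suggestions by how much attention each item attracted. The dwell times and item names below are invented stand-ins for real eye-tracking or mouse-movement data.

```python
# Re-rank suggestions by accumulated attention (gaze or mouse dwell time).
# The numbers and item names are made up for illustration.
from collections import defaultdict

dwell_ms = defaultdict(float)  # item -> accumulated attention in milliseconds
dwell_ms.update({"headphones": 3200.0, "blender": 400.0, "novel": 1500.0})

def rerank(candidates: list[str]) -> list[str]:
    """Show the items that drew the most attention first."""
    return sorted(candidates, key=lambda item: dwell_ms[item], reverse=True)

print(rerank(["blender", "novel", "headphones"]))
# -> ['headphones', 'novel', 'blender']
```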

  • I love Asimov’s books (and have caught wind there is work being done to produce a Foundation series, but I digress). This technology is fascinating and admittedly freaks me out. Not sure how you all feel, but I suppose there is a space in which we’re gonna have to embrace this radical future that is taking shape. When we all have our augmented reality glasses on and smartphones become a thing of the past, that is when things are gonna get really “far out”.

  • This needs to be adjusted to account for Botox.

    Perhaps another detection method instead of facial or in combination would prove to be even more accurate and effective.

  • It seems like this could rapidly improve the accuracy of their recommendation engine.

  • Can a device help you communicate with others more effectively? And soothe you when you’re emotional?

  • Incredible

  • Yup did this a few years back