Search results
  1. I'd love to see something like () on

  2. 6 Nov 2019

    Look at that! Another security hole in the microphones we put in our homes!

  3. 26 Dec 2019

    It would be really cool if you or someone from could talk about this vulnerability or other things you have learned about security or . You have a unique view on these things.

  4. 4 Nov 2019

    , and Home Assistant are vulnerable to attacks that use lasers to inject commands, as researchers identify a semantic gap in the physics of microelectromechanical systems (MEMS) microphones, which respond to light as if it were sound.

  5. 4 Nov 2019

    Controlling Alexa or Google Home with a laser from outside a house using laser-based audio injection attacks... the device converts the light into electrical signals that it interprets as commands.
    (A sketch of this modulation idea appears after the list.)

  6. 5 Nov 2019
  7. 5 Nov 2019
  8. 5 Nov 2019

    Anyone who has the ability to test out the exploit: could you grab the recording from your Google Account Activity, so that we can hear exactly what the Google Home heard? Does it hear static? Does it hear an actual voice?

  9. 4 Nov 2019

    This is wicked s***. If this actually works, it's brilliant research.

  10. 28 Jan

    So excited to discuss the security and privacy of IoT devices today at the Privacy@Michigan Symposium!

  11. 23 Jan
  12. 28 Dec 2019

    It's time to get privacy shutters for the mics of smart devices, just like we have for smartphone cameras. Here's how one can hack your smart home device via laser light. Here's a great video by :

  13. 28 Dec 2019
  14. 11 Dec 2019

    A smart speaker is vulnerable to hacking with a laser pointer and some know-how, researchers at the University of Michigan demonstrate.

  15. 23 Nov 2019

    It's possible (and fairly simple I should note) to inject voice commands into an Alexa or Google Home from a distance using a cheap laser pointer and a laser driver... Keep your devices away from windows folks :) UMich security research

  16. 14 Nov 2019
  17. 7 Nov 2019

    Just when we found a place for the speakers near the window. I guess I'll move them. Let's use to send to .

  18. Researchers have found a way to remotely control voice-activated assistants like 's , 's , and from up to 360 feet away. 😰 Find out more about the attack on our blog! 👇

  19. 🤔 Smart speakers are vulnerable to spoofing attacks using . That’s the conclusion of research. And an attacker could aim the laser through a window. In at , considers weird new threat models:

  20. 6 Nov 2019
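
The mechanism described in results 5 and 15 above comes down to amplitude modulation: the audio waveform of a spoken command modulates the laser's intensity, and the MEMS microphone responds to that light as if it were sound. The sketch below illustrates only that modulation step, under assumptions not taken from the results: a mono recording in a hypothetical command.wav, a laser-diode driver with an analog current-modulation input, and purely illustrative bias and depth values. It computes and summarizes the drive waveform; it does not drive any hardware.

# Minimal sketch of the amplitude-modulation idea behind laser-based audio
# injection: the spoken command's waveform modulates the laser's intensity,
# and a MEMS microphone that responds to light "hears" the command.
# Assumptions (not from the search results above): a mono recording named
# "command.wav" and a driver that accepts a current-modulation input;
# BIAS_MA and DEPTH_MA are illustrative placeholders, not measured values.
import numpy as np
from scipy.io import wavfile

BIAS_MA = 200.0   # hypothetical DC bias current for the laser diode (mA)
DEPTH_MA = 150.0  # hypothetical peak modulation depth around that bias (mA)

rate, audio = wavfile.read("command.wav")
audio = audio.astype(np.float64)
if audio.ndim > 1:          # mix stereo down to mono
    audio = audio.mean(axis=1)
peak = np.max(np.abs(audio))
if peak > 0:                # normalize to the range [-1, 1]
    audio = audio / peak

# Intensity modulation: i(t) = bias + depth * audio(t), clipped so the
# drive signal stays within the assumed operating range of the driver.
drive_ma = np.clip(BIAS_MA + DEPTH_MA * audio, 0.0, BIAS_MA + DEPTH_MA)

print(f"{len(drive_ma)} samples at {rate} Hz, "
      f"drive range {drive_ma.min():.1f}-{drive_ma.max():.1f} mA")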
