I'd love to see something like
#lightcommands (https://lightcommands.com/) on @whoismrrobot -
Look at that! Another security hole in the microphones we put in our homes!
#LightCommands https://lightcommands.com/ -
It would be really cool if you or someone from
#LightCommands could talk about this vulnerability, or about other things you have learned about security, at @_BSidesKC or @sec_kc. You have a unique view on these things. -
#Siri, #Alexa, and #Google Home Assistant are vulnerable to attacks that use lasers to inject commands; #LightCommands identifies a semantic gap between the physics and specifications of #MEMS (micro-electro-mechanical systems) microphones, which respond to light as if it were sound. https://arstechnica.com/information-technology/2019/11/researchers-hack-siri-alexa-and-google-home-by-shining-lasers-at-them/ -
Controlling Alexa or Google Home with a laser from outside a house using laser-based audio injection attacks... the device converts the light into electrical signals that it interprets as commands.
#LightCommands https://lightcommands.com/ -
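As a rough illustration of the mechanism the two tweets above describe: the attacker amplitude-modulates the command audio onto the laser's intensity, and the MEMS microphone transduces the resulting light fluctuations much as it would sound pressure. The short Python sketch below is purely hypothetical (the bias current and modulation depth are assumed values, and this is not the LightCommands researchers' actual tooling); it only shows how a voice waveform might be mapped onto a laser diode's drive current.

# Minimal sketch (hypothetical parameters): amplitude-modulating an audio
# command onto a laser diode's drive current. A MEMS microphone illuminated
# by this beam sees intensity variations that it converts to an electrical
# signal, much as it would for sound.
import numpy as np

SAMPLE_RATE = 16_000        # Hz; typical voice-assistant audio rate
BIAS_CURRENT_MA = 200.0     # assumed DC operating point of the laser diode
MODULATION_DEPTH = 0.5      # fraction of the bias swung by the audio signal

def audio_to_drive_current(audio: np.ndarray) -> np.ndarray:
    """Map an audio waveform in [-1, 1] to a laser drive current in mA."""
    audio = np.clip(audio, -1.0, 1.0)
    return BIAS_CURRENT_MA * (1.0 + MODULATION_DEPTH * audio)

if __name__ == "__main__":
    # Stand-in for a recorded voice command; a 440 Hz tone keeps the demo simple.
    t = np.linspace(0.0, 1.0, SAMPLE_RATE, endpoint=False)
    command_audio = 0.8 * np.sin(2 * np.pi * 440.0 * t)
    current = audio_to_drive_current(command_audio)
    print(f"drive current range: {current.min():.1f}-{current.max():.1f} mA")

In practice the modulated current would be fed to a laser driver, the off-the-shelf piece of hardware several of these tweets mention. -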
Alexa, Siri, Google Smart Speakers
#Hacked Via Laser Beam: https://threatpost.com/alexa-siri-google-smart-speakers-hacked-via-laser-beam/149860/ via @threatpost #LightCommands #DigitalAssistant #VirtualAssistant #IoT -
Lasers can silently issue 'voice commands' to your smart speakers. https://buff.ly/2PMTRJa
#Alexa #gadgetry #gadgets #gear #lightcommands #Siri #tripontech #tot @TriponTech pic.twitter.com/0oGZ2130aK
-
Anyone who has the ability to test out the
#LightCommands exploit: could you grab the recording from your Google Account Activity so that we can hear exactly what the Google Home heard? Does it hear static? Does it hear an actual voice? -
This is wicked s***. If this actually works, it's brilliant research.
#LightCommands https://twitter.com/svblxyz/status/1191521118563291136 -
So excited to discuss Security and Privacy of IoT devices today at Privacy@Michigan Symposium!
#PrivacyDay #IoTSecurity #LightCommands @UMengineering https://twitter.com/floschaub/status/1222167174367703042 -
The #LightCommands attacks are not pure fantasy... My contribution for #matricedigitale #cyberthreats #CyberSecurity #infosecurity #cybernews
https://www.matricedigitale.it/notizie_192/post/attacchi-light-commands-non-sono-una-pura-fantasia_763.html pic.twitter.com/biIjmroOW9
-
It's time to get privacy shutters for the mics of smart devices, just like the ones we have for smartphone cameras. Here's how someone can hack your smart home device with laser light. https://lightcommands.com/
#lightcommands Here's a great video by @smartereveryday: https://youtu.be/ozIKwGt38LQ -
A #laser #hack of voice assistants via light injection :O The article that will make you regret some of your Christmas gifts X_X #lightcommands #googlehome #amazonalexa #siri #facebookportal https://webplus.agency/injection-audio-malicieuse-a-base-de-laser-sur-des-systemes-a-commande-vocale/ -
A smart speaker is vulnerable to hacking with a laser pointer and some know-how, researchers at the University of Michigan demonstrate.
#11investigates #amazon #google #apple #lightcommands #michigan #UM https://www.wtol.com/article/news/investigations/11-investigates/hacking-smart-speakers/512-5a8c21dd-e1b8-4dee-9d7a-b684ae3a3a97 -
It's possible (and fairly simple, I should note) to inject voice commands into an Alexa or Google Home from a distance using a cheap laser pointer and a laser driver... Keep your devices away from windows, folks :) https://youtu.be/iK2PtdQs77c
#lightcommands UMich security research -
Hackers can silently control your Google Home, Alexa, or Siri with laser light. https://bit.ly/3502N21
#Actualidad #Actualizaciones #Alexa #AMENAZAS #GoogleHome #Hackers #LightCommands #Luzlaser #NoticiasdeSeguridad #Siri #Tecnovan #Vulnerabilidad pic.twitter.com/5Zm82i4TV2
-
Just when we found a place for the
#Sonos speakers near the window. I guess I'll move them. #LightCommands lets #hackers use #lasers to send #voicecommands to #voiceassistants. #cybersecurity @EduardKovacs @SecurityWeek http://bit.ly/2JZ8aXr pic.twitter.com/WJADDnNBld
-
Researchers have found a way to remotely control voice-activated
#personal assistants like #Apple's #Siri, #Amazon's #Alexa, and #GoogleAssistant from up to 360 feet away.
Find out more about the #LightCommands attack on our blog!
https://www.intego.com/mac-security-blog/researchers-use-lasers-to-hack-siri-alexa-google-assistants/ -
Smart speakers are vulnerable to spoofing attacks using #lasers. That's the conclusion of #LightCommands research. And an attacker could aim the laser through a window. In #SecurityBlogwatch at @TechBeaconCom, @RiCHi considers weird new threat models: https://techbeacon.com/security/alexa-dont-listen-silent-laser -
Light Commands: laser-based audio injection into smart speakers

» https://www.beyto.com/light-commands-laserbasierte-audioinjektion-in-smart-speaker/
#beyondtouch #beyto #voice #voicefirst #lightcommands #digitalassistant #voiceassistant #alexa #googleassistant #siri #research #attack #smarthome #smartspeaker
Didn't know this was even possible.