Malicious Siri Commands Can Be Hidden in Music and Innocuous-Sounding Speech Recordings
A group of students from Berkeley has demonstrated how malicious commands to Siri, Google Assistant, and Alexa can be hidden in recorded music or innocuous-sounding speech. Simply playing the tracks over the radio, in a streaming music track, or in a podcast could allow attackers to take control of a smart home …

The NY Times reports that the work builds on research that began in 2016. That earlier research demonstrated commands hidden in white noise; this month, the students managed to achieve the same thing with music and spoken text …