Tim Verheyden, a journalist with Belgian public broadcaster VRT, contacted the couple with a mysterious audio file. To their surprise, they clearly heard the voices of their son and baby grandchild, as captured by Google's virtual assistant on a smartphone.

Verheyden says he gained access to the file and more than 1,000 others from a Google contractor who is part of a worldwide workforce paid to review some audio captured by the assistant from devices including smart speakers, phones, and security cameras. One recording contained the couple’s address and other information suggesting they are grandparents.

Most recordings reviewed by VRT, including the one referencing the Waasmunster couple, were made intentionally; users asked for weather information or pornographic videos, for example. WIRED reviewed transcripts of the files shared by VRT, which published a report on its findings Wednesday. In roughly 150 of the recordings, the broadcaster says, the assistant appears to have activated incorrectly after mishearing its wake word.

Some of those recordings captured fragments of phone calls and private conversations. They include announcements that someone needed the bathroom and what appeared to be discussions of personal topics, including a child's growth rate, how a wound was healing, and someone's love life.

Google says it transcribes a fraction of audio from the assistant to improve its automated voice-processing technology. Yet the sensitive data in the recordings, and the instances of Google's algorithms listening in unbidden, make some people uncomfortable, including the worker who shared audio with VRT and some privacy experts. Privacy scholars say Google's practices may breach the European Union privacy rules known as the GDPR, introduced last year, which provide special protections for sensitive data such as medical information and require transparency about how personal data is collected and processed.