
In a bit of news that should surprise absolutely no one, there's a chance that some of your voice interactions with your smart speaker will be reviewed by a human at the company behind it. Whether you're rocking an Echo Dot, a Google Home Max, or some third-party device with one of those assistants built in, a human reviewer might end up listening to your voice clips. Scary, right?
Alright, knee-jerk reactions aside, this does not mean that someone is sitting at Amazon HQ listening to every little thing you say in your house. That would be logistically insane, and Amazon, Google, or whoever would be spending a fortune on payroll for barely usable data. That's not at all what's happening.
What is happening is that these companies occasionally review voice clips to annotate them and improve voice recognition for commands. While that does sound kind of creepy, it's not nefarious, and it's partially responsible for how good these assistants have gotten.

However, not all of these companies treat your data the same way. Amazon randomly selects voice clips to analyze, but it seems to be the worst of the bunch at obfuscating that data. Reviewers can't tell which account a clip came from, but Amazon does keep an identifier attached to the information. That's better than nothing, I guess, but it still makes Amazon the least privacy-focused here.
Google, on the other hand, distorts the voice clips before reviewers ever hear them, and no personally identifying info is attached to the data at all. That's significantly better than Amazon's policy, and it helps that you can delete your voice recordings so Google no longer has access to them.
Apple is, unsurprisingly, probably the best of the bunch: Siri voice data is uploaded with zero personal info and is attached to a random identifier, which is reset every time Siri is turned off. After six months that random ID is stripped from the data, and humans never see any of it. The flip side is that you can't delete your Siri recordings, because Apple doesn't store them with anything that could be traced back to you.
I'm not saying one way or the other whether you should have a smart speaker in your house, but this should be a wake-up call that yes, all of these speakers are constantly listening for their wake word. Once the wake word is heard and you give a command, the audio is offloaded to a cloud somewhere to be processed. That's not necessarily a bad thing, but please be aware of what you're putting in your kitchen.
source: BBC