A $160 dollhouse and four pounds of cookies showed up on the doorstep of the Neitzel family’s home in Houston in 2017. They were puzzled about where the order had come from; neither parent had ordered the mysterious delivery. They soon realized that their 6-year-old daughter had accidentally ordered the dollhouse and sugar cookies while talking to the family’s Alexa device a few days prior.
The story took the internet by storm, and Alexa-driven shopping sprees continued. On San Diego’s CW6, news anchor Jim Patton shared the story on the morning show and said, “I love the little girl saying, ‘Alexa ordered me a dollhouse.’”
Alexa device owners who were watching the broadcast reported that their own Alexa devices had tried to make a dollhouse purchase after interpreting Patton’s comment to be a voice command, according to Fortune Magazine.
Each day, Americans use virtual assistants like Siri, Alexa and Google Assistant to catch up on the news, set timers, hear the latest sports scores or even learn a new language. However convenient these devices may seem, users are handing them more information than they might think.
There are approximately 40,000 Google search queries per second — totaling almost 3.5 billion search queries each day, according to Internet Live Stats.
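That daily total follows from the per-second rate: 40,000 queries per second, multiplied by the roughly 86,400 seconds in a day, works out to about 3.46 billion queries.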
With so many queries happening each day, many virtual assistant users worry about whether their devices record the conversations going on around them when no one is directly asking the device a question.
“All of our voice-enabled devices have to be constantly listening to everything we’re saying because they have to hear the wake-up phrase,” said Adam Durfee, a BYU communications professor and digital media specialist.
Phones and smart home devices typically record in three-second loops, just long enough to hear a wake-up phrase like “Alexa,” “Siri” or “Hey Google.” The audio constantly overwrites itself and is never stored long term until the device is awakened. When that happens, the device begins recording everything it hears and sends the audio to the cloud to be interpreted.
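For readers curious what that rolling buffer looks like in practice, the sketch below is a minimal illustration in Python, not any manufacturer's actual code. The audio source, the wake-phrase detector and the cloud uploader (`microphone_chunks`, `matches_wake_phrase`, `send_to_cloud`) are stand-ins assumed for the example; the point is that old audio is continuously thrown away, and nothing leaves the device until after the wake phrase is heard.

```python
from collections import deque

BUFFER_SECONDS = 3        # the short rolling window described above
CHUNKS_PER_SECOND = 10    # assume audio arrives in 100-millisecond chunks

def run_assistant(microphone_chunks, matches_wake_phrase, send_to_cloud):
    """Minimal sketch of a wake-word loop; all three arguments are stand-ins."""
    # A fixed-length deque silently drops its oldest entry, so audio older
    # than about three seconds is constantly overwritten.
    rolling = deque(maxlen=BUFFER_SECONDS * CHUNKS_PER_SECOND)
    recording = []     # filled only after the device is "awake"
    awake = False

    for chunk in microphone_chunks:
        if not awake:
            rolling.append(chunk)
            if matches_wake_phrase(rolling):
                awake = True                 # wake phrase heard: start keeping audio
                recording = list(rolling)
        else:
            recording.append(chunk)
            # After roughly ten seconds of a request, ship it off and reset.
            if len(recording) >= 10 * CHUNKS_PER_SECOND:
                send_to_cloud(recording)     # only now does audio leave the device
                recording, awake = [], False
```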
The trouble is that sometimes these devices are awakened accidentally by other words commonly used in households.
People who wake their devices accidentally are recorded without knowing it, and the recording is sent to the cloud to be analyzed by a computer. However, a small sample of these recordings is regularly reviewed by real humans for quality control purposes.
These accidental wakings can also cause the device to take action on behalf of the owner — like ordering something from Amazon or calling a contact — without the owner knowing.
“I don’t really like the idea of devices listening without my permission,” said BYU student Shane Dawson. “If I wanted my information to be gathered, I would give it willingly. I don’t like them taking that without my consent.”
Anything users do on an internet-enabled device is tracked — each app they open, each item they order, each question they ask their virtual assistant — and is ultimately used for advertising purposes.
However, companies like Google and Amazon do not sell the information their virtual assistants collect to advertisers as often as many assume. The U.S. Postal Service has actually sold more information about consumers than Facebook, Google or other large companies, according to Durfee.
Instead, companies like Facebook, Google and Instagram often contact advertisers and offer to strategically place and show ads to members of the advertisers’ target audience based on the information they’ve collected.
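To make the contrast with selling raw data concrete, the hypothetical sketch below shows what that arrangement tends to look like from an advertiser's side. The campaign fields and the `place_ads` call are invented for illustration and are not any platform's real API; the advertiser describes an audience and a budget, the platform does the matching internally, and only aggregate results come back.

```python
# Hypothetical campaign request: the advertiser describes who it wants to
# reach, but never sees the individual profiles used to do the matching.
campaign = {
    "ad_creative": "summer_shoe_sale.png",
    "budget_usd": 500,
    "target_audience": {
        "age_range": (18, 34),
        "interests": ["running", "fitness"],
        "location": "Utah",
    },
}

def submit_campaign(platform_api, campaign):
    """Illustrative only: the platform returns aggregate numbers such as
    impressions and clicks, not the identities of the people who saw the ad."""
    report = platform_api.place_ads(campaign)   # matching happens platform-side
    return report    # e.g. {"impressions": 12000, "clicks": 340}
```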
“Generally, companies are really only interested in using data to sell stuff,” said Merrill Oveson, information technology director at UVU. “Does anyone really care if the Googles and Amazons of the world know my approximate income, age, race or hobbies? Some may. I don’t — trust me, they already have this information.”
Because companies collect this kind of biographical information, each person using a phone, laptop or other internet-enabled device has a unique experience shaped by what their technology has learned about them.
Ultrasonic targeting is another way virtual assistants and advertisers work hand in hand to provide users with a personalized experience. The technique relies on short, high-frequency tones, too high-pitched for most people to hear, embedded in TV ads, websites or stores; an app with access to a device's microphone can pick up those tones and connect what a person encounters offline with their online profile.
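As a rough illustration only, and not any company's actual implementation, a simple spectrum check could flag that kind of near-inaudible signal. The 18-20 kHz band, the 44.1 kHz sample rate and the detection threshold below are assumptions made for the sake of the example.

```python
import numpy as np

SAMPLE_RATE = 44_100              # assumed CD-quality audio, samples per second
BEACON_BAND = (18_000, 20_000)    # near-ultrasonic band used for illustration

def beacon_detected(audio_frame, threshold=0.2):
    """Rough sketch: report whether a frame of audio carries unusually strong
    energy in the near-ultrasonic band. The threshold is an arbitrary example."""
    spectrum = np.abs(np.fft.rfft(audio_frame))
    freqs = np.fft.rfftfreq(len(audio_frame), d=1.0 / SAMPLE_RATE)

    in_band = (freqs >= BEACON_BAND[0]) & (freqs <= BEACON_BAND[1])
    band_energy = spectrum[in_band].sum()
    total_energy = spectrum.sum() + 1e-12    # avoid division by zero

    return (band_energy / total_energy) > threshold

# Example: one second of a pure 19 kHz tone, inaudible to most adults.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
print(beacon_detected(np.sin(2 * np.pi * 19_000 * t)))   # True
```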
Some say that the personalized experience advertisers try to create for online consumers is a small price to pay for the free services that they receive.
“No matter who you are, you are going to see ads every day — that’s how we get a lot of things we have for free,” Durfee said. “We know we have to see ads. We know that’s how the world keeps going as a free concept. No one wants to pay 50 cents per Google search, so we like the idea of having these ads.”
However, others say targeted ads cross a line that companies should not be allowed to cross. Many online users have turned to social platforms like Twitter to voice concerns about the unsettling patterns they say they notice while using their technology.
Ads that seem to read users’ minds and data scandals like Cambridge Analytica have stirred up a conversation about personal data privacy and whether the risks of being online are worth it.
“I definitely feel like pros outweigh the cons because if someone wants the information on you, they will get it,” said BYU student Marisa Johns. “It’s the day and age — I don’t think privacy will be as big of a deal as we may think right now. Technology makes our lives better. The pros make our lives easier and the cons are all theoretical.”
Although some may feel skeptical about online user tracking, users like Johns and Durfee believe that the positive outweighs the negative.
“We have to understand that selling data is part of the price we pay for making our lives a little bit better,” Durfee said. “I’m okay if they want to keep a record, as long as they use those in ways that better my life or tailor an advertising experience. If I have to see ads, I might as well see more relevant ones.”