Even Apple uses your data in ways you might not like. But it has to if you want Siri to be more than a pretty voice.
A piece from The Guardian back in July claiming that Apple "regularly" had third-party contractors listening to what customers were telling Siri kicked off a bit of a stir. The company that uses privacy to bolster its reputation was caught doing the same things that Google, Amazon, and every other company with a digital personal assistant was doing, and people were ready with digital pitchforks raised.
In response, Apple apologized and promised to change the way it collects data; third parties will no longer be involved in listening to the saved recordings of users who have opted in. This, of course, is exactly what any company would do when caught doing a thing its users don't like. The thing is, this move guarantees that Siri is always going to suck compared to the digital assistants offered by other companies.
Apple’s biggest blunder
This week, after all of this came to light, Apple released a full-on apology along with a promise to do better. You should follow the link and read it if this sort of thing interests you, but here are the "important" bits:
- First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.
- Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.
- Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.
These all sound like good ideas that should have been implemented from day one, but at least they are in place now. They definitely should have been considered when the company was buying ad space to tell us all that only Apple cares about our privacy and those other companies will sell you down the river because they are inherently evil.
The worst thing about what Apple did was that it promised us it would never happen.
When caught with its hand in the cookie jar, Apple did the right thing. But Apple promised it would never try to steal a cookie and that leaves many with a really bad impression.
Training AI isn’t magic
Earlier I said this move would make sure Siri always lagged behind the competition when it comes to intelligent assistants. That’s because of how you make something like a digital voice-activated assistant better — especially when it’s not very good right now.
Amazon and Google are super intrusive with the amount of data each collects. Don’t make the mistake of thinking Apple doesn’t also collect a sickening amount of user data, which is plain to see if you try to download your personal data from Apple’s Privacy Portal. The difference is that Amazon and especially Google are very upfront about aggregating it all so your experience with Alexa or Google Assistant is much more personal. Alexa knows I just bought product X so it’s ready to show or tell me about product Y. Google knows I just bought plane tickets so it tries to help plan my vacation.
If you want a voice-activated product to understand people, you need to let programmers listen in.
Siri, for one reason or another, isn't there yet, and without incorporating more user data it never will be. Apple seems fine with this, positioning Siri more as a product you ask questions to get answers than as something proactive, and the changes being made to how Siri stores data are going to make the proactive part even harder. Users happily opt in when it comes to Google Assistant because it gives them something they think is valuable in return. If Siri can't give you proactive information about your day-to-day life, there isn't as much value in letting Apple listen to what you're saying.
That’s a real problem when it comes to voice recognition. No matter the language, people in different areas or from different backgrounds will always speak differently. Accents, voice inflection, word choice, and more mean any AI needs plenty of training to recognize how we talk, and it can’t get that from a written transcript. By doing the right thing, Apple is making it harder on itself when it comes to Siri getting better.
They all do it
The best thing you can do when it comes to digital assistant tech is to remember that every company offering it grabs as much data about you as it can. It needs that data; without it, you'd get a text-to-speech version of Ask Jeeves, and nobody wants that. What is important is that you're properly notified in advance of what data is being collected and how that data is stored and used.
If you trade your data away, make sure what you get in return is worth it.
Equally important is how a company reacts when we realize just what those terms and conditions really mean. Apple told everyone upfront, as did Google, Microsoft, and Amazon, that data was being stored and possibly even listened to; it's in the terms you agree to when you first use Siri. The language obviously wasn't clear enough, or we wouldn't be surprised when this sort of thing happens, but it is there for us to read if we want to. The gripe that third parties were the ones doing the listening is something that should have been addressed beforehand, but at least now it is.
What you need to do is consider the value of the service(s) you receive in return for all that data. If you love Siri, or Google Assistant, or any other voice service from a tech company and think it's worth trading your data for it, then keep doing it. Just make sure you know what you're giving away.
August 30, 2019 at 03:34AM