Opinion: The ChatGPT iPhone app from OpenAI has a glaring privacy problem – the company is reading your conversations


The ChatGPT app is displayed on an iPhone in New York, Thursday, May 18, 2023. The free app started to become available on iPhones in the US on Thursday and will later be coming to Android phones. Unlike the web version, you can also ask it questions using your voice. — AP

On May 18, OpenAI released an iOS app for ChatGPT and it quickly became the most popular free app in the App Store.

That’s not surprising considering some reports suggest ChatGPT had more than 100 million users in January – just two months after it launched. That would make it the fastest-growing technology product of all time.

For comparison, it took Facebook four and a half years to reach that number. Even TikTok took nine months.

Until now, almost all ChatGPT use has happened in a browser on a laptop or desktop computer. There was no official mobile app, and accessing the website on your iPhone wasn’t ideal.

At the same time, there have been plenty of impostor apps attempting to capitalise on the fact that so many people are paying attention to generative AI and exploring what it can do. It makes sense that OpenAI would want to get its own app out in the world.

The official app comes with a few cool features. First, there’s the fact that it’s free (there’s a paid upgrade to ChatGPT Plus that gets you access to OpenAI’s latest language model). Considering that many of the existing apps charge a weekly subscription fee – making them very expensive if not outright scams – having an official app that doesn’t cost anything is a welcome development.

The other feature is that you can talk to ChatGPT. The model can’t actually process audio prompts, so the app converts your speech to text and sends it like any other question. For a conversational AI product, being able to simply say what you want to ask is a great feature.
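To make that flow concrete, here is a minimal sketch of how a client could chain speech recognition and chat using OpenAI’s public API (the pre-1.0 openai Python library). This illustrates the general transcribe-then-prompt pattern the article describes, not the actual implementation inside OpenAI’s iOS app; the file name and API key placeholder are hypothetical.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

def ask_by_voice(audio_path: str) -> str:
    # Step 1: transcribe the recorded speech to text with the Whisper model.
    with open(audio_path, "rb") as audio_file:
        transcript = openai.Audio.transcribe("whisper-1", audio_file)

    # Step 2: send the transcribed text as an ordinary chat prompt.
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": transcript["text"]}],
    )
    return response["choices"][0]["message"]["content"]

# Example: ask a question from a recorded voice memo.
print(ask_by_voice("question.m4a"))
```

Note that in this pattern the audio file and the resulting transcript are both sent to OpenAI’s servers, which is the crux of the privacy tradeoff discussed below.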

The iOS app does, however, come with one important tradeoff that users should be aware of. It’s a big enough deal that the app warns you the first time you open it. In addition to a caution that ChatGPT may just make things up, there’s another warning about sharing personal information because “Anonymised chats may be reviewed by our AI trainers to improve our systems”.

OpenAI’s privacy policy says that when you “use our Services, we may collect Personal Information that is included in the input, file uploads, or feedback that you provide”. Specifically, that means that if you ask ChatGPT questions that contain personal information, that information will be sent to OpenAI. That’s a big deal when you realise your chat may be read by a human reviewer.

The company says it anonymises conversations before they are seen by a human, but that just means that it removes identifying information from the metadata of the file – not the content of your prompt. If you include personal information, that information will still be included.

The company isn’t clear on whether it reviews the audio files to determine the effectiveness of its speech-to-text transcription, but the privacy policy certainly gives it the right to do so. That means that humans at OpenAI could listen to what you say, not just read what you type. It also means that they would have access to whatever other sound is going on in the background while you’re using the feature.

I reached out to OpenAI but did not immediately receive a response.

It’s not unreasonable that the company would want to review conversations – that’s how OpenAI evaluates how well the service is working. And, to be clear, it isn’t possible for the company to read every conversation from every user. That said, you have no idea whether OpenAI is reading your conversations, and you have no ability to opt out.

The human reviewers look to see what you ask ChatGPT, and look to see how it responds. They then look to see what, if any, follow-up you ask. That part isn’t nefarious, but it’s certainly something you should know – especially if you’re using it on your iPhone.

I say that because people interact with their iPhones differently than they do a browser on a computer. For one thing, it’s a device you carry with you everywhere, which means you’re far more likely to pull it out in a variety of situations. You’re more likely to ask it questions related to what you’re doing throughout the day, as opposed to sitting down at your laptop to play around with ChatGPT. That increases the likelihood that people will use it in ways that reveal more personal information.

I think it’s good that OpenAI is providing a notice that it’s reviewing your conversations with ChatGPT – a lot of companies would rather hide that kind of information. A few years ago, Apple, Google, and Amazon faced criticism over the fact that humans were reviewing recorded conversations that people were having with those companies’ voice assistants. Each company eventually stopped listening in by default and gave users the option to opt out, but only after reports revealed what they were doing.

That’s what OpenAI should do – provide an option for users to decide whether their conversations can be used for research or for training the language models that power ChatGPT. Sure, most people will probably opt out, which makes it harder for the company to train its product, but that doesn’t mean it isn’t the right thing to do. If you’re going to use information from your users, you should ask for their permission instead of just giving a blanket statement that anything they type is fair game.

Sometimes doing the right thing seems like it makes running your business harder, but in the end, it builds trust. And, it turns out, trust is your most valuable asset. Not only that, but doing the right thing is always, well, the right thing. – Inc./Tribune News Service
