On iPhone release day, Sept 20, no one is paying much attention to anything else (at least in the tech world). After all, the iPhone is the most successful consumer product ever, so a new release is a big deal. However, new iPhones aren't the only thing Apple released last week. In addition to new versions of the AirPods and Apple Watch, Apple released iOS 18 as well.
Like most iOS updates, there are a lot of helpful and interesting features that make using your iPhone a little better. For example, with iOS 18, you can now tint the color of your home screen and place icons wherever you want. You no longer have to follow the grid from the top down and from left to right, which means if you just want a row of icons at the bottom, you can do that. If you want them all tinted yellow to match the color of your goldendoodle, you can do that, too.
You also have more flexibility to add controls to Control Center. There's also a new standalone Passwords app, and your iPhone now supports the RCS messaging standard, which will make the experience of sending text messages to your friends who use Android a little less terrible.
Unlike most years, however, the most interesting feature – Apple Intelligence – isn't coming until next month. It's kind of strange to talk about an update whose headline feature didn't ship at launch, but that just shows what a massive lift it has been for Apple to figure out the best ways to integrate generative AI into its software.
For me, however, the most impressive feature is something almost no one is talking about. In the Notes app, you can now record audio files, and it will automatically transcribe them in the background. Those transcripts are then searchable within Notes, and once Apple Intelligence arrives, it will create a summary of the transcript and put it right at the top.
Look, I spend a lot of my time talking to people, and often I need to remember those conversations. The problem is, I'm really bad at remembering what they say. As a result, I record and transcribe pretty much everything. I do it so often that for years, I have paid for Otter.ai, a software service designed to do exactly that. Now, however, Apple's stock Notes app will be capable of handling this.
Before you tell me that you don't use Apple Notes because some fancy note-taking app has become the center of your digital life and you can't live without it, I will just tell you that the Notes app is a lot better than you think. It just is. For an app designed to take notes, there are a surprising number of useful features.
While I don't use it to write long-form content like this article (I'm a big fan of Ulysses for that), it is where I take notes in pretty much every meeting, every day. Now, my iPhone can pretty much take notes for me. It also works in macOS Sequoia, though I don't carry my MacBook Air around with me nearly as often as my iPhone.
Otter still does some things that Notes doesn't (at least for now). For example, Otter has an AI chatbot that allows you to ask questions about your conversations. It will also generate a far more robust summary than Apple Notes, including bullet points and action steps. And since it's a paid service, I'm sure Otter will continue adding more advanced features.
My point is that sometimes the headline features get all the attention but aren't necessarily the thing that actually improves your productivity in a meaningful way. I'm not arguing that Apple Intelligence won't have plenty of great features, especially once it's able to draw from your personal context and information. In the meantime, however, this simple feature is a killer productivity tool, and it's available right now. – Inc./Tribune News Service