
For nearly two decades, smartphones have all worked pretty much the same, whether you bought an Apple iPhone or a smartphone running Google’s Android system: You have a grid of colorful apps that you tap on. But this year, Apple and Google are finally taking separate paths.
Apple’s next phone operating system arriving this fall, iOS 26, includes a transparent aesthetic mimicking the look of glass and making apps and buttons blend in with content on the screen. Google is doing the opposite with its newly released operating system, Android 16, which emphasizes brighter, punchier colors.
Those are just cosmetic changes that may represent the beginning of a greater split between iOS and Android. Google is also leaning heavily into integrating Gemini, its A.I. chatbot, to automate tasks like writing emails, editing photos and creating shopping lists. In contrast, Apple has released a small set of A.I. features and has postponed the debut of a revamped version of Siri because of technical challenges, so the company is focusing on making its software interface look prettier.
What this means for you, the consumer, is that your technology experience may differ drastically depending on which type of phone you buy in the coming years. With Google diving into the deep end of A.I., Android users will soon have phones that dig into their data to do lots of tasks for them — but whether they will appreciate this remains an open question. Apple phone users will get some nice-looking software with extra polish, which is more of the same.
Here are the highlights of what’s changing in our smartphones with the imminent arrival of iOS 26 and Android 16.
The iPhone’s Apps Are Fading Away, and Android Is Looking Spicier

When Apple unveiled iOS 26 at a software conference last month, giving its software a new numbering scheme based on the year after its release, it announced a new software interface that it calls Liquid Glass, referring to a translucent aesthetic that mimics the look of glass. For instance, an app icon or a button could change its appearance to adapt to the lighting and colors of the photograph behind it. Apple is applying the glasslike aesthetic to its other devices, including iPads and Macs, to make the experience more consistent across its ecosystem.

In contrast, at Google’s software conference in May, the company unveiled the new design for Android 16, called Material 3 Expressive, which makes your phone screen look more like pop art. You can choose a color theme to change the overall look of the software interface — a purple theme includes pink app windows, plum text and dark-violet buttons, for instance. Google said its goal was to give users a more emotional connection with Android.
Yet both of these design overhauls feel like a distraction from the real transformation happening to our phones, which is being driven by A.I.
Google Is Trying to Make Gemini Android’s Killer App
Like its predecessor, Android 16 features Gemini, which users can interact with through voice or text to streamline tasks on their phones.
Over the last few years, Google has expanded Gemini to control various pieces of software, including its note-taking app, Google Maps and YouTube. The chatbot is based on generative artificial intelligence, the technology that uses large language models to predict which words come next.
This lets Android users hold down the power button on their phone to summon Gemini and speak into the microphone to ask it to handle tasks like drafting an email, editing a photo or pulling together a shopping list.
To put it another way, even though the flashiest new part of Android 16 is its colorful interface, the true force driving Android is shaping up to be Gemini.
Apple Is Still Playing Catch-Up in A.I.
In iOS 26, Apple is expanding Apple Intelligence, the suite of A.I. features it introduced last year, with new tools including automatic language translation and the ability to do a web search using data from a screenshot, capabilities that Android users have had for a while.
The real-time translations can work inside some of Apple’s communications apps, including Messages and FaceTime. On a FaceTime call with a relative speaking his or her native tongue, for example, you can see a translated caption in a bubble on the screen. (Google released a similar tool in 2021.)
The new iPhone software also uses A.I. to streamline tasks using information in a screenshot. For example, if you take a screenshot of a website showing the date and time of a concert, a suggestion to add the event to your calendar will appear. Or if you take a screenshot of a handbag you are shopping for, you can tap a button to do a web search for similar-looking handbags. (That’s similar to Google’s Circle to Search tool, which lets Android users draw circles around objects to do image-based searches. Many users have called the feature a gimmick because it is seldom useful.)
As for Siri, Apple was supposed to release an overhauled version of its virtual assistant with A.I. to rival Google’s Gemini this spring, but those plans have been postponed indefinitely after internal testing found that it was inaccurate on nearly a third of its requests. For now, users can talk to the old-school Siri and redirect some requests to OpenAI’s popular chatbot, ChatGPT.
(The New York Times has sued OpenAI and its partner, Microsoft, claiming copyright infringement of news content related to A.I. systems. The two companies have denied the suit’s claims.)
What This All Means
Every major consumer tech company is redesigning its products to include new A.I. technology in the software we use every day, and all the tools still make plenty of mistakes.
In other words, there’s no rush to jump on this bandwagon. But at this rate, Android users will get to experience before iPhone owners what it’s like to have an A.I. phone — a device that uses your apps for you.