With over 5,000 five-star reviews, Magic Lasso Adblock is simply the best Safari ad blocker for your iPhone, iPad and Mac.
As an efficient, high performance and native Safari ad blocker, Magic Lasso blocks all intrusive ads, trackers and annoyances – delivering a faster, cleaner and more secure web browsing experience.
The import/export feature, which Apple demonstrated at this week’s Worldwide Developers Conference, will be available in the next major releases of iOS, macOS, iPadOS, and visionOS. It aims to solve one of the biggest shortcomings of passkeys as they have existed to date.
Yep, I’m back on the passkey beat! This is Apple’s implementation of the standard developed by the FIDO Alliance, which handles the specification for passkeys. The goal is to create a system that’s more secure than just outputting a plaintext file full of your sensitive cryptographic keys, while still allowing easy migration between password managers.
Nowhere to go.
On the macOS Tahoe beta running on my MacBook Air, I can start the export process in Passwords, which requires first re-authenticating with Touch ID. You can choose to export either a single item or all your items; in the latter case, you can’t export accounts created with Sign in with Apple or those that were shared to a group by someone else, and exporting will not delete the items from Passwords itself.1
In order to complete the export, you need to select an app to send it to, but as most password managers have not yet implemented the standard, I don’t currently have any options available. 1Password said last fall that it intends to adopt the standard; they’ll likely be joined by other apps, and I wouldn’t be surprised if support rolls out more broadly right around the time macOS Tahoe is released this fall.
Standard passwords can of course still be exported as a file, though the app warns you that they’ll be unencrypted. ↩
On Thursday there was a Six Colors Zoom call for Backstage-level members and contributors alike. Glenn Fleishman asked Jason Snell and Dan Moren about Spotlight. He wondered about the discoverability and the intuitiveness of some of these features. Jason mentioned that Apple views the features as power user features that don’t get in the way if you don’t know what they are. Dan said it would still be nice to have documentation of what all the features were, because it was difficult to know exactly what all the command functions are otherwise.
I piped in with my view that the real missing piece is natural language processing so people aren’t trying to discover commands or read documentation. We still need those other things, but to make this truly accessible we can’t expect everyone to memorize all the Quick Keys.
In March I wrote an opinion piece for Six Colors lamenting how text-to-Siri pales in comparison to typing a web search into your browser. I also compared text-to-Siri to Spotlight, which handles searching better but can’t process natural language requests. What I wrote in March is much broader in scope and encompasses requests like product knowledge.
Apple still isn’t doing any of that right now, but with App Intents and Quick Keys in Spotlight it’s creating the explicit command syntax that could be fed by something interpreting a natural language request.
Think of it like this: this year they’re writing grep, sed, pine, ffmpeg, etc. for Spotlight. A common issue is that people don’t know how to structure commands, so they turn to the web, and to LLMs, to copy and paste arguments and flags for those powerful tools. The tools are more accessible when people don’t have to figure out the flags and arguments themselves, but the explicit commands you pass them are still the foundation for what’s doing the actual file operations.
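To make the analogy concrete, here’s an illustrative one-liner of the kind people paste from the web rather than compose themselves (the pattern and input are made up for the example). The flags do the heavy lifting; something interpreting “pull the version numbers out of this log” could emit exactly this kind of explicit command:

```shell
# A flag-heavy command of the sort people copy and paste: grep's -E enables
# extended regexes and -o prints only the matching text, extracting
# version-number strings from a stream of log lines (sample input below).
printf 'app 2.1.0 released\nno version here\nhotfix 2.1.1 shipped\n' \
  | grep -Eo '[0-9]+\.[0-9]+\.[0-9]+'
# prints:
# 2.1.0
# 2.1.1
```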
Jason said on the call that he thinks that this missing puzzle piece might be as early as next year, and that it seems like the next logical step. It certainly seems more achievable with a foundation like this laid.
Hopefully bozos like me aren’t writing blog posts in two years asking where it is while we ask LLMs to compose our Spotlight queries for us. I’m thinking positive thoughts, though.
[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]
Is that a Mac? Nope, it’s iPadOS 26 running on a Studio Display.
Some of us have spent an awful lot of time pondering the iPad’s use cases as a professional productivity device. As a heavy user of the iPad, I’ve frequently wanted to push it into areas where it wasn’t designed to go because if I could get it to do what I wanted, it would fit in my life better than going back to the Mac.
But a funny thing happened this week: Apple seems to have changed direction, again, when it comes to more advanced uses of the iPad. In the early days, the iPad was clearly being groomed as the future of computing. In the middle ages, after Apple seemed to accept that the Mac wasn’t going to be eclipsed by the iPad, there seemingly remained a fear of letting the iPad come too close to acting like a Mac.
We are in a new era now. Today’s Apple is not afraid to let the iPad run Mac-like windows, complete with stoplight buttons and Exposé. In Cupertino this week, I got the strong sense that whatever dogma there was about not letting the iPad feel Mac-like has dropped away, replaced with an acceptance that the Mac is pretty great at a lot of things—and if the iPad is also great when it does those things, it should just do those things. It’s like a weight has been lifted from the soul of the iPad.
This is good news for advanced users who want to push the iPad to its fullest, of course. But it’s also a move that benefits Apple directly, because—if you haven’t noticed—the company is continually shipping very expensive iPad Pros powered by some incredible hardware, only for the reviews to keep mentioning that the hardware is let down by the less accomplished iPad software. I predict a little less kvetching about iPadOS when the next pricey iPad Pro model rolls around.
Checking the boxes
I’ve been writing about the iPad Pro since it arrived in 2015. I was about a year into doing Six Colors and podcasts as a living, and I was really intrigued by the idea of changing my productivity and breaking out of the laptop box. I wrote stories on my iPad. I edited podcasts on my iPad. I traveled with only my iPad.
But along the way, I built up a huge list of complaints about all the things that the iPad just couldn’t do, things that got in the way of me using it the way I wanted to. The latest version of that list, going into this week, was this:
Can’t record local microphone audio while on a VOIP call
Awkward multitasking and windowing
Limited support for global keyboard shortcuts
Better support for items running in the background
Clipboard manager
Improved Files interface for working with, well, files
I can’t say that Apple checked all the boxes, but after this week, I feel a lot more confident that the remaining ones may be checked in the near future.
I’m going to get the podcasting thing out of the way first. It’s such a niche need, but it’s a huge blocker from a workflow standpoint: you’re recording your podcast or video on a third-party app, but for quality reasons you want to also be recording your local audio and video so that they’re of the highest quality, as opposed to the versions that get compressed and sent over the Internet. It’s easy to do on a Mac, but impossible to do on an iPad… until now. (For the record, this feature also works on iOS 26, which means that podcasters could actually get by with just an iPhone and a USB microphone!)
iPad users will be able to opt for complexity or simplicity.
Next, multitasking and windowing. In earlier eras of the iPad, Apple reluctantly accepted multitasking by introducing Split View and Slide Over, and then later Stage Manager, which created a windowing system that was not Mac-like at all. Windows couldn’t be resized freely, or placed freely, or overlap other windows in the wrong way. But at some point, Apple decided to just throw out that entire system and build a new one that’s unabashedly inspired by the Mac. In iPadOS 26, you can resize windows arbitrarily, put them anywhere, and manage them using the familiar stoplight buttons in the top left corner. (It even supports keyboard shortcuts, so you can Globe-F to toggle full screen, or Globe-Shift-Left Arrow to automatically send a window to the left half of the screen.)
Related: The iPad has a Menu Bar now! This has been something Apple has been creeping toward for four years, since iPadOS 15, but it’s finally here. And you know what? Within an hour of using the iPadOS 26 developer beta, I ended up wondering how to perform an action in an app—and realized I could just look in the Menu Bar. The Menu Bar is one of the great innovations of the Mac, allowing an ordered way to browse through functionality and discover keyboard shortcuts, and why should the iPad be denied it just because it’s such an important part of the Mac? (And yes, Command-Shift-Question Mark will let you automatically search the menus.)
It’s kind of hard to believe that it’s been two years since Final Cut Pro for iPad shipped, answering once and for all the question “why are Apple’s biggest pro media apps not on the iPad Pro?” Unfortunately, it also just showed how far behind the iPad was: Once you kicked off a video export, you had to just sit there and watch the progress bar, because leaving the app would cause the export to fail. Again, iPadOS 26 to the rescue: There’s now a Live Activities-based interface for background tasks (available for all user-initiated tasks with clear end states, such as exports, renders, and file copies) that actually does the Mac one better by coalescing all the ongoing activity in one place. I should be able to leave Final Cut or Logic or Ferrite and move on to something else while the export takes place in the background, just like on my Mac.
There are also enormous improvements in the Files app, where the list view now features customizable columns and folders with expanding disclosure options. You can also control which app opens a file and, yes, even assign a default opening app, something Mac users take for granted that was just never there before in Files.
There’s no clipboard manager or support for global keyboard shortcuts yet, but even there, I’m optimistic. If macOS can gain a clipboard manager after 41 years via upgrades to Spotlight, it’s pretty easy to suppose that iPadOS might be getting similar functionality next year. That Spotlight upgrade in macOS also features a bunch of other power-user productivity boosts that would work well on the iPad, adding keyboard-based control power that might make my desire for global keyboard shortcuts less strong.
As a fan of the original iPad pointer, I’m sad to report that it’s been replaced by a new, Mac-inspired one. The reason the old one died is a pretty good one: it was meant to represent the touch target of iPad software designed for fingers, and Apple is now accepting that sometimes pro users want more precise pointer control than that. (Also, those new stoplight buttons are smaller than the old pointer circle!) I’ll miss the morphing cursor because I think it might’ve been the strongest example of the iPad rethinking and outdoing an old Mac idea, but the new pointer fits like a comfortable old shoe.
Easy or expert?
One of Apple’s greatest challenges is its own success. It’s got millions of users across a wide spectrum of demographics, geographies, and levels of expertise. How do you create a single product that can be what it needs to be for all of them? This can lead to discoverability problems for new features, overly complex interfaces for novices, and frustratingly simplified features for experts.
The iPad is the device where this struggle has been out in the open, though I’d argue it affects the iPhone and Mac just as much. On the iPad, though, the divide is pretty stark: A lot of people really never want to do anything but use one app at a time. They’re never pressuring the processor. They’re not connecting peripherals, even Apple-built ones. How do you give the people who want more what they want, without wrecking the experience for the much larger group who like it simple?
Apple’s taking another cut at this, and it seems to me that by following the Mac’s lead, they’re setting the iPad up for success. Nobody, not even power users like me, wants to see the simplicity of the basic iPad experience degraded in any way. I think they’ve done a pretty good job of adding pro features without breaking it for everyone else. We’ll see how it goes over the summer and into the fall.
It’s not an obvious candidate for a classic. A commencement address by a college dropout. A talk aimed at 22-year-olds that warns “You will gradually become the old and be cleared away.” A text as shadowed by reality as it is soaring with inspiration: “Your time is limited, so don’t waste it living someone else’s life.”
There’s also a remarkable set of emails that Jobs sent himself with notes about what he wanted to say in his speech, and a lot of details (like his nerves, and Apple PR’s attempts to suggest things for him to say) that I hadn’t heard before.
It remains one of the most remarkable speeches you will ever hear. And certainly one of the best things to ever happen at Stanford Stadium.1
French media company Canal+ announced this week that it’s working on a documentary about French motorcycle racer Johann Zarco at the French Grand Prix, which would probably not be notable for most people were it not for how it was made:
Produced in collaboration with Apple and MotoGP, a competition organized by Dorna, this new documentary event is the first Apple Immersive Video production filmed entirely with the new Blackmagic URSA Cine Immersive camera. CANAL+ will be the first global studio to publish content in this exciting new storytelling format for Vision Pro.
MacBreak Weekly listeners/viewers will know that we’ve been discussing the potential for Immersive Video on the Vision Pro to expand rapidly once non-Apple filmmakers have access to Blackmagic’s new Immersive camera. This documentary is the first, but hopefully it is the beginning of a bigger trend.
I had a chance to watch the immersive teaser for this documentary, and it looked great, with a quality very much in line with other Immersive videos on the Vision Pro. I don’t know if I care a lot about the French Grand Prix, but I am a bit of a sucker for a good sports doc—and an Immersive one? Sign me up.
Canal+ says the documentary will be available in September.
My visionOS 26 persona. The side of my head really does look like that!
If visionOS and the Vision Pro are all about charting a course to the future of wearable devices in front of our eyes, Apple needs to keep pushing toward that future at every opportunity. Fortunately, visionOS keeps moving forward, with several substantial feature improvements that have rolled out in updates over the past year-plus.
With visionOS 26, Apple keeps pushing, as it should. Apple had already taken its most uncanny launch feature (the dead-eyed Personas) and made it shine with a software update; it could have paused there for a while, but instead, visionOS 26 ups the game.
Spatial Personas are now the default, and there’s an entirely new Persona engine that makes them look remarkably better. The old Personas looked good straight on, but from a bit of an angle, they looked like a face tacked on to a flat piece of cardboard or something. These new Personas capture more of the side of the head, capture hair and eyelashes better, and do an incredible job of capturing skin details. Unfortunately, while beards look better, they still limit a Persona’s mouth movement.
Another drive forward is geographic persistence. In the long run, assuming AR glasses are a thing (which is what we’re all assuming here, because that’s why this whole project exists), you’ll want to be able to place an item somewhere and have it appear there when you come back to it later. In previous versions of visionOS, there was basically no item persistence at all—if you rebooted the Vision Pro, all your windows were closed, and you needed to set them up again.
visionOS 26 fixes all of that. Now you can leave items in one place and they’ll appear when you enter that space, even if the Vision Pro has rebooted or shut down in the interim. Windows are always where you left them. It’s great for short-term reusability, and a must if you take the long view.
A big beneficiary of geographic persistence is the new ability for visionOS to use widgets from other apps. visionOS will let you place widgets on physical surfaces like walls, where they’ll remain anchored. I’d enjoyed using Widgetsmith to place a clock on my ceiling or Windora to put a picture-that-looks-like-a-window on my wall, but without persistence, I gave up. You can browse widgets by launching the new Widgets app and then placing them wherever you want. There are a few beautiful and subtly three-dimensional clock widgets, a photo widget that basically Sherlocks Windora, and many more.
Immersive Environments (the desktop wallpaper of visionOS) are another favorite feature, and I admit that I’m disappointed that Apple didn’t add a bunch of new Environments to the mix, nor apparently enable third-party developers to contribute their Environments to the system as a whole. But the one new Environment Apple is adding also shows off some new extensibility and interactivity: the Jupiter environment, which I got to try briefly, lets you adjust how fast time passes (do you want to stare at Jupiter’s gas bands swirling rapidly, or do you need to slow it down to get some work done?) and jump to different points during Jupiter’s day. As with so many aspects of visionOS, all I can say is: this is great… more, please.
In a welcome sign of rapid iteration, Apple has thrown out last year’s algorithm that turned flat photos into remarkably good 3-D ones, and replaced it with the same multi-layered spatial scenes that it’s featuring in lock-screen effects in iOS 26. The result is an image that doesn’t just look 3-D, but which adds more of a perspective change when you move your head toward the image or from side to side. It can’t reveal information that’s not really there, of course—there’s some smudgy generative filling going on in the background—but the effect is still impressive.
Another feature that’s more for the future than it is for today is support for consensual viewing of items when two people are using Vision Pro in the same room. Right now, it’s awfully unlikely that you and a friend are going to bring your combined $7000 in Vision Pro hardware together just to watch a movie or play checkers, but as more people get devices like this, you’ll need the ability to share widgets and objects and whatever in person, not just remotely. Apple has presumably implemented this feature by combining its existing SharePlay technology with the same stuff that powers geographic persistence.
In any event, I was able to manipulate a shared 3-D model of an astronaut in a space suit in collaboration with an Apple representative who was wearing his own Vision Pro, and we walked around it and gestured to it as if it was a real thing, because we both saw the very same VR object. I don’t know if I’ll use this feature any time soon, but it shows how Apple continues to build out features that it’ll need in its AR platform of the future.
I also got to live out an alternate life as an extreme sports enthusiast by watching some Insta360 footage, courtesy of visionOS 26’s new support for extremely wide field-of-view video formats from 180- and 360-degree cameras from the likes of Insta360 and GoPro. I was able to verify this with some 360 footage I shot myself (of people playing Dungeons and Dragons at a table, and not an adventurer parachuting into snowy backcountry—c’mon, this is me!). After opening the file in Files, visionOS asked if I wanted to convert it to a more Apple-friendly format, and began playing the video immersively.
There’s a new Spatial Browsing mode in Safari that answers the question, “What if Safari Reader, but in three dimensions?” When you enable Spatial Browsing, everything else drops away and you’re able to focus on a Safari Reader-like view of webpage text. As you scroll, images in the webpage are automatically converted and displayed as a spatial image, using the same algorithm you can use for your own photos. Is this feature necessary in life? No, but the very nature of visionOS makes me perfectly happy to witness Apple deploying a few wacky features and asking us, “Is this a thing?”
Unfortunately, I didn’t get a chance to try one of the visionOS 26 features that excites me the most: support for hand controllers, namely the PlayStation VR 2 Sense controller. I’ve felt strongly for a while that Apple needs to expand what’s possible on Vision Pro by adding support for games. No, the $3500 Vision Pro is never going to be a game console, but by far the weakest thing about the platform is a lack of content—and there are plenty of VR games out there that could make the platform more appealing, especially if more affordable versions are coming eventually.
As always with visionOS, it comes back to the long game. As long as Apple keeps pushing forward and building out its AR platform of the future, I’ll be confident that the company is on the right track. visionOS 26 offers robust evidence that the work remains ongoing.
The most contentious decision about tabs since the soda got discontinued.
For a company that’s long been known for its “my way or the highway” philosophy when it comes to design, some of Apple’s latest interface choices have abandoned that approach in favor of the realization that, well, change is hard. So, instead of throwing its users into the deep end, Apple has started more commonly letting them continue using an older interface if it suits them.
That’s a categorical no.
This approach began most meaningfully with Safari, back when Apple first moved the location bar to the bottom in iOS 15. While it clearly thought the new location was superior—else, why move it?—it also realized that some users were perfectly happy with where it was.1 So it split the difference: in Settings, you could choose to use the bottom “tab bar” or the top “single tab” design.
More recently, we’ve seen this trend continue with a few of Apple’s other app redesigns. For example, the company added the Mail Categorization features in iOS 18 last year, followed earlier this year on macOS and iPadOS, but it also included an option—right within the Mail app—to swap back to the old List View mode. And, if my anecdotal evidence is any indication, a lot of users have.2
With this week’s announcements of Apple’s latest updates, there’s obviously a big redesign in play across the company’s platforms. But one thing that struck me was that Apple has gotten increasingly upfront with this approach: for example, the Phone app features a brand new unified design, but even in the keynote itself, Apple was careful to present switching to this interface as “an option”—preemptively defusing the annoyance of users who might otherwise balk.
The new Phone app lets you keep your options open.
Similarly, the new version of Safari on iOS once again makes changes to the way the location bar is presented, getting rid of the previous toolbar in favor of a minimalist floating palette. Except that you can still switch back to the bottom or top address bars if you want.
So choice
So, what’s behind this move? Is Apple less confident of its design chops than it’s been in the past? Or is it simply a practical recognition that giving people the choice to stick with the familiar option makes its users happier?
Apple certainly hasn’t taken this approach with every single design choice it’s made. Take, for example, last year’s redesigned Photos app, which threw out the old toolbar model for a combined view. While the app evolved during the beta period, it still eventually shipped in a much-changed state. Until this year, anyway, when Apple once again added a (reduced) set of tabs.
Tabs, you’re back! We hardly had time to miss you.
Similarly, Apple also showed off a new interface for the Camera app. The new design attempts to streamline what had become a slew of options, putting the most common ones front and center. It’ll be interesting to see how that’s received during the beta process—and whether Apple will end up making any concessions for those who find the new interface alienating, given that it’s probably one of the most important and most used apps on the phone.
Ch-ch-ch-ch-changes
One thing that I think might be at the root of all of this is a concession from Apple that our devices—and our smartphones in particular—have become so critical to so many aspects of our lives that they can’t simply be redesigned overnight. We depend on them so much that even small changes can be jarring, and drastic changes can be severely off-putting. This is one of the challenges of having a product that exists at the monumental scale of the iPhone.
But this approach isn’t without risks. By keeping the old interfaces around, does Apple end up committed to having them available…forever? That’s a problem too. Because if you always leave the safety net of the familiar old interface in place, it becomes harder to convince people to move forward and try something new. And, as a result, those interfaces can become calcified, impervious to change even if there might be a better way.
I also might argue it goes all the way back to the change to “Natural Scrolling” in Mac OS X Lion, which is still an option to this day. ↩
Which is kind of wild, when you think about it, because it means a swath of users are just opting out of what was a significant new feature…indefinitely? ↩
[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His next novel, the sci-fi adventure Eternity's Tomb, will be released in November 2026.]
I use Dropbox for storing all my work and personal files and syncing between my three Macs. The plan was to move my Dropbox folder and my Apple Photo Library to the external SSD, leaving the internal SSD mainly for Applications. The issue I am having is that Dropbox re-indexes every time I restart the Mac, taking about 6–7 hours, then syncs for another 4–5 hours.
Lindsay wonders why Dropbox doesn’t work correctly on a non-startup volume, a feature that Dropbox supports.
Containment breached
Dropbox offers step-by-step instructions for relocating its folder from your startup volume to another volume, whether on the same drive or an external drive. Those details are straightforward. But I’m more interested in the requirements before you move the folder:
Your Mac has to be running macOS Catalina (version 10.15.4) or later.
The volume has to be formatted as APFS (Apple File System), the filesystem Apple transitioned macOS to several releases ago.
The volume must be encrypted.
Finally, you can’t have more than 500,000 files in your Dropbox folder.
The first requirement is easy enough—just check About This Mac in the Apple menu.
Check your volume’s format type via Disk Utility.
Use Disk Utility (Applications > Utilities > Disk Utility) to examine the drive. With a volume selected in the sidebar (View > Show Sidebar), the main Disk Utility window displays information, including the formatting type. It will appear as APFS Volume if formatted as APFS. If not, consider converting to APFS—there’s typically little reason to keep a volume as another format unless you have a historical purpose for it. Control/right-click the volume in the sidebar and choose Convert to APFS. (This operation is non-destructive, but be sure you have a backup beforehand.)
Encryption isn’t shown in this main view. You can either:
Click the info (i) button with the volume selected and look for the “File system” entry, which should read “APFS (Encrypted).” There is also an “Is encrypted” item: a Yes or No indicates the status.
Control/right-click the volume on the Desktop. If it’s encrypted, the menu shows Decrypt as an item. If it’s not encrypted, then Encrypt appears.
If you haven’t yet enabled encryption, the easiest way is through that Finder menu. Make sure to generate and store the volume password in a password manager, as it will otherwise be unrecoverable if you can’t recall it later.
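As a sketch of the same checks from Terminal: on macOS, `diskutil info /Volumes/YourVolume` prints, among other things, a “File System Personality” line and an “Encrypted” line. A small shell function can pick those out (the volume name and the exact sample output below are assumptions; the parsing just keys off those two labels):

```shell
# Hypothetical helper: reads `diskutil info` output on stdin and reports the
# volume's filesystem personality and encryption status. On a Mac you'd run:
#   diskutil info /Volumes/External | volume_status
volume_status() {
  awk -F': *' '
    /File System Personality/ { printf "format=%s\n", $2 }
    /^ *Encrypted:/           { printf "encrypted=%s\n", $2 }
  '
}
```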
Check your Dropbox folder to see if you’ve exceeded the 500,000-file limit for non-startup volume storage.
The maximum file limit can be checked in the Finder. Because of how Apple moved cloud-based storage services into a separate hierarchy, you can no longer directly select the Dropbox folder and then choose File > Get Info. Instead, click the Dropbox folder in the Finder sidebar or click the Dropbox menu icon and click the folder icon to open Dropbox in the Finder. Now, with no folders selected within the window, choose File > Get Info.
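For the command-line inclined, the file count can also be checked from Terminal (a sketch; the `~/Dropbox` path is a guess and depends on where your Dropbox folder actually lives):

```shell
# Count the files under a folder, to compare against Dropbox's 500,000-file
# limit for non-startup-volume storage. The path argument is a placeholder.
count_files() {
  find "$1" -type f | wc -l | tr -d ' '
}
# Usage on a typical setup (path is an assumption):
#   count_files "$HOME/Dropbox"
```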
Lindsay thinks the file limit was the problem. Unfortunately, while Dropbox has that limitation, the company doesn’t explain why, nor does it present an error message via its app, a notification, or any other method when you have too many files for a non-startup volume to work.
Making sense of storage
For more help in making sense of managing storage on your Apple devices, you might consult Jeff Carlson’s Take Control of Your Digital Storage. The book examines all kinds of attached, network, and cloud storage as well as encryption, choosing filesystem types, and troubleshooting.
[Got a question for the column? You can email glenn@sixcolors.com or use /glenn in our subscriber-only Discord community.]
Live from Apple Park (mostly). Apple’s new Spotlight for Mac, how iPadOS 26 will change our iPad usage, Apple’s latest improvements to messaging, and the Liquid Glass redesign.
Software is more than just a new version number: at this year’s Worldwide Developers Conference, Apple rolled out expansive updates across its platforms, with a brand new Liquid Glass design, access to AI models for third-party developers, and new features.
But you’ve heard about all of that, I’m sure, so we’re not going to rehash it. Instead, let’s get personal: I’m picking out, in my opinion, the best and worst new features of each of Apple’s platforms. To be clear, these are my completely scientific and totally well-reasoned expert opinions on the features that were announced, not just some off-the-cuff reactions less than a day later.
macOS
Best: Spotlight enhancements
Yes. This. 100, as the kids say.1 Not everyone with a Mac is a power user, but power users probably disproportionately use the Mac, so a feature that is aimed at them feels like a tacit admission that the Mac is going to keep being the Mac. I gave up third-party launchers a long time ago, and I’m glad to see Apple finally embracing the more powerful features they can offer.
Worst: Disappearing menu bar
Didn’t we go through this back in 2007? The menu bar is an integral part of the macOS experience; why don’t you want me to see it?
iOS
Best: Spam filtering in Messages
Long overdue. There was some facility for third-party filters in the past, but they were never really used.2 But if I never get another weird spam message asking if I left my sunglasses at their house, it will be too soon.
Worst: Tapbacks in CarPlay
Maybe calling this the “worst” is overselling it, but a) there wasn’t a lot else in iOS 26 that I found objectionable, and b) do I really want people tapping on messages and trying to pick the perfect emoji while driving? I do not.
iPadOS
Best: Improved windowing
Finally, after what feels like seventeen different multitasking approaches, Apple has hit upon a surefire winner: windows you can place anywhere, manage via Exposé, and tile to parts of the screen. Maybe they finally developed a time machine and went back to Xerox PARC.
Worst: Games app
It’s a supercharged Game Center, but that doesn’t do much for me. I appreciate that at least it tends to show off apps that I actually have played, but a giant splash screen of games recommended for me feels like an ad more than anything.
watchOS
Best: Configurable widgets
I like the Weather widget on my Watch, but I’m annoyed that it forces me to include the wind direction and AQI along with temperature. Being able to pick the items it’s showing is vastly preferable—I always like to have the UV level in there to know how much I’m going to sunburn. Not if. How much.
Worst: Workout Buddy
Look, I’ve recently started running again, using the Nike Run Club app, which has a chipper (maybe too chipper?) coach who roots you on. But for all that it’s an actual person, not a robot. Your robot enthusiasm does nothing for me, you hear me? NOTHING.
visionOS
Best: Widgets
Look, the ability to create little widgets and leave them in persistent places around your workspace is just cool. It’s a wholehearted embracing of spatial computing. The Vision Pro may be too rich for my blood3, but Apple seems committed to pushing that forward, and more power to them.
Worst: Image Playgrounds
We failed to keep the virus contained.
tvOS
Best: There’s still a tvOS
Damning with faint praise, but the Apple TV continues to be the device I use the most that Apple seems to care about the least. tvOS doesn’t even get a tab in Apple’s preview of all the new operating systems. Poor kid.
Worst: Sing in Apple Music with iPhone
I really do not need to hear your off-key rendition of “Africa”, much less amplified through my soundbar.
Do the kids say it? I don’t know, my kid is too young to be up on all the slang. ↩
Weirdly, I think those features were deployed more aggressively in other countries, but never in the U.S. ↩
[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His next novel, the sci-fi adventure Eternity's Tomb, will be released in November 2026.]
So last year’s WWDC was too bold, too loud, too defensive. And, as it turned out, too aggressive in promising features Apple couldn’t deliver.
This year’s WWDC strikes me as Apple sticking to its knitting a little more, focused on what it feels it can currently do well. Apple is not a leader in developing AI models, but it does make a bunch of devices that people use every day. Maybe focus on that a bit more?
The sense I get from Apple, based on the keynote and various conversations around Apple Park today, is that the company wants to revert a little to what it used to do quite well. For years now, it’s been building features that use AI (what it used to call machine learning) to improve features scattered throughout its operating systems.
Make no mistake—Apple’s still committed to AI and to trying to catch up with the rest of the industry. But on Monday, I saw industry commentators complaining that Apple couldn’t match up with Anthropic’s Claude editing Rakuten’s code base for seven hours or Google’s firehose of new features from I/O with varying degrees of weirdness and likelihood that they will ship anytime soon.
Trying to make those kinds of commentators happy is what got Apple into this mess. Is anyone frustrated that Apple’s not generating weird AI videos or advanced coding systems all by itself? Apple’s AI stuff needs to get better, but what the company really needs to be is a builder of platforms that are good for users, including those who want to use AI to perform tasks.
On that score, Apple Intelligence does not seem to have faded away. Apple talked about it up front in the keynote, despite the fact that it knew it would be judged for what it did last year and that owning up to its failure to ship certain features (now due by the end of this year) would sting a little.
Apple is adding new features to Visual Intelligence, a feature that has never really seemed essential. Now it’ll analyze screenshots of your device interface, using on-device models to find the most relevant items in images and process them in interesting ways, from creating calendar events based on images to performing image searches in any app that builds an App Intent to give Visual Intelligence access to its image search features.
Live translation is a feature that will be welcome, though it’s a place where Apple lags behind similar features from Google. Automatic translation in Messages has been a long time coming, while the more intense FaceTime or phone call audio translations are cooler (and are limited to a small palette of languages, for now).
Apple’s generative model, used in Genmoji as well as Image Playground, has apparently been updated, but in a nod to how far behind the curve Apple’s model has seemed, you can also now just use ChatGPT’s generative models to create images in Image Playground. I think it’s actually a good example of how Apple doesn’t necessarily need to build every AI feature out there.
Speaking of Apple’s models, they’ve been updated, and Apple is opening them up much more broadly. App developers have direct access to the smaller on-device model, with relatively free rein to build features based on it. Even more impressive is the latitude being given to actions in Shortcuts on the Mac, which can use the on-device model, Apple’s Private Cloud Compute, or even ChatGPT to perform tasks and return data. (It’s interesting that individual Shortcut developers get access to Private Cloud Compute before app developers do.)
There are several places where Apple Intelligence has just been diffused into the system, where you least expect it. A new Reminders item in the Share Sheet will take any text, including that on a web page, and parse it for possible to-do items using Apple’s on-device model. Then the user can choose which items to add to Reminders. Reminders has also been updated to support the use of Apple Intelligence to auto-categorize those items.
However, there are a few areas where Apple does still seem to be pushing its AI message a little bit beyond what is required. After seeing a couple of demos of Workout Buddy, a feature that provides AI-generated motivational interjections while working out, I’m pretty sure I hate it. The feature uses an artificial voice to essentially repeat a load of stats that are already being displayed on the Apple Watch, with the occasional exhortation that you’re “doing great” or “crushing it.” At first blush, this seems like the kind of feature that could’ve been built without AI at all.
After last year, Apple could’ve been forgiven for wanting to soft-pedal this year’s Apple Intelligence announcements and regroup. It didn’t do that, nor did it double down on last year. Instead, it’s chosen a middle ground—a bit safe and familiar but also a place where Apple can feel a bit more like itself. In the long run, it needs to get this right. In the short term, maybe it should focus on meeting its users where they are, rather than pretending to be something it’s not.
Live from Cupertino, Jason has his in-person reactions to Apple’s WWDC announcements. And in London, Myke dangerously installs betas while reacting to momentous iPad and Mac news.
It’s the end of an era! Five years after the introduction of Apple Silicon on Macs, and two years after the company fully transitioned its product line to the new architecture, the end of the road is in sight: the company announced on Monday that this year’s release of macOS Tahoe will be the last to support Intel-based Macs.
The field has already been winnowed somewhat: this year’s macOS Tahoe supports only a few Intel Macs, including the 16-inch MacBook Pro from 2019, the 13-inch MacBook Pro from 2020 with four Thunderbolt 3 ports, the 2020 27-inch iMac, and the 2019 Mac Pro.
Intel Macs are not being entirely thrust out into the cold, however; they’ll still get security updates for three years.
Personally, this feels momentous: the first Intel Macs were introduced in 2006, the same year I first started writing about Apple.1 Twenty years later, they’ll be on their way out. But it’s hard to argue with the tremendous success of Apple Silicon in terms of performance and battery life. The Intel Macs have had a good run, but they’re about to relocate to a farm upstate where they can frolic with all those 68k and PowerPC Macs.
My very first Apple keynote was sitting in the overflow room at Macworld Expo 2006, where Steve Jobs announced the Intel iMac and the new MacBook Pro. ↩
On an Apple TV, you can enable play history just as on other devices linked to your iCloud account.
A couple of questions related to syncing your TV app usage across devices have come across the transom. Six Colors subscriber Jason owns two Apple TVs of different vintages, both updated to the latest tvOS production release. He notes:
The older [HD] 4th generation device always seems to lose the ability to update the TV app for the Continue Watching list. I’ll watch something on TV+, and it won’t update the list; same for Max, Hulu, Paramount+, etc. Manually marking episodes as ‘watched’ on that device won’t reliably push that status across other devices (the 4k Apple TV, iPad, iPhone).
Six Colors reader Mark had a related question:
I use the TV app on two different Macs. My play counts and other info does not transfer between Macs. It seems that only my iMac gets info from Apple TV, not the MacBook Pro. Is there a way that I may copy that info (play counts, ratings, etc.) from iMac to the MacBook Pro?
It’s a minor inconvenience but one that itches like a mosquito bite. The whole point of sync is that we don’t have to track these little nubs of friction—we defer to the device ecosystem to make it work. Is there anything we can do to fix this?
Viewing history should just sync
Enable play history sync in the TV app’s settings in iOS.
The only exposed interface for managing how the TV app tracks what you’ve watched and other related data is in the app’s settings on each platform:
In tvOS, iOS, and iPadOS: Go to Settings > Apps > TV and make sure Use Play History is enabled.
In the TV app for macOS: Go to TV > Settings > Play History. Use Viewing History should be checked.
As long as all of the devices you want to sync are logged into the same Apple Account used for iCloud syncing, metadata should sync freely among your devices.
As is usual with Apple’s black-box approach to settings and features, you could try toggling the states above: disable, enable. Wait a bit. Does that help?
If not, the only obvious way to affect this is by clearing the play history, which is not desirable as you lose all that information. To clear that history:
In tvOS: Go to Settings > Apps > TV and choose Clear Play History.
In the TV app for iOS and iPadOS: Tap your avatar or initials in the upper-right corner of the app, then tap Clear Play History.
In the TV app for macOS: Go to TV > Settings > Advanced and click Clear Play History.
The Advanced settings view in the Mac TV app lets you clear play history—which might not solve the problem for long.
For now, that’s the end of the line. People have posted complaints on Apple and related forums about how play history broke for them with the fall 2024 releases and hasn’t started working again. They can sometimes resume syncing temporarily by pursuing either the toggle or clear strategy above, but it doesn’t stick.
It sounds like our only hope is to wait for it to be fixed in tvOS 19—or, as is being reported, tvOS 26.
For more Apple TV insight
If you’re looking for advice in setting up or using an Apple TV, particularly a 4K (1st generation or later), take a look at Take Control of Apple TV, Fifth Edition. I recently took over revising this book from original author Josh Centers, and we just released a version fully up to date with features through tvOS 18.5.
[Got a question for the column? You can email glenn@sixcolors.com or use /glenn in our subscriber-only Discord community.]