Six Colors

Apple, technology, and other stuff

This Week's Sponsor

Magic Lasso Adblock: Effortlessly blocks ads, trackers and annoyances on your iPhone, iPad, Mac and Apple TV

By Glenn Fleishman

The afterlife of Apple Accounts (and others)

Glenn Fleishman, art by Shafer Brown

I know we all like to get a good memento mori jolt now and again, particularly when the world is in turmoil! (That’s sarcasm.) But as I grow older, I can’t avoid thinking about my digital legacy, or that of my parents, as I am the executor named in their wills.

Six Colors reader David apparently has had his mind running down the same path, as he asks:

Should Apple users name a Legacy Contact for their Apple account? What is wrong with simply giving the password info to the executor of the estate without notifying Apple that the account owner has passed?

It’s an excellent question. While I am not a lawyer and this is absolutely not legal advice, this question plays into the larger issue of preparing for your or someone else’s incapacity or death if you or they want to have their digital footprint carried into the future.

Three inevitable parts of life: death, taxes, and data loss

Before the creation of the concept of the cloud, preserving your digital legacy often meant just making sure someone knew the password or passcode for physical devices. If you had accounts online that stored data, they often held lower-resolution images and video or synced items that were present on endpoints. I’m thinking, for instance, of Flickr, which was and remains for current users a way to push media out for private storage or public access, but would never be your sole repository.

You might also have had online storage you accessed via FTP or more secure methods and used to offload data or make it available to other people. Instead of “the cloud” it was “the server” on which you had an account—and you could easily also give someone that account information. (Sometimes that storage was really in the cloud, but it was all storage, like Amazon S3.)

The introduction of service-based storage and syncing, combined with iCloud’s promise of offloading data entirely from local devices, complicated the issue. I don’t know how many hosting companies kept a close watch on access to accounts after someone died as long as the account bill kept being paid.

But companies like Apple, Google, Meta, and Microsoft had made few plans about the legal and technical aspects of ownership and control, even as they accumulated hundreds of millions of user accounts with petabytes—maybe exabytes—of data. Often, an executor or heir would write a company about a late person’s account only to find that the account was instantly locked down or deleted.

Over time, policies emerged from these companies and others that explained precisely what you needed to provide as an heir, executor, or other party authorized by the deceased to obtain access to their cloud-stored data. That may require a step beyond providing a death certificate: an appropriate party has to obtain a court order, which I’ll talk about more below.

Companies then went further, creating special advance-directive tools like the ones from Apple, Google, and Facebook. By using one of these features, the people inheriting your bits avoid cost and time, and may bypass needing a lawyer to draft specialized documents.

Screenshot of Google Inactive Account configuration screen
Google offers a way to set preferences for an inactive account, which can take effect when you stop interacting with the account or are deceased.

Across the last several years, 47 U.S. states, D.C., and one territory have adopted the Revised Uniform Fiduciary Access to Digital Assets Act, or RUFADAA for short. This provides a near-universal legal standard for how you write your digital bequests or disposal requests into a will.

David’s question remains hanging in the air, however…

Why not just share passwords?

The American Bar Association has a straightforward answer, the kind of thing that you would generally not expect from a lawyer, as you assume there would be too many provisos. In a Q&A section about digital property, they ask and answer the following:

[Q.] A loved one recently passed away and I have all of her usernames and passwords for her online accounts. Can I simply log-on to her online accounts using this information?

[A.] Legally, you cannot log-on to her online accounts, whether you have the passwords and usernames or not, if the terms of service governing the account prohibit it. Some online accounts specify that only the original user may access the user’s online account, even when the original user is deceased.

Despite not being a lawyer, I'll pick the nit that it's the original, now-deceased user who is in violation of a contract, not the person logging in. This isn't the same as hacking a computer, although without authorization by the decedent, it could be construed that way? (If you used an account to pretend to continue acting as the person, well, that's beyond my opinion pay grade.)

I don’t know any case of a person being prosecuted for accessing an account innocently when they are a legitimate party to act on behalf of the person who is gone. However, I have read stories from time to time about cloud hosts and other service providers locking or deleting accounts when executors, heirs, or others continue to log in. It’s not like online services have death records feeds they use to close accounts, but there is some mechanism besides an estate contacting them that sometimes leads to awareness of death.

To avoid this, there’s a reasonably straightforward set of operations you can carry out before you’re gone. Sub in “they” for “you” if you’re working with a family member or other person needing help to set this up.

Screenshot with redactions of a signed-up Legacy Account on a Mac
Apple’s Legacy Contact lets you appoint people who will have a lower bar to obtain data you have stored with iCloud after you’re gone.
  1. Update your will to meet the provisions of RUFADAA. Any lawyer who creates wills should know all about it; if not, you likely need a different person.
  2. Set up all the legacy/inactive contact tools at every service that offers one. I’ve listed three above.
  3. Make a list of every online account at which you have digital materials stored, like audio and photos. Share that list with whoever will manage your affairs.
  4. Share passwords securely. Absolutely not legal advice, but if you trust one or more people to have your active information, you can use 1Password or Apple Passwords to share critical passwords or notes about passwords securely, and keep them updated and in sync. If that’s too risky, you may be able to have a lawyer retain a physical file for you—literally pieces of paper—that they can give to an executor or other appointed person.

This last step is a big help if you’re unexpectedly incapacitated and someone needs to run your affairs for you until (or if) you recover.

As Joe Kissell wrote in his Take Control of Your Digital Legacy ebook, a comprehensive guide to the subject:

Your digital legacy—whether, how, and in what ways your data will carry on without you—is a hugely important topic in the 21st century. No matter your age or health, something could happen to you at any time, and having a plan in place to deal with your accounts, files, and other digital data is a great kindness to your family and friends, to say nothing of future generations who may want to know all about you.

If you’d like step-by-step instructions to set up Legacy Contact for your Apple Account, see Apple’s help page or my book, Take Control of Your Apple Account.

We can’t predict the day we will be gone

You may decide to postpone carrying out the above steps, or find the cost of having a will drafted prohibitive, although many lawyers offer flat package rates for such work. More likely, though, you or someone who asks you for help as their technical expert will be the executor or heir of a person who left no instructions and took no steps to facilitate handing off their digital life. That’s very typical, though it should be less so with every passing year due to RUFADAA and growing awareness.

Reader David’s question was prompted, in fact, by a touching, richly informative, and long thread at TidBITS Talk, the forum used by our friends at TidBITS. A different David, Dave K., documented his struggle to get the right documents to recover the creative work of his late sister.

Because she had left no instructions nor set up a Legacy Contact, Apple told him they required a court order. I will be honest that I didn’t realize this was a requirement in that set of conditions, as I set up Legacy Contact as soon as it was available. With that in place, Apple requires only a review of a death certificate and possession of the contact’s access key.

Without that in place or a conforming will, a court order would be required, not just for Apple, but likely for each service. While not a complicated document, it’s almost certain you need a lawyer to draft and process it so that it meets the requirements of the court, and the attorney can respond to any deficiencies. On your own, as posters in the thread note, you could mess it up and have the judge refuse to issue what is typically a pro forma response. Dave K. had a lawyer on tap who would manage this for $750, likely representing roughly two to three hours of work for a lawyer and paralegal.

In the end, he was able to negotiate with Apple over a long period to provide enough documentation and support that he could gain access to his sister’s Apple Account and retrieve everything important online. She didn’t leave passwords behind for her devices, and Apple has no way to provide those, so those remained inaccessible. He was able to get Apple to remove the Activation Lock on all of them so he could erase and resell them.

His story is both a cautionary tale for anyone you love who wants their digital life to be preserved or remembered and an instructive example of Apple listening and helping him navigate to a mostly happy conclusion.

The right to be forgotten

Photo of two large Ks back to back along their main stroke on a cobblestone courtyard outside the Kafka Museum
While Kafka asked to be largely forgotten, the Kafka Museum in Prague celebrates his life and work. (Photo by Glenn F.)

Perhaps writing about Dave K. reminded me of Josef K., the protagonist and victim at the center of Franz Kafka’s The Trial. Kafka, who died in 1924 at age 40, told his close friend Max Brod, who was also the executor of his estate, in the instructions in his will:

Everything I leave behind me…in the way of notebooks, manuscripts, letters, my own and other people’s, sketches and so on, is to be burned unread and to the last page, as well as all writings of mine or notes which either you may have or other people, from whom you are to beg them in my name.

Kafka wanted a few of his novels and short stories to remain available, but not all. Brod famously ignored him, published everything, and wrote a thinly fictionalized novel about his dear friend. Scholars have debated his actions for a century.

The RUFADAA has a Kafka Option, as do some of the legacy contact/inactive account tools. You can state in writing or configure your preferences for everything to be deleted. If that’s your intent, then there’s even more reason to get on with planning. Those around us are more likely to be like Max Brod, and intentionally or accidentally preserve what you asked to be forgotten.

[Got a question for the column? You can email glenn@sixcolors.com or use /glenn in our subscriber-only Discord community.]

[Glenn Fleishman is a printing and comics historian, Jeopardy champion, and serial Kickstarterer. His latest book, which you can pre-order, is Flong Time, No See. Recent books are Six Centuries of Type & Printing and How Comics Are Made.]


By John Moltz

This Week in Apple: Time is the FireWire in which we all burn

John Moltz and his conspiracy board. Art by Shafer Brown.

Apple wants you to know about its latest pet project, WWDC is over but for all the shouting, and we bid FW (farewell) to FW (FireWire).

Oh, you have a movie coming out?

There is just kind of a lot right now all the time so you may be excused for missing the 10,000 times Apple tried to tell you about it but were you aware that Apple has a movie coming up? Can anyone guess the name? It’s two characters and rhymes with “fu-un”.

Apple, of course, did a humorous video tie-in to the film to kick off the WWDC 2025 keynote, starring Craig Federighi’s hair, but this week it turned the promotion up to F11.

If you went to Apple’s 5th Avenue store this week you might have run into one Tim Cook and actor Brad Pitt who were there to promote said flick. The movie is not called “Pitt Stop”, despite it being about cars and starring Pitt.…

This is a post limited to Six Colors members.


by Jason Snell

Big telescope, big data

Kenneth Chang and Iera Hwang of the New York Times take a deep dive into the unique data challenges of the new Vera C. Rubin Observatory, which is powered by a 3.2 gigapixel camera:

Each image taken by Rubin’s camera consists of 3.2 billion pixels that may contain previously undiscovered asteroids, dwarf planets, supernovas and galaxies. And each pixel records one of 65,536 shades of gray. That’s 6.4 billion bytes of information in just one picture…. Rubin will capture about 1,000 images each night.

Although Rubin will take a thousand images a night, those are not what will be sent out into the world at first. Rather, the computers at SLAC will create small snapshots of what has changed compared with what the telescope saw previously… Just one image will contain about 10,000 highlighted changes. An alert will be generated for each change — some 10 million alerts a night.

Storing, transmitting, and disseminating that much data leads to some interesting problems, like having enough storage onsite in case of outages, stringing fiberoptic cable across the Atacama desert, and processing the images to provide manageable data for astronomers to access remotely.
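The quoted figures multiply out the way you’d hope. A quick sanity check (the pixel count, bit depth, and nightly image count are from the Times excerpt; the terabyte conversion is mine):

```python
# Back-of-the-envelope check of the Rubin Observatory numbers quoted above.
PIXELS_PER_IMAGE = 3_200_000_000   # 3.2 gigapixels
BYTES_PER_PIXEL = 2                # 65,536 shades of gray = 16 bits
IMAGES_PER_NIGHT = 1_000

bytes_per_image = PIXELS_PER_IMAGE * BYTES_PER_PIXEL
bytes_per_night = bytes_per_image * IMAGES_PER_NIGHT

print(bytes_per_image)        # 6400000000 — the article's "6.4 billion bytes"
print(bytes_per_night / 1e12) # 6.4 — roughly 6.4 terabytes of raw pixels a night
```

That nightly haul, before any of the change-detection products, is what the fiber across the Atacama has to carry.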


By Jason Snell

A few notes on iPadOS 26 audio and video capture

an all iPad production

Last week’s Six Colors podcast was recorded entirely on iPads running iPadOS 26, mine in California and Dan’s in Massachusetts. The podcast is usually just for Six Colors members, but you can listen to it here if you want.

You’ll be disappointed if you expect to hear anything special about it, though. We both recorded it on our usual Shure MV7 USB microphones, and it just doesn’t sound any different at all. (For the full iPad extravaganza, I should’ve edited it in Ferrite on my iPad, but for expediency’s sake, I didn’t at the time. I’ve since done that just for kicks, and that’s the image at the top of this story.)

It’s probably worth explaining why this feature has so many podcasters and other creators in a bit of a tizzy. Many podcasts record remotely, with people all over the world, and they usually use some sort of app to have that real-time conversation. It was Skype back in the day, and these days it’s often Zoom or a web-based recording program like Riverside. Because those apps prioritize real-time audio and video over quality, the quality is frequently bad by necessity.

To ensure that the very best audio and video is used in the final product, we tend to use a technique called a “multi-ender.” In addition to the lower-quality call that’s going on, we all record ourselves on our local device at full quality, and upload those files when we’re done. The result is a final product that isn’t plagued by the dropouts and other quirks of the call itself. I’ve had podcasts where one of my panelists was connected to us via a plain old phone line—but they recorded themselves locally and the finished product sounded completely pristine.

The problem has been iPadOS and iOS, which won’t let you run a videoconferencing app and simultaneously run a second app to capture your microphone and video locally. One app at a time is the rule, especially when it comes to using cameras and microphones. Individual iPhone and iPad videoconferencing apps can choose to build in local-recording features if they want, but in practice… they just don’t.

Apple has solved this in an interesting way. What it’s not doing is allowing multiple apps access to the microphone (so far as I can tell, I just tried it and the moment I started a FaceTime call, my local recording app stopped). Instead, Apple has just built in a system feature, found in Control Center, that will capture local audio and video when you’re on a call. It doesn’t work when another app is not currently using the microphone and camera, so it can’t be set to surreptitiously record stuff, and it displays a recording symbol at the top of the screen when it’s running. When you’re done, you can tap that symbol and it’ll save the file to the Files app.

The file it saves is marked as an MP4 file, but it’s really a container featuring two separate content streams: full-quality video saved in HEVC (H.265) format1, and lossless audio in the FLAC2 compression format. Regardless, I haven’t run into a single format conversion issue. My audio-sync automations on my Mac accept the file just fine, and Ferrite had no problem importing it, either. (The only quirk was that it captured audio at 24-bit/48 kHz and I generally work at 16-bit/44.1 kHz. I have no idea if that’s because of my microphone or because of the iPad, but it doesn’t really matter since converting sample rates and dithering bit depths is easy.)
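In practice you’d hand that conversion to ffmpeg, afconvert, or your audio editor, which use properly filtered resamplers. But the rate math itself is simple; here’s a toy linear-interpolation sketch (the function name and the silent test signal are mine, purely for illustration):

```python
# Minimal sketch of sample-rate conversion by linear interpolation.
# Real tools use filtered (anti-aliased) resampling; this just shows
# how 48,000 samples/second map onto 44,100.

def resample(samples, src_rate=48_000, dst_rate=44_100):
    ratio = src_rate / dst_rate
    out_len = int(len(samples) * dst_rate / src_rate)
    out = []
    for i in range(out_len):
        pos = i * ratio                      # position in the source signal
        j = int(pos)
        frac = pos - j
        nxt = samples[min(j + 1, len(samples) - 1)]
        out.append(samples[j] * (1 - frac) + nxt * frac)
    return out

one_second = [0.0] * 48_000       # one second of silence captured at 48 kHz
print(len(resample(one_second)))  # 44100 — one second at the target rate
```

Bit-depth reduction from 24 to 16 bits is a separate step (with dithering to mask quantization noise), which is why editors treat the two conversions independently.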

Even in Developer Beta 1, this feature is pretty solid. What’s missing is a better preview of the audio levels and the ability to adjust audio gain, since different microphones have different gain levels and not all of them are easily adjustable. Beyond that, though, this feature is a winner. Podcasters should be rejoicing—I know I am.


  1. Since we had video off for the Six Colors podcast, the video track was blank and took up no space. 
  2. Why FLAC and not Apple’s own lossless format? My guess is that it’s being done for compatibility and simplicity reasons. 


by Jason Snell

Apple embraces the fast pace of developer tools and AI

This year, John Voorhees and I returned to the scene of the crime—the place where we got a demo in 2024 of Swift Assist, a feature that never shipped that we could’ve sworn we saw demoed live—to see the updated Xcode with AI assistance. Same room, same people, but this time the feature wasn’t just promised, it was shipping in Developer Beta 1.

More to the point, as John writes on MacStories, Apple had entirely rearchitected the tool so that developers can use any AI system they want and update to new models as they become available:

I’m not a developer, so I’m not going to review Swift Assist (a name that is conspicuously absent from Apple’s developer tool press release, by the way), but the changes are so substantial that the feature I was shown this year hardly resembles what I saw in 2024. Unlike last year’s demo, this version can revise multiple project files and includes support for multiple large language models, including OpenAI’s ChatGPT, which has been tuned to work with Swift and Xcode. Getting started with ChatGPT doesn’t require an OpenAI account, but developers can choose to use their account credentials from OpenAI or another provider, like Anthropic. Swift Assist also supports local model integration. If your chosen AI model takes you down a dead end, code changes can be rolled back incrementally at any time.

This is perhaps the best sign that Apple’s attitude toward AI and Apple’s role in the world has changed dramatically since 2024. If, in late 2025 or early 2026, a new coding model becomes all the rage with developers, Xcode will be able to use that model. That’s a big step forward for Apple.


By Dan Moren

At WWDC 2025, Apple played to its strengths

Tim and Craig take the stage at WWDC 2025

Another Worldwide Developers Conference is in the books, and after a week of keynotes, briefings, and travel, I’ve finally had a chance to sit and zoom out to the 35,000-foot view of the company’s latest announcements.

The Apple of 2025 has definitely learned some lessons.

In hindsight, last year’s event seems even rockier, with the company hustling to unveil Apple Intelligence, including showing off features that still have yet to ship. To its credit, it avoided doubling down on those mistakes with this year’s announcements without fully repudiating its previous steps. Instead, the company went back to focusing on the assets that make it the best at what it does. In other words, the ones that let Apple be Apple.

Continue reading “At WWDC 2025, Apple played to its strengths”…


By Joe Rosensteel

tvOS 26 brings minor additions and weird priorities


A screenshot of the TV app in tvOS 26 beta 1 showing a splash screen image for WWDC 2025. The text overlay on the image refers to it as a 'Movie - Special Interest'
They should have given it a theatrical run.

Apple has largely tied major revisions of tvOS to the launch of new Apple TV hardware over the years. Since the introduction of Apple TV+, WWDC’s tvOS “features” have largely focused on showcasing sizzle reels of Apple TV+ shows, and very little about tvOS itself. This WWDC gave us a trickle of announcements that don’t seem to align with what I would consider to be the rough spots in the tvOS user experience.

It is possible that Apple is holding back meaningful revisions until they launch an updated Apple TV box this fall. Maybe they’ll even mention the 10th anniversary of tvOS itself, which was unveiled in September of 2015 at the iPhone 6S launch event. Until then, I guess we should reflect on what’s announced, instead of wish lists of what could be.

Through a glass, darkly

I’m not going to rip into the design in beta 1. It’s mostly a conservative evolution of what came before, but with highlights on edges. However, Apple has really underscored a very specific part of the interface as working as intended, and I will push back on that.

Apple has two kinds of Liquid Glass (Regular and Clear) and Clear is supposed to be used over rich media, like video. The only things that define the existence of the controls are the highlights and brighter/blurry refractions visible through the clear elements.

Well, gee whiz, aren’t clear glass playback controls going to be difficult to see over video, especially when it’s playing through the controls?

To make the controls easier to discern, Apple applies a dimming layer on everything around the controls, but not on the video visible through the controls. It’s like someone stenciled out aftermarket window tinting.

Apple says this is on purpose in its Meet Liquid Glass WWDC video, when demonstrating playback controls on iOS. In its Newsroom post for tvOS, it says: “tvOS 26 is designed to keep the focus on what’s playing so users never miss a moment.”

This is bananas. How is this getting out of the way of the content? You can barely discern the playback timeline and playhead while motion is occurring through the element, which causes it to pulse in a thin strip. What is being achieved here? The playback controls and timeline should be flat. No one is going to feel sad that there’s no glass effect in this one spot, where it serves no practical or artistic purpose other than being a wicked smart shader demo.

Poster through it

Another notable change in the interface is the pivot from horizontal thumbnails to portrait-orientation posters. Apple says that this means more tiles can fit on the screen, but that’s only more tiles visible in one row, and it’s only one additional tile over the smallest scale thumbnails (6 posters instead of 5 thumbnails). The older design had thumbnails that matched the aspect ratio of the TV in various sizes so you’d get more rows with fewer titles visible on screen in each row.

To compensate for this difference in aspect ratio, the text that was below or next to the thumbnails is now on top of them. I’ll let readers debate which is more legible, and whether or not the text is always helpful.

tvOS 18.5 (left) versus tvOS 26 beta (right).

This decision pushes content downward. If you want to see what kind of category you’re in the mood for, you will do more scrolling down, which means it will take you longer to count the number of times the TV app recommends you watch “Stick.” Unless you really want to flip through one particular row of the interface one title faster, it’s not really an improvement.

Used any good profiles lately?

I’m unclear about the continued push by Apple to get developers to adopt Apple’s user profile system. It really doesn’t provide any benefit to the developers of these large streaming services that need to have their own multi-platform profile systems with personalized content recommendations, and it doesn’t provide substantial benefit to households with shared viewing.

A screenshot of the tvOS profile selection screen. It shows a user profile for Joe and a user profile for Jason with a semi-transparent '+' over the corner of Jason, and another '+' next to that. At the bottom of the screen the 'Don't show this screen again' button is highlighted.
Someone had the forethought to include this button in beta 1.

I have no animosity towards user profile improvements whatsoever, and I do appreciate that on your first boot of tvOS 26 you can say you never want to see the profile switcher. However, system-level user profiles just don’t feel like the area of the TV viewing experience that needs this much attention when compared to other aspects.

If I were being generous, I could hypothesize that this emphasis on user profiles is because there will be some genuine effort put into personalizing the TV app based on the active user profile.

Unfortunately, you still can’t express any kind of preference in “personalized” areas of the interface to mark a recommended show as watched (without first adding each title to your Watch List and then marking it there) nor can you express that you have no interest in a title.

Even if increased personalization is on the horizon, there’s no reason to expect that to work as well as the personalization offered in each streaming app’s own recommendation systems. Such a thing requires developer participation and cooperation with Apple.

Speaking of developer participation…

Just keep adding single sign-ons until one of them works

The 10th anniversary of Single Sign-On is next year, so we’ll be celebrating this latest attempt a little early. That first attempt used a convoluted system to recognize your cable provider to authenticate all the individual apps you had that worked with existing subscriptions so you wouldn’t have to sign in. Just 18 months later Apple announced zero sign-on, where if you were on a qualifying provider’s internet network, the apps would authenticate on their own.

It’s safe to say that these systems almost immediately became obsolete because they were centered on a business relationship between customers and service providers that was in quick decline. Apple’s blind spot here was believing that anything not subscribed to via a cable provider would be subscribed to via Apple. Due to Apple’s App Store policies on subscriptions, many streamers have left the App Store behind. That means people have to do little sign-on dances that make using Apple products as frustrating as cheap streaming hardware.

Instead of repairing its relationships with streamers, it’s providing this very latest sign-on feature, which links accounts via your Apple Account email address… but requires streamers to want to implement it. I hope they do, and I hope it works to make everyone happier.

Sing out loud, sing out long

I find myself scratching my head at the announcement regarding using iPhones as microphones to do Apple TV-mediated karaoke.

Look, this feature won’t hurt me, or cause harm to the world—with the possible exception of those within earshot—but it’s such a niche thing to do. I have to imagine that someone took a look at the collection of technologies that Apple had built and realized they could put them together, you know, for fun!

I hope people who use this feature do have fun. But it’s a strangely specific thing to use as a selling point, when there are other use cases for the Apple TV, such as watching television, that might be better places to focus.

Give me more

I want tvOS to improve, and am frustrated when another WWDC comes along and the changes are as minor as they were this year. I hold out some hope that there’s more to announce, and it’s being held back for a new Apple TV hardware announcement. But for now, we’ve got tvOS 26… and it cuts down on information density and makes timelines see-through.

tvOS needs to sort out the dichotomy between the home screen and the TV app. The current TV app is a mess and needs to be upgraded to support features that Apple has never taken a single pass at, like a universal live guide. I don’t expect them to be perfect, but it would be nice if we could see that Apple is making an effort. Change is long overdue for a platform that many take for granted. Apple needs to try harder at the TV part of tvOS.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


Our interest in drones and how we’d use them, household standards for wearing Bluetooth headphones, whether we install OS betas and on which devices, and the rare app or service where we don’t mind seeing ads.


Federico, Federighi, and iPadOS 26

Federico Viticci of MacStories got to sit down with Apple’s Craig Federighi to discuss iPadOS last week, which is a tremendous thing to see, and you can read all about it:

After all the talk over the past 15 years about the “post-PC era”, why have we come full circle to reusing features and UI metaphors that the Mac got right decades ago? I ask Federighi about this. “When you’re designing in a new space with a new set of constraints with a different kind of user in mind, you do guard yourself against whether it would be too easy to just pull the old thing off the shelf and put it here because maybe that feels right, because we’ve lived with it since 1984”, he begins, acknowledging the Mac’s key role in democratizing graphical user interfaces and freeform windowing. “And you ask yourself”, Federighi continues “’Well, but what is the essence of iPad? And if that other world had never existed and one had designed from first principles for a touch-first device…what would a cursor be like? What would windowing be like?’”…

“At the same time, you have to not be allergic to learning from the past”, Federighi adds. “I think the balance we’ve landed on now is saying, ‘Listen, in the case that the right answer for iPad is a consistent one with another device, the Mac, then, of course, let’s use it. But let’s not reach for something on the Mac reflexively, just because it’s there’”.

The Federighi quotes are interesting and Viticci adds a lot of useful context, but don’t miss the fact that this article is also Viticci’s in-depth post-WWDC brain dump of his reaction to the massive changes in iPadOS 26. It’s basically two articles in one, either of which would be a must-read.


By Jason Snell

Apple keeps checking items off my Mac wishlist

clipboard history in macOS Tahoe
Clipboard history in Spotlight on macOS Tahoe.

A couple of years ago, I recalled that in the early days of Mac OS X, I built up an entire array of utilities that allowed me to use my Mac just how I wanted it. I felt utterly naked on a Mac without LaunchBar, for example. But in the intervening two decades since OS X’s early days, Apple has just kept improving the base features of macOS to the point where most of my old “must-have Mac utilities” had become ones I kept around more out of habit than necessity. And in some cases, I’ve stopped using old favorites entirely because Apple’s built-in tools did the job. That’s good, because a new Mac user shouldn’t need to install a half-dozen utilities just to be productive.

WWDC 2025 has made me revisit this same subject, because it turns out that the two biggest limitations of default macOS productivity that I saw back then are both addressed in macOS 26:

Many apps can act as clipboard managers—I’ve been using the one in LaunchBar for years, and Pastebot is a popular favorite—and once you use a clipboard manager, it’s hard to go back to Apple’s concept, unchanged in nearly 40 years, that there’s a single clipboard and once you copy something new, the old clipboard is gone forever. I now reflexively copy multiple items in one app and then paste those items into a different app rather than doing the old back-and-forth. I rely on the clipboard history to dig out an item from half an hour ago without having to look it up again…
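Conceptually, a clipboard manager replaces the single clipboard slot with a bounded history of recent copies. Here’s a toy Python sketch of that idea (the class, names, and capacity are purely illustrative, not how Apple or LaunchBar actually implement it):

```python
from collections import deque

class ClipboardHistory:
    """Bounded history of copied items; newest first, oldest evicted."""

    def __init__(self, capacity=50):
        self._items = deque(maxlen=capacity)  # oldest entries fall off the end

    def copy(self, item):
        # Each new copy is pushed onto the front instead of replacing
        # the single-slot clipboard.
        self._items.appendleft(item)

    def paste(self, index=0):
        # Index 0 is the classic clipboard; higher indexes reach back in time.
        return self._items[index]

history = ClipboardHistory(capacity=3)
for text in ["first", "second", "third", "fourth"]:
    history.copy(text)

print(history.paste())   # most recent copy: "fourth"
print(history.paste(2))  # two copies back: "second" ("first" was evicted)
```

The eviction behavior is the whole trick: with a capacity of 3, copying a fourth item silently drops the oldest one, which is why “an item from half an hour ago” is recoverable only up to the history’s depth.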

Another area of interest is file management and automation. I recently wrote about how Folder Actions is somehow still a thing in macOS. Imagine offering users the ability to select a folder in the Finder or Files and build actions that run whenever that folder’s contents change. Folder Actions enabled some of that, and utilities like Hazel have taken it to the extreme. Sure, power users can run wild with features like this, but I think regular users might appreciate being able to say, “When a file in this folder is older than 60 days, file it away somewhere else,” or “Delete all the disk image files in my downloads folder older than 60 days.” There’s something there.
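A rule like “delete all the disk image files in my downloads folder older than 60 days” is a one-function script today; this hedged Python sketch (the folder and threshold are placeholders) shows the kind of logic such a built-in trigger would automate:

```python
import time
from pathlib import Path

def old_disk_images(folder, max_age_days=60):
    """Return .dmg files whose modification time is older than the cutoff."""
    cutoff = time.time() - max_age_days * 86400  # 86400 seconds in a day
    return [p for p in Path(folder).glob("*.dmg")
            if p.stat().st_mtime < cutoff]

# Example: list (rather than delete) stale disk images in ~/Downloads.
downloads_folder = Path.home() / "Downloads"
if downloads_folder.exists():
    for dmg in old_disk_images(downloads_folder):
        print(dmg)  # swap print() for dmg.unlink() to actually delete
```

Note the sketch only lists matching files; actually deleting them is a one-word change, which is exactly the kind of footgun a friendlier built-in trigger UI can guard against.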

In macOS 26, there’s a built-in clipboard manager that can be accessed from the Spotlight interface, and a new set of Shortcuts triggers lets you run automations when events occur on your Mac or at specific intervals.

I’m sure there are still corners of macOS that could benefit from new features from Apple—there are always new frontiers—but I’m struck by the fact that two of the most glaring areas for improvement have been directly addressed in macOS 26 Tahoe. I can’t wait to spend more time with it in beta this summer.


By Jason Snell

Apple’s built-in transcriber blows away Whisper

John Voorhees of MacStories took Apple’s new Speech framework, available to all developers, for a spin in the macOS 26 beta and got great results in making audio transcripts:

It’s still early days for these technologies, but I’m here to tell you that their speed alone is a game changer for anyone who uses voice transcription to create text from lectures, podcasts, YouTube videos, and more. That’s something I do multiple times every week for AppStories, NPC, and Unwind, generating transcripts that I upload to YouTube because the site’s built-in transcription isn’t very good.

I’ve been using OpenAI’s open-source Whisper system (mosty whisper.cpp) for a couple of years, and while it seems to be more accurate than Apple’s model, it’s also half the speed of the large-v3-turbo model I’ve defaulted to lately.

It’s great to see that Apple is in this game, and even better, that it’s handing the power of this model to app developers so they can built speech-to-text transcription features directly into their apps.


Zaz splits WBD, the neverending Paramount sale, listener letters, and TV picks! [Downstream+ subscribers also get: Apple TV stasis, Clooney on CNN, and the Great Netflix User Migration.]


The Summer of Fun begins with loads of WWDC follow-up, including some big-picture reflections on last week and a discussion of some of our favorite features in Apple’s beta OS releases.


By Glenn Fleishman

Reducing suboptimal iCloud Photos storage sizes

Glenn Fleishman, art by Shafer Brown

iCloud Photos requires mysterious files and processes and often consumes huge amounts of storage space. Six Colors subscriber John writes in with a question about one aspect of this:

I have photos set to optimize storage, and it’s currently (according to DaisyDisk) using 60GB of my 500GB MacBook Air.…But mediaanalysisd is also using 60GB – I understand that’s one of the processes that run when idle, but should it be using that much space, and can I restrict it?

The tl;dr answer is: make sure you’re running macOS Sequoia 15.3 or later. A bug in 15.2 apparently caused this problem for many people. If you’re already on 15.3 or later, I provide more instructions at the end of the article for deleting the cache and the consequences.

Let’s break down how iCloud Photos manages your storage, particularly when you don’t want to store full-resolution images and videos on your Mac.

All those moments will be lost in time, like tears in rain

Photos stores nearly all its data inside a macOS package named Photos Library by default, with a .photoslibrary extension. This package contains a number of folders that allow Photos to perform tasks like retaining your original image and recording modifications that can be reverted later. This structure isn’t designed for humans, but as an efficient way for the app to manage, display, search, and organize media.

If you enable iCloud Photos in Photos > Settings > iCloud, you’re presented with two choices:

  • Download Originals to this Mac
  • Optimize Mac storage
Screenshot of Photos iCloud preferences in macOS
You can choose whether to store full-resolution or optimized media on your Mac with Photos iCloud settings.

(Please ignore Apple’s capitalization. It drives me bonkers.)

I have advocated for a long time for people to devote enough storage on their Mac—or one of their Macs if they have multiple—to store the entire downloaded Photos library. Because iCloud storage of Photos is a black box, there’s no good way to interact with your files backed up there. Without owning a full-resolution local copy that you can archive and update via Time Machine, another local backup option, or cloud backup (and preferably a combination of those), you could find yourself reliant on the iCloud copy.

While Apple has been rock solid with iCloud storage for years, relying on it carries other risks: you could be locked out of your account for a reason Apple won’t disclose. You could suffer a catastrophic set of system failures or equipment losses, such as in a fire or natural disaster, or even a lapse in password and security-code record keeping that leaves you unable to prove ownership. And if someone hijacks your account, they can delete your media via iCloud.com or through a locally synced copy, though that’s less likely.

With my warnings noted, optimizing iCloud Photos storage can be quite effective when you have an enormous library relative to your local storage. I have a 2 TB SSD attached to a Mac Studio to keep my 800 GB Photos Library from filling the internal 1 TB drive. However, I have optimization enabled on my MacBook Pro, because that laptop’s 1 TB internal drive means I lack the storage to keep the whole thing. Question-asker John’s library occupies about 1 TB on iCloud.

Let’s dig into optimization.

Uncontrollable purging

Photos optimizes storage by retaining a thumbnail and other metadata about an image or video but dumping the full-resolution media file, which is retained in iCloud. If you double-click, edit, export, or otherwise preview an image or video, the file is downloaded, offering a circular progress completion graphic in the lower-right corner to let you know something is happening. On sufficiently fast networks, you rarely have to wait except for large videos.
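The optimize-storage behavior described here is essentially lazy materialization: keep a small placeholder locally, and fetch the full asset only on first access. A minimal Python sketch of the pattern (all names are illustrative, nothing to do with Apple’s internals):

```python
class OptimizedItem:
    """Placeholder that keeps a thumbnail locally and fetches the
    full-resolution file only when it's actually needed."""

    def __init__(self, thumbnail, fetch_full):
        self.thumbnail = thumbnail      # small preview kept on disk
        self._fetch_full = fetch_full   # callable standing in for an iCloud download
        self._full = None

    def open(self):
        # Download on first access, then reuse the local copy.
        if self._full is None:
            self._full = self._fetch_full()
        return self._full

downloads = []
def fake_download():
    downloads.append("fetched")         # count simulated network trips
    return b"full-resolution bytes"

item = OptimizedItem(thumbnail=b"tiny preview", fetch_full=fake_download)
item.open()
item.open()
print(f"downloads: {len(downloads)}")  # the full file is fetched only once
```

The progress spinner in Photos corresponds to that first `open()` call; once the full file is local, subsequent views are instant until the purging daemon decides to evict it again.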

Apple uses optimization in a few places, including iCloud Drive (System Settings > iCloud > iCloud Drive), and uses the same philosophy in each case. macOS has some background monitoring to avoid your drive filling up to 100% and rendering it unusable. (This doesn’t always work.)

When some unspecified threshold is reached, various background daemons that are set to optimize can kick in. With iCloud Drive, the least-used files are dumped from local storage first.

With iCloud Photos, however, Apple appears to be extremely aggressive. In every case where I’ve had optimization enabled, the optimized Photos Library has ended up at just 10 to 20 percent of the full library’s size. On the above-mentioned MacBook Pro, my Photos Library occupies under 40 GB of local storage.

Screenshot of Manage Storage from macOS System Settings showing a bar representing in green storage in use and individual apps listed below, including Photos.
You can see the storage that iCloud says your synced Photos Library takes up as part of your iCloud or iCloud+ storage.

(Photos and iCloud are often very erratic about reporting storage and quantities of media. For instance, while the Photos Library is about 800 GB with full-resolution downloads configured on my Mac Studio, iCloud reports it takes up 670 GB online. I’m unclear what extra material makes up a 130 GB difference.)

Having that data in mind, we can circle back to John’s primary question—I am a master of expositional sidetracking—about why the daemon mediaanalysisd and its associated storage are sucking up so much data.

The mediaanalysisd agent’s job is to process images in the background during low-activity periods to perform facial (person and pet) and object recognition. (Apple notes, “Face recognition and scene and object detection are done completely on your device rather than in the cloud.”)

You can see where this daemon stores its thinking by going to the Finder, choosing Go > Go to Folder, and entering:
~/Library/Containers/com.apple.mediaanalysisd/Data/Library/Caches/com.apple.mediaanalysisd

On my full-resolution-storing Mac Studio, that folder takes up 205 MB; on the MacBook Pro with optimization on, a bit more at 386 MB. As I mentioned at the outset, people have reported for a few releases that this cache can grow uncontrollably, but Sequoia 15.3 and later appear to have stanched that.
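If you’d rather check that cache’s size from a script than from DaisyDisk or the Finder, here’s a small Python sketch; the path is the one given above, and the helper simply sums file sizes under the folder:

```python
from pathlib import Path

def folder_size_mb(folder):
    """Sum the sizes of all files under a folder, in megabytes."""
    total = sum(p.stat().st_size
                for p in Path(folder).rglob("*") if p.is_file())
    return total / (1024 * 1024)

# Path to the mediaanalysisd cache, as given in the article.
cache = (Path.home() / "Library/Containers/com.apple.mediaanalysisd"
         / "Data/Library/Caches/com.apple.mediaanalysisd")
if cache.exists():
    print(f"{cache.name}: {folder_size_mb(cache):.0f} MB")
else:
    print("Cache folder not found (nothing to worry about).")
```

Raw file-size totals can differ slightly from what DaisyDisk or `du` report (those count allocated disk blocks), but it’s close enough to spot a cache that has ballooned into the tens of gigabytes.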

You cannot turn this agent off without going through some command-line hoops, many of which are reversed when you install the next macOS update. However, because this is a cache, you can opt to delete it if the storage is causing problems.

Causing amnesia through file deletion

If your com.apple.mediaanalysisd folder is bulging like a failed lithium-ion battery, here’s what you can do:

  1. Quit Photos.
  2. Drag com.apple.mediaanalysisd to the Trash.
  3. Empty the Trash.
  4. Restart your Mac.

Deletion may cause Photos to start over with analyzing people, pets, and things. It might even result in the folder swelling back to its original size! But given the reports online from people dating back to Ventura, and particularly from people who have upgraded to macOS 15.3, this is the best path forward.
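For the command-line inclined, the manual steps above can be scripted. This hedged Python sketch renames the cache folder out of the way rather than deleting it outright, so you can restore it if Photos misbehaves afterward (quit Photos first; the path is the one from the article):

```python
import time
from pathlib import Path

# Path to the mediaanalysisd cache, as given in the article.
CACHE = (Path.home() / "Library/Containers/com.apple.mediaanalysisd"
         / "Data/Library/Caches/com.apple.mediaanalysisd")

def set_aside(cache_dir):
    """Rename the cache folder aside instead of deleting it, so it can
    be restored (or emptied later once everything looks healthy)."""
    cache_dir = Path(cache_dir)
    if not cache_dir.exists():
        return None
    backup = cache_dir.with_name(f"{cache_dir.name}.old-{int(time.time())}")
    cache_dir.rename(backup)  # same volume, so the move is instantaneous
    return backup

moved_to = set_aside(CACHE)
print(f"Cache moved to: {moved_to}" if moved_to else "No cache folder found.")
```

Rename-then-verify is a gentler variant of the drag-to-Trash steps: once Photos has rebuilt its analysis and everything behaves, you can delete the `.old-…` folder to reclaim the space.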

For further reading

Our very own chief, Jason Snell, has the definitive book on the topic, Take Control of Photos, where you can find oodles of information and step-by-step instructions on working with the enormous beast that is Photos.

[Got a question for the column? You can email glenn@sixcolors.com or use /glenn in our subscriber-only Discord community.]

[Glenn Fleishman is a printing and comics historian, Jeopardy champion, and serial Kickstarterer. His latest book, which you can pre-order, is Flong Time, No See. Recent books are Six Centuries of Type & Printing and How Comics Are Made.]


By Dan Moren for Tom's Guide

Apple’s Shortcuts app is getting a huge upgrade in iOS 26 and macOS 26 — here’s how it will help you

You may not know it, but there’s an app built right into your iPhone, iPad, and Mac that can supercharge your experience using all those products — and it’s about to get even more powerful.

That app is Shortcuts, and it lets you automate tasks on your devices, even across apps. You can have it do everything from a simple job of creating a GIF of a Live Photo to a complex system that lets you annotate a podcast as you’re recording it, and way, way more.

You build these workflows in the Shortcuts app by selecting actions and stringing them together: passing in information from, say, the clipboard, having the system scan an image for any text, and then overlaying that text on an image you specify in order to create a meme. With this year’s updates to Apple’s platforms previewed at WWDC 2025, Shortcuts is able to do even more, including leveraging Apple Intelligence features and automating actions on the Mac.

That will make this automation tool even more versatile and able to handle more complex tasks than ever before—even if it still has some flaws that could stand addressing.

Continue reading on Tom's Guide ↦


By John Moltz

This Week in Apple: Let’s talk about something else

John Moltz and his conspiracy board. Art by Shafer Brown.

Apple gives them something to talk about, enhanced Siri is definitely maybe coming, and congratulations, sickos, you can now look through windows on the iPad.

Your ass is Liquid Glass

Congratulations! We have something new to argue about other than the intersection of politics and tech or AI or App Store rules or all of the other things we’re so very tired of arguing about! Yes, almost as if the company wanted to change the subject, Apple announced a new look and feel to all its operating systems.

Turns out, some people hate it! And the beta is so buggy!

Yeah, that’s kinda how early betas work.

There is undeniably a mess of UI problems right now with Liquid Glass, but as someone who has been to more than one rodeo (two, I’ll have you know), I can give you the 411 on what’s going to happen: many of them will get fixed before iOS 26 ships and some… will not.…

This is a post limited to Six Colors members.


iPad podcasting and glassy design

This is surely one of the first podcasts recorded entirely using the local recording feature in iPadOS 26. We discuss new OS stuff and design stuff. Big week!

Become a member to listen to this podcast and get more benefits.


