With over 5,000 five-star reviews, Magic Lasso Adblock is simply the best Safari ad blocker for your iPhone, iPad and Mac.
Designed from the ground up to protect your privacy, Magic Lasso blocks all intrusive ads, trackers and annoyances – stopping you from being followed by ads around the web.
Churn is a subscription-industry term that refers to losing subscribers. If you run a subscription-based business, you want loyal customers who don’t cancel their subscriptions. The more cancellations, the more money and effort you need to expend to replace them with new subscribers. Low churn rates are good.
Which is why I was fascinated to see an Antenna report that Apple TV+ has the highest churn rate among major streaming platforms. I’m surprised because Apple’s customer base is loyal and I would think that the Apple One Bundle helps them a lot, but I’m open to the idea because Apple famously hasn’t licensed much content for its streaming service, and so its catalog is considered to be scant when compared to others.
But Apple TV+ has been around since 2019, and that means that Apple has had more than five years to get its content-generation engine in gear and fill up that catalog. Maybe people just subscribe for a single buzzy show and then don’t bother to look through the catalog?
Anyway, in the spirit of helping out anyone who’s got an Apple TV+ subscription for a month or two just to watch “Severance,” I thought I’d give you a list of stuff to watch on Apple TV+. Just a note: this is stuff I have personally watched and like enough to recommend. I’m not listing shows my friends and family liked, or that I have heard are good: these are my personal favorites.
Netflix and Amazon prepare to celebrate NFL Christmas, the PVOD vs. Theatrical debate rages on, Gunsmoke is having a moment, Max rebrands again, Jason watched some Netflix ads, and did we mention that Julia is back?
With Apple having delayed its most ambitious Apple Intelligence features, you might find yourself thinking: it’s probably best if the company just focuses on other endeavors for the time being, like a rumored iOS redesign or even pushing forward its Vision Pro platform.
Nope, not so much. According to a report from Bloomberg’s Mark Gurman, Apple is pressing ahead with more AI-based features, most prominently an “AI-based doctor” dubbed Project Mulberry.
Yep: Just when you think you’re done talking about AI, they pull you back in.
The latest episode of the Twenty Thousand Hertz podcast takes a stab at telling Apple’s accessibility story through sound—not only the sound of a host and his interview subjects, but the way Macs and iPhones sound when they speak to people who use their accessibility features. From the VoiceOver screen reader to AirPods Pro’s hearing assistance features, the episode—which includes interviews and other sound bites from Apple execs—traces a 40-year story of doing the right thing, combined with large dollops of product promotion.
The episode was clearly developed in close cooperation with Apple, and it offers a look at how the company sees its own history with accessibility. But marketing content aside, the episode is a good listen, especially in the first half, when it traces the history of accessibility initiatives at Apple, all the way back to Steve Jobs taking that first Mac out of the bag in 1984, and letting it have its say.
Apple traces its accessibility work back to 1985, and while the podcast skips the dark years in the 90s and early aughts when little was happening on the accessibility front, there is plenty of compelling history here, from the introduction of VoiceOver on the Mac to Stevie Wonder thanking Steve Jobs in 2011 for making iPhones and iPads accessible. That had happened two years before, as I recounted in my documentary 36 Seconds That Changed Everything: How the iPhone Learned to Talk.
Apple folks featured in the Twenty Thousand Hertz episode include senior accessibility director Sarah Herrlinger, VP for Sensing and Connectivity Ron Huang, director of Apple Watch product marketing Deidre Caldbeck, and AirPods marketing director Eric Treski. There’s also archival footage of Tim Cook, giving a version of his famous comment about Apple’s support for accessibility being unrelated to ROI.
The second half of the show focuses on hearing-related accessibility features, including Sound Recognition and, of course, AirPods. Here, the vibe is more marketing-heavy, with stories of lives changed (they do ring true) and recitations of features. But because a podcast has sound-rich audio in its toolbox, the producers are able to demonstrate how AirPods features actually change the sound of the world around the wearer, whether it’s Live Listen or Conversation Awareness.
Some people can leave well enough alone, but not us nerds! You may recall a few years ago, I detailed my creation of a smart On Air sign using a color e-ink display.
While I feel that project was largely a success, one major obstacle has kept it from being as useful as I’d hoped: the long battery life.
What? How is long battery life a problem, I can hear you asking.1 The issue is that the battery life is just long enough that I forget to recharge it until I happen to check and see that it’s dead, invariably right before I start recording a podcast.
Now, I could just set a reminder to go off every few days to plug in the sign and recharge it, but simple solutions aren’t how we ended up here in the first place, are they? Time for some OverEngineering™!
The idea that came to me one day was sheer elegance in its simplicity: what if I built a system that let me know what the current battery level was so that I could charge it when it was getting low but before it had died?
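Just to give you a flavor of what I was imagining, here’s a bare-bones sketch. Everything in it is a placeholder: pretend the sign can read its own battery percentage via a hypothetical read_battery_percent() helper and report the number somewhere I’ll actually see it. None of this is the sign’s real code.

#!/usr/bin/env python3
# Bare-bones sketch of the idea: report the sign's battery level so I can
# charge it *before* it dies. Everything here is hypothetical placeholder
# code, not the actual On Air sign firmware.
import urllib.request

LOW_BATTERY_THRESHOLD = 20  # percent

def read_battery_percent() -> int:
    """Hypothetical helper; a real version would read the battery voltage."""
    return 37

def report(percent: int) -> None:
    # Push the reading somewhere visible (placeholder URL).
    urllib.request.urlopen(f"https://example.com/on-air-battery?pct={percent}")

if __name__ == "__main__":
    pct = read_battery_percent()
    report(pct)
    if pct < LOW_BATTERY_THRESHOLD:
        print("Time to plug in the On Air sign.")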
So, last month, in a fit of pique over the continual increases in the cost of a Netflix plan, I decided to cancel Netflix.
This is a big deal because I’ve been a Netflix subscriber since the very beginning—back when it was all DVDs and no streaming. I checked in with my kids because I was worried Netflix was more popular with the youths than with people like me. But even my kids didn’t care and thought there wasn’t much to watch on Netflix.
So, I canceled it, figuring I’d pick it back up again on a case-by-case basis. “When there’s something to watch, I’ll resubscribe for a month, watch what I want, and then cancel again,” I thought.
This month, though, I already had some reasons to watch. Everybody’s talking about “Adolescence,” and I hadn’t seen the newest iteration of John Mulaney’s live talk show—which means I wasn’t able to talk about it on Downstream last week, which felt like a mistake.
(It turns out that even when there’s nothing you think you want to watch on Netflix, there’s still some stuff you sort of want to watch on Netflix.)
So I figured I’d resubscribe for the month—but I’d stick it to Netflix by opting for the ad plan, which is only $8 a month. Netflix insists that the ads on that plan are “short and seamless” and “won’t interrupt the action”—and it really pushes that ad plan hard because it makes a lot of money from it. (The price hikes for the ad-free plans are largely because the ad plan has such a high average revenue per user that Netflix has needed to jack up the price of the ad-free plans so that they earn similar amounts of money per user.)
My plan did not survive even the briefest of contact with the enemy.
The first thing my wife and I watched on our reactivated Netflix was episode one of “Adolescence,” a tense, serious, dark drama about children and crime. Each episode is one continuous shot—a technical marvel. But the flow was broken numerous times by bright, shiny, noisy ads, disrupting the mood repeatedly with 30 to 45-second interruptions.
I feel sympathy for whomever Netflix is paying to tag content for the best places to insert ads. There are no clear act breaks in “Adolescence,” and the fact that it’s one continuous shot means that literally any interruption is going to be incredibly disruptive to the content of the show. It was never intended to be shown with advertising inserted mid-stream.
Netflix programmed four separate ad breaks.
Oh yeah?
The truth is, outside of live sports, I haven’t really watched a commercial in 25 years. Back in 2000, I got my first TiVo, and since then, ad skipping has been the norm for me. I’ve been fast-forwarding through ads or, in some cases, clicking a button to skip ads automatically ever since.
So, to sit there in 2025, watching a countdown timer tick down during a loud, colorful ad, all while in the middle of a grim, dark, contemplative show… was unbearable. After 25 years of mostly ad-free viewing, I am irretrievably broken. I just couldn’t take it.
While the ads played on, I began creating a thought experiment: There’s a $10 difference between the ad and ad-free plans. If Mr. Netflix (he wears a top hat) came to my house and said, “Jason, I’ve got a great deal for you. I’m going to pay you $120 a year, and all you have to do is watch ads while you watch Netflix,” what would I do? When I started thinking about it, I thought it might be an interesting intellectual question. What would I accept in exchange for having Mean Mr. Netflix beam ads into every show I watch?
It turns out that whatever my price is, it’s a whole lot more than $120 a year. The next day, I upgraded back to the $18 ad-free plan.
The big news of the last month has been Apple’s inability to ship its most anticipated and most ambitious Apple Intelligence features on time. Granted, this is a bit of a black eye for the company, but it’s unlikely to hurt it in the long run.
Still, as the company’s Worldwide Developers Conference is fast approaching, there’s certainly an argument that Apple might want to take some of the pressure off by gently redirecting attendees to some other exciting announcements. Some have gone so far as to suggest that the rumored redesign of iOS 19 is just Apple trying to distract users and the media from its shortcomings in the AI realm.
Respectfully, I disagree. Not because Apple is above this, but because if the company really wanted to distract people from talking about Apple Intelligence, there are way better things it could announce.
In the interest of helping out a trillion-dollar corporation that definitely needs my help, here are just a few things that could show up in Apple’s keynote that would have people saying “Apple Intelliwhatnow?”
Announce Vision Pro 2: Remember back in 2023 when all eyes—sorry not sorry—were on Apple’s rumored headset? Ah, those were the days. Maybe it’s time for a pre-AI throwback to bring back the heady excitement of years gone by. I’m thinking a brand new Apple Vision Pro with an updated processor, lighter construction, and most importantly no screen on the outside to see a fake version of your eyes. Oh, also, as long as we’re going for shock and awe, it’ll cost half the price of the original Apple Vision Pro, and still nobody will buy it. And isn’t that exactly the kind of bananas decision that’s going to get everyone’s attention?
Revive AirPort: Apple abandoned the router game back in 2018, but there’s no time like the present to dig up the corpse of a beloved product line and reanimate it. Really, there’s nothing consumers love more than networking hardware—they’re downright passionate about it, at least when they can tell the difference between a router and a modem. Which one connects to my TV again?
Ship AirPower: It says something that AirPort is only the second most mythical Apple product line that starts with “AirP.” Look, sometimes it just takes a little bit of time to perfect your hardware, polishing all those minor details that Apple likes to focus on, like chamfered edges and not catching on fire.
Update the Mac Pro: Of course the existence of the M3 Ultra Mac Studio means that a more powerful chip is just waiting in the wings for the Mac Pro. Even better, I hear Apple’s crack processor team of Bigfoot and Nessie have been real hard at work on it.
Redesign the Magic Mouse: I know Apple just finally switched the Magic Mouse to USB-C, but hear me out: that means a redesign would be that much more unexpec—wait, what now? Damn it, stay in your lane, Apple.
[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His next novel, the sci-fi adventure Eternity's Tomb, will be released in November 2026.]
Executive movements like this are not unusual in large companies and it’s usually a good thing when Apple promotes from within. So, congratulations to Apple’s Vanessa Trigub on her new role!
She started out at Apple as an intern around the launch of the original iPhone, and has since held roles in finance, mergers and acquisitions, and retail operations.
And that, dear reader, is when I crumbled to dust. “The iPhone launch?! But that was only… Dammit.”
Rumor rumble
Details of iOS 19 continue to drip out, enough to give some a very strong idea of what the next version of Apple’s mobile operating system will look like. And also enough for others to say they’re all wet. Bloomberg’s Mark Gurman says YouTuber Jon Prosser is wayyy off base.
As the old adage goes, when two Apple rumor mongers fight, the real winner is the viewing public. Please search for your own “Let them fight” GIF. Do you like the text low and tight? Standard? Dramatically appearing down the left? Dramatically appearing down the left with perspective? So many to choose from.
According to Gurman, the images that are “floating around” are based on “very old builds” or “vague descriptions,” and are lacking key features. Gurman says that we can “expect more from Apple in June.”
Well, where’s the fun in that? “Hey, heyyy, everyone! Let’s just wait and see what Apple announces!” Get out of here with that nonsense. We’ve got wild speculating and flights of rendering fancy to do.
It does seem that iOS 19 will feature at least some substantial changes to interface elements, giving the operating system a new look if not completely overturning, sorry, the apple cart.
How has no one used “The Apple Cart” as the name of a rumor site yet?
The shatter-curious Apple Watch
The rumor mill also turns for the Apple Watch and it brings sad news for fans of plastic (I assume they exist, I’m not checking). According to Gurman, it is looking less likely that Apple will produce a plastic Apple Watch SE because:
The design team doesn’t like the look, and the operations team is finding it difficult to make the casing materially cheaper than the current aluminum chassis…
If you’re lamenting the potential loss of fun, colorful, plastic Apple Watches for the kids, remember that Apple is perfectly capable of making aluminum Watches in cool colors and its current SE offerings are black, silver, and vaguely goldish.
OK, so plastic Apple Watches are off the table. What about ones made entirely of glass? Based, it seems, on an Apple patent filing, a Chinese leaker says the company could be working on a Watch that Jony Ive might love even more than one plated in gold.
Still, don’t hold your breath.
The Chinese leaker provided no evidence to back their claim.
So what else might be coming to Apple Watches? Two words: Ay. Eye.
Apple is exploring the idea of adding cameras and visual intelligence features to its smartwatch…
So, instead of having to take your phone out to point it at something and have Visual Intelligence maybe tell you what it is or why it’s significant, you can just point your wrist at it. There’s some utility to that but, honestly, as far as future Apple offerings go, I kind of wish the company would actually sell the Lumon Terminal Pro that it coyly added to the Mac options on its online store.
I have a lot of feelings about numbers.
[John Moltz is a Six Colors contributor. You can find him on Mastodon at Mastodon.social/@moltz and he sells items with references you might get on Cotton Bureau.]
Nearly three years ago, I wrote AppleScript: Shortcuts Bridge or Crutch?, questioning whether accessing AppleScript via Shortcuts on the Mac was a feature to be celebrated or a red flag, fearing that Apple would use the integration to postpone or never release many of the system-level actions that were missing from Shortcuts’ debut on the Mac.
As I put it then, “if Shortcuts is to become the default way to automate tasks on the Mac, there needs to be steady, yearly progress to make macOS and its default system apps as Shortcuts-friendly as possible. There’s a role for AppleScript to play in Shortcuts that won’t go away anytime soon, but not as a way to fill the potholes left by missing Shortcuts actions.”
Shortcuts’ progress on the Mac has been anything but steady and yearly.
A few days ago, while writing my Podcast Notes update, I realized that I had (inadvertently?) created an automation that begins with a Stream Deck keypress that executes a Keyboard Maestro macro that kicks off a JavaScript script in Audio Hijack that runs an AppleScript applet that executes a Shortcuts shortcut. In recent days I’ve also edited shortcuts that run Python and AppleScript scripts, including some where the shortcut is really nothing more than a Mac UI-friendly wrapper around a bare script, much in the same way you can use Automator as a simple wrapper around AppleScript scripts.
That all these things are possible on the Mac is amazing, and it’s a testament to how flexible and powerful the Mac can be. But it also says something quite profound about how little progress Apple has made with Shortcuts on the Mac (or in general) in the last few years. (And of course, all these workarounds fail on iOS entirely.)
Maybe the drive toward App Intents will help make Shortcuts more powerful and less reliant on tools like AppleScript, Keyboard Maestro, and the rest. But even that isn’t enough, since the Shortcuts app is way too rickety and limited. Just the other day, Dan Moren said to me, “I was working on a shortcut and I needed an if-else-if statement,” and we both began laughing because conditionals are just so bad in Shortcuts.
Apple gave itself a lot of leeway by declaring that Shortcuts was the “beginning of a years-long process” to make Shortcuts the “future of automation on the Mac.” But that was almost four years ago. There’s not a lot of leeway left for me to give.
My thanks to Footnote Accessories and FlashFinder for sponsoring Six Colors this week.
FlashFinder is the first keychain flashlight that supports Find My. Wherever you put it—your keychain, your backpack, whatever—you’ll be able to find it later. And the flashlight itself is six times brighter than the flash on your phone, so you’ll always be able to find your way.
It’s discreet, USB-C rechargeable, and it’s a multitasker. I love a multitasker.
This is where we, the people, are apparently failing AI. Because in addition to being humans with jobs and social lives and laundry to fold and art to make and kids to raise, we should also learn how to tiptoe around the limitations of large language models that may or may not return accurate information to us.
Apple, [NYT tech columnist Kevin] Roose says, should keep pushing AI into its products and just get used to the idea that those features may be unpolished and a little too advanced for the average user.
And again, respectfully, I would ask: To what end?
Astute take from Morrow that sums up a lot of the issues with AI, specifically where it falls short. This is yet another case of people adapting to machines, when the point is that our technology should adapt to us.
The thesis of the piece is not about excusing Apple’s AI missteps, but about zooming out to take a look at the bigger picture of why AI is everywhere, and making the argument that maybe Apple is well-served by not necessarily being on the cutting edge of these developments.1
One minor quibble is that Morrow references the contentious “Crush” commercial as “one of Apple’s early ads for its AI”. That ad was, of course, for the iPad, and was released (and subsequently pulled) in May, a month before Apple Intelligence debuted at WWDC. ↩
It’s a personal floating holiday for Jason, so we get meta and discuss weird audience reactions, the close-knit Apple community, and the arrival of a WWDC date. [More Colors and Backstage members also get an update on some of our various home tech projects.] Tune in to Upgrade on Monday for much more of Jason and Dan!
Now we know that June 9 is the day Apple will kick off WWDC 2025. I’ve been covering WWDC since before Steve Jobs came back to Apple, and this year is shaping up to potentially be the most interesting, and certainly most dramatic, version of Apple’s most important event of the year.
If you didn’t know already, WWDC is important for more than just developers. It’s literally Apple’s New Year’s Day, the day that the company rolls out its plans for all of its platforms for the next year. With the impending release of iOS 18.4, macOS 15.4, and the rest, we are at the tail end of last year’s cycle. Work at Apple is now shifting even more to the stuff we’ll be using over the next year, beginning with the announcement of new features and new decisions on June 9.
Last year’s event, which featured the rollout of Apple Intelligence—including, yes, some features that never ended up shipping—was certainly dramatic. But at the same time, we all pretty much knew what was coming—Apple was desperate to be seen as a player in the AI game, and so it was going to blow that horn as loud as it could. This year’s fascination is more subtle: What now? Doubling down on Apple Intelligence? Apologizing for last year’s, er, overexuberant promises? Changing direction? Staying the course? The last year has flipped the table on Apple’s usually conservative and careful platform-building plans. This year’s possibilities are wide open.
A household appliance we’d improve with modern tech, a ridiculous tech solution to a silly problem we’ve had, the iOS or app interface we most want redesigned in iOS 19, and a pie-in-the-sky Apple Intelligence feature that would truly excite us.
The bottom row of my Stream Deck is devoted to Podcast Notes.
Three years ago, Dan and I collaborated on a project that allowed us to dynamically create editing notes on our Macs while recording a podcast. A recent query about my process made me realize that it would probably be worth revisiting the Podcast Notes project. After all, when we were first writing about it, a single podcast session could be educational enough to make me completely revamp my approach. Now we’ve had three years of podcasts.
First, a recap: We were seeking a way to take notes during a podcast without writing time codes down manually on paper like some of our friends. The desired end result was a text file featuring time codes next to notes about what needed to happen at that particular time, whether it was removing cross-talk or a bad word, or even just noting a change in topic that merited a new chapter. The goal was something that would have minimal cognitive overhead—unlike writing a time code down on paper. We ended up with a Shortcut that did the job by looking at the creation date of the active recording file, extrapolating a time code, and appending the time code and any passed input to a text file on the Desktop.
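If you’re curious about the mechanics, here’s a rough Python sketch of that logic, not the actual shortcut; the file paths and note format below are placeholders. It reads the creation date of the recording file, figures out how far into the session we are, and appends that time code plus the note text to a file.

#!/usr/bin/env python3
# Rough sketch of what the Podcast Notes shortcut does, in Python.
# The paths and note format here are placeholders, not the real ones.
import os
import sys
import time
from datetime import timedelta

RECORDING = os.path.expanduser("~/Music/Audio Hijack/current-session.wav")
NOTES_FILE = os.path.expanduser("~/Desktop/podcast-notes.txt")

def elapsed_timecode(recording_path: str) -> str:
    """Extrapolate a time code from the recording file's creation date."""
    created = os.stat(recording_path).st_birthtime  # macOS creation date
    elapsed = int(time.time() - created)
    return str(timedelta(seconds=elapsed))  # e.g. "0:42:17"

def append_note(note: str) -> None:
    with open(NOTES_FILE, "a", encoding="utf-8") as f:
        f.write(f"{elapsed_timecode(RECORDING)}\t{note}\n")

if __name__ == "__main__":
    # e.g. `python3 podcast_note.py crosstalk`
    append_note(" ".join(sys.argv[1:]) or "chapter")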
Turns out that Dan isn’t editing as many podcasts as he used to and so he doesn’t use our Podcast Notes shortcut much at all. I don’t edit as many podcasts, either, but that’s only made Podcast Notes that much more valuable, since I want to provide notes to my editor. I think a few of our friends are also using the Shortcut now, including my Upgrade co-host Myke Hurley—and believe me, getting Myke to forsake pen and paper feels like a real win to me.
So what has changed in three years? Not a lot, but little rough edges have been smoothed out. I’ve found that I really only need three pre-wired buttons on my Stream Deck: Chapter Marker, Crosstalk, and Cut This. I have a button that lets me choose from six different preformatted events, but I almost never use it. Those three types of note cover almost every eventuality.
They’re also all I need because of the other major change in my workflow: I always keep the note file open in BBEdit. Not only does this give me security—I can see my notes going in as I press the button—but it allows me to mark up the text file with added detail. Here’s how that generally works: When an event happens, I press the button and a new time code is added to my notes file. This often happens when I’m talking, so the last thing I want to do is type one thing while I’m saying something else! That’s a recipe for bad podcasting.
But when I stop talking and my co-host or panelist takes over, I will frequently switch to BBEdit and add some notes in, such as the name of the chapter in question or more detail about the thing that needs to be cut. This approach also means I don’t really need more buttons for less common events—I can just press one of my buttons, like “Cut,” and then edit the entry to say “background noise” or “swear” instead. BBEdit is very savvy to changes to underlying files it’s editing, so the BBEdit window updates with the new entry almost immediately.
Notes from this week’s episode of Upgrade
I’ve found a couple of ways to launch my shortcut in Stream Deck. The most reliable method is to use Keyboard Maestro, which offers an Execute Shortcut command that allows you to pass input along. I’ve got a Keyboard Maestro macro for each different text string that I’m passing, each bound to a different button. That said, it’s just as valid to bind the button to a shell command like shortcuts run "Podcast Note" -i "crosstalk". As a result, you can pretty much connect the shortcut to anything you can think of—Stream Deck is just the system that has clicked for me.
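And if you’d rather skip Keyboard Maestro entirely, that same shell command is easy to wrap from anywhere. Here’s a minimal Python sketch, with the shortcut name and note text just as examples, that calls the macOS shortcuts command-line tool and passes a note along:

#!/usr/bin/env python3
# Minimal sketch: trigger the "Podcast Note" shortcut from any script
# by shelling out to the macOS `shortcuts` command-line tool.
import subprocess
import sys

def send_note(text: str, shortcut: str = "Podcast Note") -> None:
    # Equivalent to: shortcuts run "Podcast Note" -i "crosstalk"
    subprocess.run(["shortcuts", "run", shortcut, "-i", text], check=True)

if __name__ == "__main__":
    send_note(sys.argv[1] if len(sys.argv) > 1 else "crosstalk")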
Because of my integration with BBEdit, my original vision of pressing buttons and not worrying about the resulting text file has sort of vanished. My notes file is pretty much always open when I’m recording a podcast, and I’ll annotate it when I get a chance. But the time codes are always accurate, because they’re recorded the moment I press the button. That’s important.
Using BBEdit led to a pretty weird issue with my workflow, though: if I edit the file in BBEdit and fail to save it afterward, the next time the shortcut runs it will overwrite the existing file, causing a conflict in BBEdit, which will force me to choose between losing my edits or losing the time code I just tried to save. As a result, I’ve added this AppleScript script to the very top of my Shortcut:
on run {input, parameters}
    tell application "System Events"
        -- Only bother if BBEdit is actually running.
        if (name of processes) contains "BBEdit" then
            tell application "BBEdit"
                -- Save any open document whose name matches the notes
                -- filename passed in by the shortcut (item 1 of input),
                -- so the append that follows won't cause a conflict.
                set docList to every document
                repeat with doc in docList
                    if name of doc contains (item 1 of input) then
                        save doc
                    end if
                end repeat
            end tell
        end if
    end tell
end run
This script is pretty simple. It checks to see if BBEdit is running, and if it is, it searches for an open document that matches the filename format that Podcast Notes generates. If it finds one, it tells BBEdit to save that file, at which point the rest of the shortcut runs and appends a new entry.
While writing this article, I’ve realized that I manually open the note file every time after creating it, in order to audit it in BBEdit. As a result, I’ve just appended another AppleScript script at the very end of the Shortcut that checks if the note file is currently open in BBEdit and opens it if it isn’t! Ah, there are always more things to automate.
In any event, the Podcast Note shortcut is pretty stable, works really well, and has been a huge boost to my productivity. I pass notes to Jim Metzendorf and Chip Sudderth, who edit the audio and video versions of Upgrade, using this approach. (And Myke does likewise when he’s not on paternity leave.) I also pass these notes to Steven Schapansky, who edits The Incomparable and Downstream. And of course, even if I’m editing the podcast myself, it’s sure a big help to have a list of chapter breaks and edit points at the ready rather than having to guess where the edits might be based on memory.
Finally, an easter egg for those who have read this far. Recently I recorded a podcast late at night, the fourth Incomparable member special in three days. I was tired, and when one of my co-hosts ripped out a swear, I didn’t bother to press a note button. I didn’t even have a notes file open! I had assumed it would all be an easy edit.
So what do you do if you don’t have a podcast note? What I did was use Whisper to transcribe the podcast into subtitle format (which includes time codes), and then searched for the swears so that I could snip them out. I even built a Python script that will check a subtitle file against a dictionary of bad words and automatically output a list of time codes to be bleeped. I don’t know how often this will come up in the future, but it was a fun little project.
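Here’s a simplified sketch of that idea; my actual script is a bit different, and the word list and subtitle parsing below are deliberately bare-bones. It reads a Whisper-generated .srt file, checks each cue’s text against a small bleep list, and prints the time codes of any matches:

#!/usr/bin/env python3
# Simplified sketch: scan a Whisper-generated .srt subtitle file for
# words on a "bleep list" and print the time codes of matching cues.
import re
import sys

BLEEP_WORDS = {"darn", "heck"}  # stand-ins for the actual word list

def cues(srt_text: str):
    """Yield (timecode, text) pairs from an SRT file."""
    for block in re.split(r"\n\s*\n", srt_text.strip()):
        lines = block.splitlines()
        if len(lines) >= 3 and "-->" in lines[1]:
            yield lines[1], " ".join(lines[2:])

def main(path: str) -> None:
    with open(path, encoding="utf-8") as f:
        srt = f.read()
    for timecode, text in cues(srt):
        words = set(re.findall(r"[a-z']+", text.lower()))
        if words & BLEEP_WORDS:
            print(f"{timecode}  {text}")

if __name__ == "__main__":
    main(sys.argv[1])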