Six Colors

Apple, technology, and other stuff


By Jason Snell

Checking in on Podcast Notes

The bottom row of my Stream Deck is devoted to Podcast Notes.

Three years ago, Dan and I collaborated on a project that allowed us to dynamically create editing notes on our Macs while recording a podcast. A recent query about my process made me realize that it would probably be worth revisiting the Podcast Notes project. After all, when we were first writing about it, a single podcast session could be educational enough to make me completely revamp my approach. Now we’ve had three years of podcasts.

First, a recap: We were seeking a way to take notes during a podcast without writing time codes down manually on paper like some of our friends. The desired end result was a text file featuring time codes next to notes about what needed to happen at that particular time, whether it was removing cross-talk or a bad word, or even just noting a change in topic that merited a new chapter. The goal was something that would have minimal cognitive overhead—unlike writing a time code down on paper. We ended up with a Shortcut that did the job by looking at the creation date of the active recording file, extrapolating a time code, and appending the time code and any passed input to a text file on the Desktop.
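The Shortcut's core logic is simple enough to sketch outside of Shortcuts. Here's a hypothetical Python equivalent—the function and file names are mine, not part of the actual Shortcut:

```python
import os
import time

def append_note(recording_path, notes_path, note):
    """Derive a time code from the recording file's age and append it,
    with the note text, to the notes file."""
    st = os.stat(recording_path)
    # Creation date on macOS; fall back to ctime on other platforms
    created = getattr(st, "st_birthtime", st.st_ctime)
    elapsed = int(time.time() - created)
    h, rem = divmod(elapsed, 3600)
    m, s = divmod(rem, 60)
    timecode = f"{h:02d}:{m:02d}:{s:02d}"
    # Append rather than overwrite, so notes accumulate during a session
    with open(notes_path, "a", encoding="utf-8") as f:
        f.write(f"{timecode}  {note}\n")
    return timecode
```

Call it with the path to the in-progress recording and a note like "crosstalk," and it appends a line such as "00:42:17  crosstalk" to the notes file.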

Turns out that Dan isn’t editing as many podcasts as he used to and so he doesn’t use our Podcast Notes shortcut much at all. I don’t edit as many podcasts, either, but that’s only made Podcast Notes that much more valuable, since I want to provide notes to my editor. I think a few of our friends are also using the Shortcut now, including my Upgrade co-host Myke Hurley—and believe me, getting Myke to forsake pen and paper feels like a real win to me.

So what has changed in three years? Not a lot, but little rough edges have been smoothed out. I’ve found that I really only need three pre-wired buttons on my Stream Deck: Chapter Marker, Crosstalk, and Cut This. I have a button that lets me choose from six different preformatted events, but I almost never use it. Those three types of note cover almost every eventuality.

They’re also all I need because of the other major change in my workflow: I always keep the note file open in BBEdit. Not only does this give me security—I can see my notes going in as I press the button—but it allows me to mark up the text file with added detail. Here’s how that generally works: When an event happens, I press the button and a new time code is added to my notes file. This often happens when I’m talking, so the last thing I want to do is type one thing while I’m saying something else! That’s a recipe for bad podcasting.

But when I stop talking and my co-host or panelist takes over, I will frequently switch to BBEdit and add some notes in, such as the name of the chapter in question or more detail about the thing that needs to be cut. This approach also means I don’t really need more buttons for less common events—I can just press one of my buttons, like “Cut,” and then edit the entry to say “background noise” or “swear” instead. BBEdit is very good about noticing changes to the underlying files it’s editing, so the BBEdit window updates with the new entry almost immediately.

Notes from this week’s episode of Upgrade

I’ve found a couple of ways to launch my shortcut from Stream Deck. The most reliable method is to use Keyboard Maestro, which offers an Execute Shortcut command that allows you to pass input along. I’ve got a Keyboard Maestro macro for each different text string that I’m passing, each bound to a different button. That said, it’s just as valid to bind the button to a shell command like shortcuts run "Podcast Note" -i "crosstalk". As a result, you can pretty much connect the shortcut to anything you can think of—Stream Deck is just the system that has clicked for me.

Because of my integration with BBEdit, my original vision of pressing buttons and not worrying about the resulting text file has sort of vanished. My notes file is pretty much always open when I’m recording a podcast, and I’ll annotate it when I get a chance. But the time codes are always accurate, because they’re recorded the moment I press the button. That’s important.

Using BBEdit led to a pretty weird issue with my workflow, though: if I edit the file in BBEdit and fail to save it afterward, the next time the shortcut runs it will overwrite the existing file. That causes a conflict in BBEdit, forcing me to choose between losing my edits and losing the time code I just tried to save. As a result, I’ve added this AppleScript script to the very top of my Shortcut:

on run {input, parameters}
    tell application "System Events"
        -- Only talk to BBEdit if it's actually running
        if (name of processes) contains "BBEdit" then
            tell application "BBEdit"
                -- Save any open document whose name matches the note
                -- file, so appending to it on disk won't clobber
                -- unsaved edits
                set docList to every document
                repeat with doc in docList
                    if name of doc contains (item 1 of input) then
                        save doc
                    end if
                end repeat
            end tell
        end if
    end tell
end run

This script is pretty simple. It checks to see if BBEdit is running, and if it is, it searches for an open document that matches the filename format that Podcast Notes generates. If it finds one, it tells BBEdit to save that file, at which point the rest of the shortcut runs and appends a new entry.

While writing this article, I’ve realized that I manually open the note file every time after creating it, in order to audit it in BBEdit. As a result, I’ve just appended another AppleScript script at the very end of the Shortcut that checks if the note file is currently open in BBEdit and opens it if it isn’t! Ah, there are always more things to automate.

In any event, the Podcast Note shortcut is pretty stable, works really well, and has been a huge boost to my productivity. I pass notes to Jim Metzendorf and Chip Sudderth, who edit the audio and video versions of Upgrade, using this approach. (And Myke does likewise when he’s not on paternity leave.) I also pass these notes to Steven Schapansky, who edits The Incomparable and Downstream. And of course, even if I’m editing the podcast myself, it’s sure a big help to have a list of chapter breaks and edit points at the ready rather than having to guess where the edits might be based on memory.

Here’s a link to the current Podcast Note shortcut if you’re interested in viewing it, using it, or adapting it.

Finally, an easter egg for those who have read this far. Recently I recorded a podcast late at night, the fourth Incomparable member special in three days. I was tired, and when one of my co-hosts ripped out a swear, I didn’t bother to press a note button. I didn’t even have a notes file open! I had assumed it would all be an easy edit.

So what do you do if you don’t have a podcast note? What I did was use Whisper to transcribe the podcast into subtitle format (which includes time codes), and then searched for the swears so that I could snip them out. I even built a Python script that will check a subtitle file against a dictionary of bad words and automatically output a list of time codes to be bleeped. I don’t know how often this will come up in the future, but it was a fun little project.



By Dan Moren

WWDC 2025 is kicking off June 9

WWDC 2025 logo

The cycle’s about to begin anew: Apple announced on Tuesday that its 2025 Worldwide Developers Conference will run June 9-13. The event, including keynote and sessions, will be available for free online, along with a limited in-person component at Apple Park.

As with the last several years, in-person attendance will be by lottery, with current Apple Developer Program or Apple Developer Enterprise Program members eligible to apply for a chance to join. In addition to viewing the keynote and Platform State of the Union, attendees will be able to get one-on-one and group lab time with Apple employees as well as tour the campus and participate in some other “special activities.”

Applications are open for just over a week, until April 2. Fifty “Distinguished Winners” of the Swift Student Challenge, who will be announced this Thursday, March 27, will also be automatically invited to the event.

Apple’s widely expected to announce the latest updates to its various software platforms, including iOS 19 and macOS 16, at this year’s event, but one big question mark hanging over the conference will be how the company handles the recent delay of Apple Intelligence features.

[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His next novel, the sci-fi adventure Eternity's Tomb, will be released in November 2026.]


Stephen Hackett joins Jason to discuss A.I. fallout, EU moves to expand device interoperability, Jason’s recent shootout of TV streaming boxes, and more.


by Jason Snell

AirPods Max gets lossless, low-latency, and an analog cable

Apple Newsroom:

Next month, a new software update will bring lossless audio and ultra-low latency audio to AirPods Max, delivering the ultimate listening experience and even greater performance for music production. With the included USB-C cable, users can enjoy the highest-quality audio across music, movies, and games, while music creators can experience significant enhancements to songwriting, beat making, production, and mixing.

This is the other shoe dropping after the AirPods Max were updated with a USB-C port last year, but without the corresponding USB-C-to-analog-jack cable that enabled ultra-low-latency and lossless audio input. (The earlier Lightning model could use a Lightning-to-analog-jack cable to do this.)

Apple says that the included USB-C cable can now be connected to devices for a pure digital lossless experience, and is offering a $39 analog-to-USB cable to enable that connection from analog sources. The company also said that a forthcoming software update will enable 24-bit, 48 kHz lossless audio on AirPods Max.

On the one hand, this is good news for AirPods Max users. On the other hand, it seems a little weird that all of this wasn’t just part of the product announcement back in September.


MLB is trying to open up local streaming, but it doesn’t solve the problem of reaching casual fans. Also: a weird café in Korea and our TV picks. (Downstream+ listeners also get: British TV vs. streamers, deep programming thoughts, and Looney Tunes.)


By John Moltz

This Week in Apple: Executive dysfunction

John Moltz and his conspiracy board. Art by Shafer Brown.

Apple shuffles executives over Siri delays, there’s a small problem with services revenue and, hey, remember the European Commission? Because they remember Apple.

Hey, Siri, play “Musical Chairs”

Apple is making big moves to address the company’s failure to deliver advanced Siri.

“Apple Shuffles AI Executive Ranks in Bid to Turn Around Siri”

Or, well, moves at least. Look, at least one person is going to have to get new business cards. You think those grow on trees?

The Siri group has been moved out from under Apple’s head of AI, John Giannandrea; it now reports to Mike Rockwell, formerly the head of the Vision Pro group.

According to Bloomberg’s Mark Gurman:

Chief Executive Officer Tim Cook has lost confidence in the ability of AI head John Giannandrea to execute on product development…

Not so much that he’s going to fire him, though. Giannandrea will stay on at Apple and retain responsibility for “research, testing and technologies related to AI.” Except for Siri, which is not AI. Get it?

If you do, fax me the answer because I don’t get it.

Many in the Apple community seemed to be calling for a firing over the failure to deliver on conversational Siri, something Apple had announced and even advertised as imminent. Now Apple is the target of a lawsuit claiming false advertising over the issue.

“Apple Facing False Advertising Lawsuit Over Apple Intelligence Delay”

Poor Bella Ramsey. First all those zombies and now this. It’s kinda hard to argue, though. And Apple pulling the ad sure doesn’t help it seem not false.

A reorg is not a firing, but it’s something.

Is it something that will do anything? Who knows? Might as well ask Siri.

Please. My services revenue. It’s very sick.

Look, I don’t want to hear anyone complaining about price increases on Apple TV+ anymore. Apple is barely scraping by!

“Report: TV+ Losing $1 Billion Annually as Apple Services Falter”

Not to worry, though. It’s all part of the plan.

Apple’s initial business plan for Apple TV+ predicted losses of between $15 billion and $20 billion over its first decade.

Hey, that’s my business plan! They stole my business plan!

Apple’s services revenue is growing quickly but apparently “other than iCloud+, Apple’s other services are said to be in poor health.” So, if you are shedding any iCloud tears, know that it’s for a good cause.

Apple News+, Fitness+ and Apple Arcade are said to be struggling with low usage and profits.

First of all, have you seen the news recently? Yuck. Who wants that? Second, I have been saying for years that Fitness+ would be a lot more popular with cheat codes. And, finally, just give up on Apple Arcade.

Commission emission

The EC is back in EC 2: Electric Boogaloo!

EC Mandates Apple Provide iOS Interoperability on Several Fronts

Honestly this is probably more like the Fast & Furious franchise now. What are we up to? 12? 18? Has Vin Diesel jumped a new App Store regulation over a helicopter yet? I’ve never seen a Fast & Furious movie.

The EC is asking for quite a lot here, mostly related to making third-party peripherals have the same capability to work with iPhones that Apple peripherals do. For its part, Apple says — and this is a direct quote — “Uhnn! But we don’t want toooooo!”

OK, that’s more a vibes quote than a direct quote.

Dan explains:

As another part of this proceeding, businesses will have the right to ask Apple about how various features work and then submit a request for interoperability with their products.

There is a fine line between leveling the playing field and asking Apple to do a bunch of work for its competitors. It seems like Apple may find out just where that line is.

[John Moltz is a Six Colors contributor. You can find him on Mastodon at Mastodon.social/@moltz and he sells items with references you might get on Cotton Bureau.]


Streamer box battle and Apple TV

Jason spent time with streamer boxes to get a better sense about where Apple TV succeeds and fails. [More Colors and Backstage members also get our nearly hourlong monthly Q&A.]



By Joe Rosensteel

Wish List: Siri, Spotlight, and a unified search experience

A screenshot showing the Google search results for scanning and sending a document with Mail. There's a box with the AI summary from Gemini, and then the relevant Apple Support document right underneath.
Maybe this is why Apple executives want Gemini so badly?

There’s a lot of talk recently about Siri being behind the competition. Siri often can’t find what you’re looking for, or what you want to know, and there’s no telling when it might be able to. Many of the requests we make to Siri are basically searches, and when we are unhappy with Siri we turn to search on the web for answers, or in the case of local files or music, we just manually dig it up ourselves.

So here’s a thought for those who might suddenly find themselves in charge of Siri: Search is a foundational element of smart assistants, and the current state of Apple’s search technologies leaves much to be desired.

While all of today’s web search engines place sparkly and unreliable AI-synthesized answers above everything else, they still generally deliver solid search results underneath. Refining Siri without bolstering the foundation is a recipe for disaster.

Using Siri for search

Apple’s recent announcement that it’s delaying several AI features began with a self-serving sentence about how much people love Siri. You guys know Siri. Among its touted new, revolutionary features was “type to Siri,” a feature that’s not really new (you’ve been able to do that via an accessibility setting for quite a while), but is not a bad idea at all. The problem is that I find myself typing to Siri like I would enter text in a search box. Word choice has a huge impact.

This is inferior to just opening a web browser and typing into a good ol’ fashioned search box. First of all, if you want to ask Siri how to do something, you have to prepend “how” to the request or it might treat your request as something to act on. I also don’t have to worry about a web search engine picking up on a keyword like “email” and trying to compose an email while it ignores the rest of my question just because I didn’t prepend “how.”

Even when Siri parses your words correctly, it’s really that focus on attempting to provide a single result, or perform a single action, that makes it less useful. Like I said at the top, a major factor in the usefulness of any search engine is that it gives you multiple possible matches for what you entered. It’s a powerful tool because the words you used may not exactly match the title of an Apple Support page, but they can be close enough that the right result still surfaces.

What if you don’t happen to know the names of all the features for a task you want to do? Let’s say you need to update or change your credit card info. If I ask, “How do I change my credit card info?” (see the left of the three iPhone images, below), Siri tells me I can do that in the Contacts app (center, below).

Please don’t store your credit card info in the Contacts app. If I ask, “How do I change my payment information?” it’ll tell me to remove a HomePod (that I don’t even own) from the Home app (right, below).

three iPhone screens with confusing Siri output

I have to know the exact words for the three places in Settings where credit card information is stored in order to form a question precise enough that Siri product knowledge will reveal the results for each individual feature I ask for one at a time. If I knew enough to be that specific, then I wouldn’t need to ask.

Searching the web for the same generalized questions works like a charm, but I do have to provide the specific context of the platform I am inquiring about. That’s a key advantage of Siri—it knows the platform I’m on already. When I ask Siri on my iPhone, “How do I scan a document?” Siri is going to return a result relevant for iOS. Unfortunately, it’ll only be the instructions for “Scan Document” in the Notes app instead of all the places in iOS where you can invoke “Scan Document.”

That expectation of context can work against Siri when it doesn’t apply it correctly. A humorous example: If you’re on an Apple TV and say “How To Train Your Dragon” into your Siri remote, it will not show you the info for the movie like it would for many other titles; it will give you some training advice for your dragon. This is the same result you get on all Apple platforms, because no context is being used in this instance. Saying “Show me How to Train Your Dragon” (if you’re typing it, you need to title-case it or Siri will still give you dragon training advice) will display a list of the movies with that name.

A web search engine doesn’t have this issue, even though it doesn’t have context. It can interpret movie titles before trying to be literal with all the words in a request.

What about Spotlight?

Apple has another brand, Spotlight, that it uses as an umbrella for its various search technologies that return search results, but it’s mostly about finding stuff on your device.

It can’t do natural-language search, though—only Siri gets to do that. If you type a natural-language request into Spotlight, it’ll likely put a link to do a web search for your request at the top of the list of search results. It’s not going to parse it into movie, TV show, or song titles unless you happen to have those as files.

That’s a real shame, because it would fit right in with our expectations of searching on the web if we could do that kind of search in Spotlight. Sure, it can still bail to the web, or Siri, if you ask, “Who won the Super Bowl?” but not everything people want to request concerns general knowledge.

Spotlight does a lot of things better than Siri. It displays a ranked list of search results. It live-updates the search results as you continue to type and refine the thing you’re looking for. “Type to Siri” has to digest a complete request, process it, and perform an action or display a blurb.

These two technologies need to work together. Spotlight needs to be able to handle more natural-language requests. Siri needs to be able to display results when there are multiple possibly relevant matches for a request. We shouldn’t expect Siri, as the magic-sparkle box, to correctly interpret all meaning with no further action required. (Google buries the single-response option under its “I’m Feeling Lucky” label. Siri assumes we’re all feeling lucky, all the time.)

Improvements I’d like to find

Providing natural-language search can be done in parallel with improving Siri; it doesn’t stymie or dismiss the work of that team, and it provides both a pressure-release valve and support for whatever Siri is doing.

As a user, I’d like to be able to use natural-language search anywhere there’s a generic search box on an Apple platform, and have the results be predictable. The more context and scope the device can infer, the better. And offering options to perform different kinds of searches—of the Web, of the Spotlight index, you name it—wouldn’t hurt. That’s the flexibility of providing users with multiple, navigable results instead of a single magic outcome.

I shouldn’t have to turn to a third party like Google to ask about Apple’s platform, especially when Apple just shipped Siri’s product knowledge feature. Apple needs to improve Spotlight, integrate it better with Siri, and provide a more consistent search experience—with options!—across all its devices.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Jason Snell

Who’s the laggard? Comparing TV streamer boxes

If you’ve ever wanted your TV to be sponsored, Amazon has you covered.

Ever since I cut the cord, my only interface to the world of television has been my Apple TV. I think it’s a good product while also being well aware of every single thing it does wrong, or doesn’t do. tvOS needs a lot of attention. But the Apple TV 4K box itself is a powerful piece of hardware that runs pretty much every app you can throw at it—and in 4K HDR, no less.

So imagine my surprise when I read this from Bloomberg’s Mark Gurman: “Apple’s accessory lineup has been neglected of late… The Apple TV box — a laggard in the living room space — has gone without an update since 2022.”

Laggard! One of Steve Jobs’s favorite put-downs. But as someone who has tried a lot of different streamer boxes over the years, I couldn’t really understand what Gurman was trying to communicate here. Yes, the Apple TV box is a laggard when it comes to market share—it’s much more expensive than most of its competitors—but that doesn’t seem to be what Gurman’s saying. He is suggesting that, since the Apple TV hardware hasn’t been updated since 2022, it’s neglected and in desperate need of an update so it can stop being a laggard. I think?

I’m sure that Apple will eventually need to update the Apple TV with a processor more powerful than the A15 Bionic, but in my experience, Apple’s hardware has been vastly more responsive and useful than any other TV box I’ve tried, let alone the dog-slow embedded operating systems inside most TVs.

Gurman was probably just conflating the Apple TV’s low market share with the fact it hasn’t been updated in a few years. But regardless, I took this as an opportunity to buy three competing boxes—the $100 Google TV Streamer 4K, the $40 Fire TV Stick 4K Max, and the $95 Roku Ultra 2024—and compare them to the $129 Apple TV 4K.

At no point did Apple’s hardware feel slow or in need of an upgrade, though Google’s box felt just about as responsive. As for the rest of it, though, for a bunch of boxes designed to do the same thing, their interfaces and approaches are all quite different.

Continue reading “Who’s the laggard? Comparing TV streamer boxes”…



What we carry in our wallets, do we want dumb phones, how we feel about the new Pebble smartwatches, and whether we’d be interested in a cellular MacBook.


EC mandates Apple provide iOS interoperability on several fronts

Third-party smartwatches and wireless headphones are among the devices and technologies that Apple will have to provide better interoperability with, according to a ruling from the European Commission today.

The EC concluded the proceedings it launched last fall into Apple’s interoperability with third-party devices, and has identified several places that Apple needs to make changes.

…the specification decision on connected devices aims to enable, amongst others, the following improvements:

  • iPhone users will have enhanced possibilities to receive push notifications including pictures on their non-Apple smartwatch and reply to these notifications.
  • iPhone users will also be able to pair their non-Apple connected devices such as headphones and smartwatches more seamlessly and easily with the iPhone.
  • Non-Apple devices such as virtual reality headsets will benefit from better and faster data connections with the iPhone.
  • Developers will be able to integrate alternative solutions to Apple’s AirDrop and AirPlay services on the iPhone. As a result, iPhone users will be able to choose from different and innovative services to share files with other users and cast media content from their iPhones to TVs.

There is a lot to unpack there. The aforelinked document goes into more detail about not only what exactly is required, but also the timeframe on which the various features need to be put into place.

When these proceedings were announced, I wrote that they seemed “vague”, but it’s hard to argue that the findings here aren’t concrete. I can’t imagine that Apple is thrilled about them—and, indeed, in a statement provided to MacRumors, Apple says:

“Today’s decisions wrap us in red tape, slowing down Apple’s ability to innovate for users in Europe and forcing us to give away our new features for free to companies who don’t have to play by the same rules. It’s bad for our products and for our European users. We will continue to work with the European Commission to help them understand our concerns on behalf of our users.”

Some of these decisions do seem as though they would improve consumer choice, such as implementing audio switching for third-party headphones and allowing competitors’ smartwatches to show pictures received in notifications, but I am a little more skeptical when it comes to requiring Apple to open up features like AirDrop and AirPlay to competition.

In the case of the former, unless you are talking about allowing cross-platform file transfer with non-Apple platforms, like Android and Windows, I’m unsure what the benefit is. As for the latter, AirPlay is not only available on most other platforms, but there are also existing third-party standards like Google’s Cast.

As another part of this proceeding, businesses will have the right to ask Apple about how various features work and then submit a request for interoperability with their products. Which, again, feels well-meaning, but also feels like it could open up a can of worms where anywhere from dozens to thousands of vendors want to request specific interoperability.

As always with the DMA, there is no shortage of both upsides and downsides to the legislation. But with the EC requesting that some of these features be implemented in beta as early as this year, it certainly seems as though Apple has its work cut out for it.


by Jason Snell

Is there a chance the phone could bend?

This week on Upgrade, my guest John Siracusa and I pondered Mark Gurman’s report that the forthcoming iPhone Air won’t be as big as an iPhone Pro Max because of a fear of a return to the days of “bendgate.”

We speculated that the Internet’s favorite structural engineer, Dr. Drang, might be able to fact-check that claim for us. And he did:

Here’s a simple problem assigned to me by John Siracusa… early in the show he and Jason Snell were discussing the “thin iPhone” that’s rumored to be coming out this fall. Apparently, there will be only one size of this phone, and there’s speculation that there won’t be a “Plus” or “Max” size because of concerns about another Bendgate. John thought that an unlikely reason, which led to my homework.

If you’re interested in the math behind the stress we put on big flat thin phones, this is your moment.


By Dan Moren for Macworld

This year’s WWDC keynote will be must-see Apple TV

If we go by the usual schedule, we’re probably a little under three months out from this year’s Worldwide Developers Conference—chances are it’ll be announced before the end of this very month.

It promises to be quite the event.

That’s in no small part because Apple’s found itself in an unusual position this year: the company has already copped to the fact that some of the most impressive features from last year’s WWDC keynote—Siri’s ability to use personal context and take action in various apps—will not arrive on time, and we don’t know exactly when they might ship—if ever.

So when Tim Cook greets the in-person crowd and fires up the keynote video, what exactly are we going to see, and how’s it going to play?

Continue reading on Macworld ↦


John Siracusa joins Jason to discuss Apple interface redesigns, Apple’s ongoing AI disasters, the management challenges of Siri, and Jason’s reviews of the new MacBook Air and Mac Studio.


By John Moltz

This Week in Apple: The piñata of Apple rumors

John Moltz and his conspiracy board. Art by Shafer Brown.

Apple releases some new entertainment, Gruber takes on Gurman, and Apple gets set to move all those things from where you’ve come to expect them to be.

Here we are now, entertain us

Good news for you Vision Pro owners out there: the Vision Pro is still a product in Apple’s lineup! As such, the company continues to churn out new experiences for the platform, to distract you from the unbearable weight of existential ennui that is part and parcel of the human experience.

This week Apple released an immersive concert experience featuring Metallica, which many users have praised despite it featuring, you know, Metallica.

Another product still in Apple’s lineup is Apple Arcade. Such a household name is Apple Arcade that I had to look it up because I kept thinking of Game Center, although I did at least know that wasn’t right.

Apple announced some fantastic new games for the platform like, uh, Rollercoaster Tycoon Classic, Katamari Damacy, Space Invaders, The Game of Life…

OK, OK. Still, they’re new to Apple Arcade.

And finally, while not much of a surprise, Ted Lasso has officially been renewed for a fourth season. Creator and star Jason Sudeikis even dropped some tidbits about the plot for the upcoming season and I don’t want to give away too much but it involves soccer.

Apple bloggers gone wild

We now turn to the world of inside baseball. Except instead of baseball it’s Apple blogging.

It has not gone unnoticed by many in the Apple community that Daring Fireball’s John Gruber has been needling Bloomberg reporter Mark Gurman of late. In the last few weeks, Gruber has pointed out that Gurman was wrong about the processor in the new base iPad, seems to get much of his information from Apple media briefings, was wrong several times about Apple’s cellular modem, was late on the story about Apple’s next OSes featuring big design changes and — and this is a direct quote, you can go find it on Daring Fireball — “hangs his toilet paper in an improper underhand fashion”. (DISCLAIMER: not an actual quote, you will not find it on Daring Fireball.)

(Yet, anyway.)

None of this is incorrect; it’s just been a palpable trend of late.

Now Gruber has taken to the airwaves to blast some company, not sure if you’ve heard of it, it’s new to me, lemme just take a look here at the name…

Apple.

It’s apparently one of those new AI companies because Gruber is taking it to task over making wild claims about what its AI offering would soon be able to do and then failing to make good on those promises.

“Something Is Rotten in the State of Cupertino”

While I have never shared characterizations of Gruber as an “Apple apologist”, let us just say that this is still a bit of a departure for him. Clearly, he thinks something is amiss with the silicon and bits mill’s AI efforts and suggests that Steve Jobs would have flipped some tables by this point.

As if to tie things up with a neat bow, Mark Gurman reported this morning that Apple’s Siri chief said at an all-hands meeting that the delays of the more conversational Siri were “embarrassing” and “ugly”. But he still reportedly praised his team without making any noticeable table adjustments.

Oh, no

In case you missed it tucked into the previous section, the rumor that iOS 19 will feature a huge design change, perhaps the biggest redesign since iOS 7, is in full swing.

iOS 7. Everyone loved that, right? Ah, yes, it was universally adored, as I recall.

It must be a sign of the times, because what once would have been very exciting seems to have been met with the equivalent of a trip to the ol’ butt doctor. Not that a trip to a young butt doctor is much better but, oof, old Dr. Crumble? I mean, have you ever looked at the diploma on his wall? It says it’s from “The Wilkes-Barre, PA, Correspondence School of Proctology (A Non-Accredited Institute of Adjacent Learning)”. It doesn’t even say “higher learning”! It says “adjacent”! What does that even mean?!

Perhaps responding to the audible groaning, John Gruber points out that while these changes can be polarizing, sometimes you gotta take a swing at things.

He should know. He’s taken enough swings at Mark Gurman. Hey-ohhh.



