Six Colors

Apple, technology, and other stuff


By Dan Moren

WWDC 2023: watchOS 10’s redesign

Note: This story has not been updated since 2023.

Though all of Apple’s operating systems got attention during this year’s Worldwide Developers Conference, watchOS 10 got perhaps the biggest overhaul, not only changing fundamental ways that users interact with the device, but also how information is presented on the wearable.

Some of this information was shown off during the keynote, but the WWDC session Meet watchOS 10 goes into more detail about how the Apple Watch’s design language is changing—and make no mistake, this is a major change. In many ways, the evolution of watchOS 10 feels similar to the way iOS 7 upended the way iPhone apps were designed, if somewhat less radical overall.

Interactive

On the interaction side, Apple’s revamped both of the watch’s physical controls. The side button, which prior to watchOS 10 brought up the Dock, now summons Control Center instead.

watchOS 10 Control Center
Control Center is now available anywhere with a single press of the side button.

Likewise, the Digital Crown’s usage has been expanded; in previous versions of watchOS, rotating the crown while on the watch face was limited to interactivity within certain faces—adjusting the height of numerals on Metropolitan, for example, or letting you advance or rewind daylight on Earth in the Astronomy face. But on many faces it did nothing at all, a strange omission that left one of the device’s key tactile interactions underutilized. In watchOS 10, turning the Digital Crown while on the watch face brings up a new set of widgets.

watchOS 10 widgets

Using the Digital Crown within apps is also more standardized in watchOS 10. You can still scroll through apps that use a list-based view—such as Mail—but in other apps with multiple screens, Apple has instituted a more rigid vertical pagination system that essentially lets you scroll through demarcated tabs rather than, say, an endless page.

It’s also worth noting that Apple has updated the increasingly unwieldy Home Screen of the Apple Watch in watchOS 10. Previous versions gave you the option of the classic Grid view1 or a List view. I’d always gravitated to the former, as frustrating as it could be, because it was easier for me to remember where something was spatially than scrolling through a lengthy list.

watchOS 10 Home
watchOS 10’s new home screen offers a tweaked layout.

The new Home Screen option seems to aim for the best of both worlds, providing a fixed-width grid that Apple says offers “consistent organization” of apps and that you can scroll through with the Digital Crown.

Design

With watchOS 10, Apple’s pushing a strong change to the design language. Early rumors suggested a focus on glanceability, but what was missing from that discussion was that it wasn’t simply about redesigning the current interface, but about how Apple suggests developers design their apps.

One major way this is coming to the forefront is with three specific layouts that Apple is providing for apps: Dial, Infographic, and List. Of those, List is perhaps most similar to existing apps, like Messages and Mail, while Infographic is used in apps like World Clock, where there’s a single screen of information but you can toggle between options, such as multiple locations. Dial, meanwhile, seems to take its inspiration from the watch face itself, featuring large central content with icons in the corners.

watchOS 10 interfaces

Overall, there’s a big focus on the “single page” nature of apps in watchOS, rather than apps that let you scroll through a bunch of content. Take, for example, the Activity app in the current version of watchOS. When you launch it, your activity rings are front and center, occupying the entire screen. You then have the option either to scroll down to see more information, as though you’re viewing a web page, or to swipe to other screens for additional information (Sharing and Awards).

In watchOS 10, those endless scrolling app pages are being discouraged. Instead, an app presents a series of discrete pages, each showing all relevant information on a single screen, which you page through as tabs. So, for example, in the Activity app, one page shows full-screen details about your Move goal, the next about your Exercise goal, and the third about your Stand goal.

watchOS 10 Activity
watchOS 10’s redesigned Activity app uses multiple discrete screens.

Likewise, horizontal tabs, as in the current version of the Activity app, are being phased out, since they require you to navigate by swiping, which can obscure the display.

Apple’s also encouraging the use of background content—a color, gradient, or even animation—that conveys additional information about the current screen: a red tint on your Move ring page, say, or an animation of the current conditions at your location in the Weather app. That provides more context about what you’re looking at and helps ground you in a particular view.

Even further reinforcing the emphasis on single page views, Apple’s discouraging the use of hierarchical navigation in watchOS 10. Which makes a lot of sense: most times when you’re interacting with your watch, you don’t want to be diving through menu after menu.2

Fundamentally, Watch interactions are measured in seconds, not minutes. The design language that Apple is encouraging in watchOS 10 seems like it will help make those interactions much more efficient when it comes to helping users see the information they want at a glance and then go back to their day.


  1. Come on, Apple, it’s the honeycomb. We all call it that. 
  2. That said, there are exceptions to the rule for apps that really need it, like Settings or Mail. 

[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His next novel, the sci-fi adventure Eternity's Tomb, will be released in November 2026.]


By Jason Snell

Eyes (and head) on with the Apple Vision Pro

Note: This story has not been updated since 2023.

Yes, I’ve worn it.

One of the advantages of being present at Apple Park for WWDC 2023 was that I got to experience—after having my face measured and my eyeglasses scanned—the Apple Vision Pro in a controlled environment, guided by two Apple employees.

I came away mostly impressed—but (as with so much of this product) with a bunch of questions, too. The hardware itself is impressive—a dense core like a mega iPhone you wear on your face, yes, but wrapped in humane materials like a gray woven elastic band. With its folds and ridges, it suggested a comfy sweater to me—and I think that’s intentional, as Apple seems to be working very hard to make this product not seem sleek and soulless.

Once I put the device on my face, it seemed familiar—I’ve got a lot of experience with the PSVR and the Meta Quest 2—but also different. I started with a brief set-up process, in which my eyes followed a dot moving around in my field of view so the device could calibrate itself, and the device automatically adjusted to the distance between my eyes. (Apple reps had already prepped my headset with lenses based on my glasses prescription, so I could see clearly without my glasses.)

With all that, the device snapped on—and I was sitting on a couch with my two Apple guides in chairs on either side of me. In other words, the device defaulted to showing me the actual world around me, and in remarkable detail. The cameras on the Vision Pro are impressive, as are its displays. They don’t provide a view that’s quite as clear as reality itself, but it’s remarkably good. It was easy to get used to the idea that I was seeing the real world, even though it was really just a camera image on OLED displays right in front of my eyes.

At this point in the demo, my handlers walked me through numerous activities using the device. I opened the home screen (by performing a short press on the Digital Crown above my right eye) and launched various apps, all of which floated in the middle of the room.

Honestly, what I’m most excited by about the Vision Pro might be the fact that a lot of people at Apple have devoted years to figuring out the next evolution of computer interfaces. What they’ve done is build on the last 15 years of Apple touch interfaces, adapted to a new platform. It’s familiar—yet also new.

Let’s start with the pointer. There isn’t one. Nor are you expected to poke at virtual interfaces with your fingers. Instead, the Vision Pro’s eye tracking knows exactly where you’re looking at all times. (Items subtly highlight or move forward when you look at them.) Your gaze is the pointer. Look at an app and then tap your thumb and index finger together, and you’ve done the equivalent of tapping an app icon to launch it.

Other gestures are similarly intuitive. To swipe or scroll, you just bring your thumb and index finger together and then move your hand sideways (for swiping) or up and down (for scrolling). It took me no time to understand the gestures, because they’re clearly derivative of everything I’ve learned about using an iPhone or iPad.

Each app window on the Vision Pro has a small horizontal line at the bottom, just like what you’ll find at the bottom of an iPhone screen. On visionOS, it’s a grab handle. You look at it and bring your thumb and index finger together to grab it. Then you move your hand to relocate the window in the space around you in three dimensions—you can push it further away, bring it closer, or just choose to stow it off to the left or the right.

Among the apps I got to use was Photos, which shows off the high quality of the dual 4K displays in the Vision Pro. There was no graininess—the images looked good. Panoramas can be unfurled and wrapped around you. And of course, 3-D photos and videos look amazing. They’re intimate and personal in a way that flat images aren’t.

Apple made a big deal in the WWDC keynote video about how you can use Vision Pro to capture those videos, and I think it was a rare marketing misstep. The sight of the dad wearing a Vision Pro to record a family memory was jarring and inhumane, and that’s not the message Apple is trying to send with this product. Clearly the right use case here is that the iPhone will eventually be able to capture 3-D videos and photos—but since that iPhone doesn’t yet exist, Apple is left demoing the capture from the Vision Pro instead.

In any event, those dual displays really do 3-D content justice, and that includes sports footage—I lost it as I watched an infielder misplay a throw to first at Fenway Park from the first base dugout—and of course Hollywood movies. I got to view some clips from “Avatar: The Way of Water” and they looked fantastic. I’ve been a skeptic of 3-D movies in theaters, but on headsets they really do shine.

Given that Apple calls the Vision Pro a “spatial computer,” I should report that it actually does seem to work as a multitasking device. Each app is its own floating screen, and you can move and resize them with gestures. The apps are all familiar—they’re iPhone and iPad apps—and they feel quite easy to operate on the Vision Pro. I could definitely imagine being productive on a Vision Pro.

One of the key things to understand about the Vision Pro is that Apple doesn’t want you to consider it a device that cuts you off from the world. For that reason, everything defaults to displaying content layered on the real world around you. If you want a more immersive experience, you can turn the Digital Crown clockwise and the content begins to creep around you, filling your peripheral vision and ultimately even the space behind you. But immersing yourself in an alternate reality is a choice, one that can be easily dialed back, and one that can be “broken through” when someone is nearby. (Even when I took an immersive trip to Mount Hood, I could lean forward and see one of my Apple companions begin to appear.)

One of the things I hate about using VR headsets is that I have no awareness of my surroundings; the Vision Pro not only makes it easier to remain in your existing environment, but it will even let the real world break in when it needs to. I think that’s a great decision.

At one point, Apple had me scroll through a web page using Safari. The scrolling was smooth enough, but I was more impressed with how readable the type was. I’m not sure I’d declare it full-on retina resolution, but it was perfectly readable. I realize that doesn’t sound exciting, but I was deeply skeptical that type would render with enough quality for you to consider reading long documents on a VR headset. Apple’s device passed that quality test with ease.

I got to experience a FaceTime call with another Vision Pro user, which means I was speaking to her Digital Persona. (Calls with people using other Apple devices just appear in windows, but Vision Pro users are wearing something on their faces, so Apple constructs a 3-D avatar of your face and shoulders and animates it with your facial expressions as you hold the conversation.) I thought the audio portion of the FaceTime conversation was very well handled—when I slid her window to my left side, her voice moved over there, and her voice picked up room tone that matched the sound of my voice in the same room.

That said… I’m not really sold on Digital Personae as a concept. My FaceTime caller’s face fell into the uncanny valley. It looked sort of like a person, but the expressiveness was a bit wooden and weird. Apple hasn’t really shared a lot about this feature, but it seems to me that you should be able to personalize your persona (including in ways that make it look less like you in the real world) and give users options for other avatars, such as Memoji. I’m not against the Digital Persona as a concept, but it feels like this one isn’t quite good enough yet.

Throughout the process, I never experienced any lag or the sense that the VR system couldn’t keep up. I never felt queasy or uneasy. However, I could never get the device to really fit comfortably. My forehead began to hurt immediately when I started using it and I could never really find a setting that let me use it comfortably, despite tightening and loosening the headband, moving the band up and down on the back of my head, and adjusting the strap on the top of my head. Apple says that it’s still working on different shapes for the piece that fits between the Vision Pro and your head, blocking out light. I hope that addresses the issue I ran into—several of my fellow media compatriots said they had similar issues with fit and comfort.

Overall, I came away from my time with the Vision Pro being very impressed with the hardware and software. It’s Apple doing the Apple thing, bringing its unique combination of assets—custom chip design, a robust app platform from iOS, and an intense focus on interface and the user experience—to bear on the problem of building a mixed-reality computer. It seems to me that nobody on the planet is going to be able to match Apple at the game of building a device like this.

That said, does anyone want a device like this, at any price? Will people want to use one for work? Will people want to use them for entertainment? For all of Apple’s concern about not making a product that’s perceived as cutting you off from the world, don’t most of the use cases for this product seem lonely and solitary, and not appropriate for anyone with a partner or family at home?

I am now a believer that what Apple has built is an incredible accomplishment. This is the real deal. The unanswered question is, to what end?


WWDC 2023 has arrived and Jason and Myke react to the keynote announcements, live from a secluded location at Apple Park! Topics are just what you’d guess: Apple Vision Pro, new Mac hardware (!), and new OS features.


Live from the Apple Podcasts studio at WWDC!
We discuss how we would use Vision Pro in our everyday lives, the announcement(s) that made us clap, the Home & Audio features we were excited about, and the iOS and/or iPadOS features we can’t wait to use.


From a golf cart at Apple Park

Some very quick thoughts about the Apple Vision Pro. From a golf cart. Driving through Apple Park.


By Dan Moren

The new Apple OS features I want right now

Note: This story has not been updated since 2023.

Processing the keynote is like taking in a big meal: sometimes you have to just sit back and digest. So much information flies by that it can be hard to pick out the details that are important to you, but as I compiled the list of features that didn’t make the keynote cut, I found myself thinking about all the new capabilities that would make the biggest difference to me right now.1

A link to the past

Not mentioned in the keynote itself, but something that I’ve been looking for ever since Apple started really doubling down on the features it’s adding to Notes: the ability to link notes together.

I’ve been dabbling with Obsidian as a place to keep information about my novel writing, in part because the ability to interconnect my documents in a wiki-like format is absolutely critical to keeping track of what is basically a huge database of information.

I’m very curious to see how linking notes actually works, and whether it’s as straightforward as, say, Notes’s ability to add tags.

Hard pass

As the Six Colors Resident Passkey Enthusiast, I’m interested in any implementation details connected to the next evolution of security. And this year Apple delivered on my top feature request: the ability to share passkeys with others. Even better, it’s backward compatible with passwords as well.

I share social media accounts for several podcasts with my co-hosts, and while 1Password does make it easy to share credentials with them, not all of them use the app. But with this feature built into Apple’s platforms, it’s a much easier sell. Plus, it keeps the passwords in sync if one of us changes them.

This is also a big deal for my household, where most of the passwords to shared accounts like my streaming services are shared with my wife (and occasionally with my parents as well). Being able to easily keep that information synced between the two of us will ensure that she can log into the Apple TV if it suddenly starts asking for a password to be entered again.

Zoom zoom

There were more than a few criticisms of the Apple Studio Display’s webcam, but the company has now addressed what might be the biggest: the inability to control the crop of your image, which could lead to some weird-looking shots, especially when using the Center Stage feature.

Both the Apple Studio Display and Continuity Camera now offer controls to pan and zoom the image, letting you present exactly the face you want to show to the world. And users the world over have heaved a sigh of relief.

A FaceTime for television

I play in a regular Dungeons & Dragons game via Zoom, and my current setup is to extend my MacBook Air’s display to my Apple TV and put Zoom and our game board (running in Chrome) side by side in Split Screen.

But this is annoying, because in order to get both my wife and me on camera, I have to put the Air on a footstool and drag it into place. And then we always end up with our eyes on the TV, not looking at the camera. I’d actually tried to use my iPhone in Continuity Camera mode at one point, but it turned out that you couldn’t use both that feature and AirPlay at the same time.

So the announcement that FaceTime will now come to the Apple TV—and, more to the point, third-party apps like Zoom—will be a big improvement. I’m curious to see whether or not there will be a way to do a Split Screen so I can still put up my Chrome screen, though.

Oughta correct

My complaints about typing on my iPhone are well documented, and at long last, it seems Apple has heard the plaintive cries.

Apple promised a bunch of new improvements to typing on the iPhone, including improved predictive typing and better autocorrect. The latter is potentially a big deal, since I’ve found myself often struggling to type a coherent sentence on my iPhone, all too often ending up with words changed into other words I didn’t want, even while other seemingly obvious errors went unfixed.

This is one of those changes that, if it indeed pans out, is going to be a huge quality of life improvement. Especially if it means I can enter a web search phrase in Safari without typing ducking periods between each word.


  1. No, I am not going to install the betas on my devices while traveling, I am not a madman. 



By Dan Moren

The features that didn’t get discussed onstage at WWDC

Note: This story has not been updated since 2023.

Apple’s Monday keynote at WWDC was jam-packed as usual, but even at two hours there’s never enough time to cover everything that the company is rolling out in its latest platform updates—especially when you’re updating five major platforms and rolling out a brand new headset.

So, as usual, I’ve perused the product pages for Apple’s latest updates to pull out the details about all the new features coming later this year.

I’ve broken these features down by platform, though as always, many of Apple’s features are available across all its devices. As ever, there may be more to come, but this is the most current list I can find as of this writing.



By John Moltz

This Week in Apple: The keys to the note

In the waning days of the Apple headset as a rumor, please remember to tip your local rumormonger. Remember, they survive on tips. If you’re worried about Apple’s keynote being too long, don’t. It may be our last if AI takes over.

My conspiracy theory board is now full

Have you heard about this headset thing? Big deal. Supposedly getting announced next week. Huge, if true.

Even Apple is… “subtly” is not the word… ham-fistedly teasing big things for next week’s keynote, tweeting that a “new era” will begin. The company also dropped the tag line “Code New Worlds”, perhaps a reference to VR opportunities for developers. (If it’s supposed to be a Star Trek reference, I give it a “B”.)

Certainly other headset vendors believe Apple’s going to announce one, as both Oppo and Meta rushed out some “Hey, look at us!” announcements this week.

Apple execs are on the record as stating that they will not use the term “metaverse”, which makes sense when one of your primary competitors has the first four letters of that as its name. That would be like Pepsi calling its soft drink “a Coke beverage”. Or, more likely after Apple’s offering hits the market, Coke calling its soft drink “an RC Cola beverage”.

This week Hello Games released No Man’s Sky for the Mac, a seven-year-old game that the company promised to ship for the Mac in 2022. I’d give it a “finally” but as they’re teasing another announcement in “the VERY near future”—hint, hint, wink, wink, nudge, nudge, Monty Python reference—the timing seems like it might be tied to an announcement next week that the VR version will be available for the Apple headset.

And if you needed even more evidence, doubting Thomas—if that is your real name—display analyst Ross Young has some specs on the new device to share, including that the display has 5000 nits of brightness.

That’s a lot of nits, none of which you can pick with this rumor.

Go long

The question is, will Apple even have time to announce the headset? Because it looks like the keynote is going to be jam-packed with stuff. Of course, Apple will have to preview the upcoming versions of its operating systems; that’s kind of the point of WWDC.

But, according to Mark Gurman, Apple may also be announcing two desktop Macs with M2 Max and Ultra chips. Is this a new Mac Studio, or will a new Mac Pro finally make its appearance? If it’s the latter, that seems like the kind of thing that might take some time. We have questions. How are they handling graphics and expandability? Also, where the heck has it been? What took so long? Was it stuck in traffic?

Meanwhile, the company is widely expected to be introducing a 15-inch MacBook Air in the near future. Will that also be unveiled at WWDC? Maybe the company could do one of those Oscars things where it holds a second-string keynote to announce all the stuff it can’t jam into the main presentation. John Ternus can get up and say “Earlier today, at the Courtyard by Marriott across the street from Apple Park, I was thrilled to present the stuff we couldn’t jam into the keynote.”

This is all to say that maybe you should hit the restroom before the keynote.

We had a good run

I regret to inform you that Apple is woefully behind in a technology that experts say could kill us all.

Yes, according to the Center for AI Safety

Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.

Uh… wow. OK. That’s… concerning. Is… is it still OK if I use DALL•E to make images of cats in suits holding business meetings?

While Apple is roundly criticized for not keeping up with the “let’s destroy the world” Joneses, that doesn’t mean it’s sitting on its hands. As Mark Gurman noted, the company is looking to hire people “interested in ‘Conversational and Generative AI’”.

So, relax, investors and over-eager members of the tech press! Apple is also speeding us toward our endgame!

Look, if Siri ends up killing us all, we probably deserve it.

[John Moltz is a Six Colors contributor. You can find him on Mastodon at Mastodon.social/@moltz and he sells items with references you might get on Cotton Bureau.]


By Jason Snell

What I’m watching for at the WWDC keynote

Note: This story has not been updated since 2023.

Tim and Craig welcome people to last year's WWDC.

It’s almost time. Apple’s Worldwide Developers Conference (WWDC) is just days away. Come Monday morning, both Dan Moren and I will be somewhere on the Apple Campus, watching Apple unveil whatever it chooses to put in front of the world.

Here’s what I’ll be watching for:

The headset’s developer story. This is a developer conference, and Apple’s launching a new hardware platform. Presumably, that means that Apple is going to have a strong story for its developers about why they should develop for the new platform, whether it’s bringing existing apps over from iOS or creating entirely new AR or VR experiences.

The big question is, how are developers going to be able to build VR or AR apps without having access to the hardware? I’m sure there will be a simulator that’ll run on a Mac, but you can only simulate so much. And once Apple has announced the headset, it’s a lot freer to share it with members of the public—so developers may be able to sign up to try out the hardware at Apple’s campus or in other Apple offices around the world.

But a few days at Apple isn’t the same as having a headset to test with, day in and day out. That’s why I think it’s worth watching to see if Apple provides a pathway for developers to get hardware in advance of the general public. In the past, Apple has offered developer kits during Mac chip transitions and for the release of the tvOS app platform. Perhaps this summer, Apple will allow developers to sign up to be considered for a limited number of test units?

I keep hearing that the hardware is actually done, or close to it—and yet there are also reports that it won’t ship to the public until late this year. Maybe there’s an interim step that allows developers to build apps on final hardware that’s running a special developer build of the new headset’s operating system. It’s just hard for me to imagine Apple getting developers excited about its platform and then telling them all to wait many months until they can actually use the hardware. (Not only is that poor form, but it’s also how you get apps that don’t properly take advantage of the new hardware.)

The headset’s consumer story. Unlike those of us who have been bathing in Apple VR headset rumors for years, most of the general public is going to be hearing about the product for the first time on Monday. So what’s the first impression Apple wants to make?

I’m anticipating that Apple will highlight numerous potential areas of value, as it did nearly nine years ago with the launch of the Apple Watch. But what will those areas be, and how will Apple choose to show them off? Will games be a focus or a peer with other areas like fitness, media, and communication? Will Apple sell the headset as a portable computer with an enormous virtual screen you can use to work productively? The choices Apple makes in terms of how it shows off uses for the product will say a lot about the company’s priorities.

I feel pretty strongly that this product is designed to show the current state of the art in AR and VR, which is why it’s apparently using such pricey components. Apple wants to dazzle people with the synthesis of—stop me if you’ve heard this before—its hardware and software. But it’s also clearly just the first step on a much longer journey, which will include more affordable hardware coming in the next couple of years.

Apple famously doesn’t discuss future products, but the company could presumably blunt a lot of criticism if it made it clear that this is just the beginning of a long-term commitment. The company doesn’t want people to write off Apple or this category but to get excited about the possibilities. If someone walks away from the event thinking they might buy a product like this from Apple someday, that’s a win for Apple—even if they’re not sold on the current device.

How mixed is this reality? All through the development of the headset, we’ve heard reports that some people inside Apple have been very concerned with the idea that a VR headset fundamentally closes you off from the rest of reality. There have been reports that Apple has considered various approaches to counter this issue, from a Digital Crown-style dial that lets you switch between virtual reality and the real world to an external display that shows your face to anyone who’s watching you. (That second one sure sounds weird to me.)

But as someone who has spent dozens of hours using VR devices, I can see their point. Whether I’m home alone or a family member is in the house, when I’m playing on my Quest 2, I have no idea if someone’s watching me or not. (Or, for that matter, if a cat has come into the room and I’m in danger of stepping on them.)

My favorite Quest game is Eleven Table Tennis. I can play virtual table tennis in an arena or a ski chalet. But what I can’t do is choose to play it in my house. Part of that is down to the Quest 2’s external cameras being lousy, but assuming that Apple’s headset can provide you with a high-quality view of the world around you using external cameras, wouldn’t it be nice to have the option of playing that game in reality rather than being entirely cut off from the world?

I’m very curious how much Apple will lean into the idea of mixing VR and actual reality. A device that fundamentally prefers that software work in either mode—an overlay on what’s really around you or something entirely manufactured—would be pretty interesting philosophically. (And even in a “pure” VR environment, those cameras could do things like alert you when there’s a cat or a person in the doorway.) Just because this thing is a headset doesn’t mean you have to be cut off from the world.

The headset’s price. Or more specifically, is Apple going to announce a price? On the one hand, if the price is going to take people’s breath away (in a bad way), maybe it’s better for Apple if it lets people get used to that price for a few months. On the other hand, if the product’s not for sale and not going to be until fall or winter, what’s the rush? Apple is rarely a company that discloses anything until it has to. It could go either way.

The presence of Mac hardware. WWDC isn’t traditionally an event focused on hardware—except for all the times when it is. The fact is, there are only a handful of times a year that Apple can declare that it’s having a media event and draw the attention of the world—really, WWDC and the iPhone launch in the fall are the only ones you can bank on. And this year, interest is higher than usual because of the headset rumors.

If Apple wants a lot of people to see new Mac hardware, this would certainly be a good time to show it to them… right? The problem is that the top headline or two or three coming out of the event will certainly be that shiny new headset, not some new Macs. Rumors have been swirling that Apple’s got a larger M2 MacBook Air ready to ship, and since there’s an event, the company could release it. But is that a more effective strategy than waiting a few weeks, when headlines about the headset have faded, in order to put a new product in the spotlight?

I can see both sides of that argument, though I lean toward the idea that Apple can launch new Macs whenever it wants and doesn’t need to do it on Monday. And while reliable reporter Mark Gurman of Bloomberg keeps suggesting that new Macs are on the agenda, he’s mostly using phrases like “as early as at the conference” and “testing Macs ahead of the conference.” This doesn’t actually show a lot of confidence that those Macs, waiting in the wings though they are, are definitely making an appearance.

Gurman has also suggested that a revision of the Mac Studio might suddenly be on the agenda—after quite a long time when it seemed that the Mac Studio would skip the M2 generation entirely. (As Dan Moren and I discussed on Friday’s Six Colors podcast, it makes you wonder if the Mac Pro has been bumped back to the M3 generation, leaving the M2 Ultra chip nowhere to go but into a late revision of the Mac Studio.)

Again, it seems like a weird thing for Apple to do—but given Apple’s supply chain since 2020, weird things do happen. The only case I can make for such an announcement on Monday is to tie it specifically to the headset rumors. What if an M2 Mac Studio was pitched as the perfect device for developers to use to create new AR/VR apps? At least there would be some thematic coherence.

In any event, I’m prepared to not see any new Mac hardware at WWDC, but if Apple has decided this is as good a time as any to revise the Mac product line, it can do what it wants—and balance the benefit of the increased attention with the fact that the new Macs will be lost amid the other announcements.

And hey, Apple could always tease the Apple silicon Mac Pro. Again.

A new page for watchOS. As I mentioned earlier, we’re coming up on the ninth anniversary of the announcement of the Apple Watch. watchOS has grown and changed a lot in the intervening time, but some of the basic assumptions of its design—watch faces with complications over there, apps over here—haven’t really been touched. Six years into the iPhone, iOS 7 totally reimagined the iPhone experience. It would seem like a good time for Apple to take a step back and reconsider some fundamental aspects of the Apple Watch rather than just tinkering around the edges. Will this be the year? I hope so.

Steady on, other operating systems. Reports suggest that this will be a lighter year for other Apple operating-system updates, given all the effort that has gone into launching the headset. But what does that mean? Historically, even “light” macOS releases like Snow Leopard and Mountain Lion boasted hundreds of new features. If this operating-system cycle is a little different, where has Apple decided to focus its attention? Which features are being introduced for specific platforms, and which ones are being spread across many of them?

Last year’s WWDC felt disorganized, in a way, because so many of the features Apple introduced were available on Mac, iPad, and iPhone—but the company chose to introduce those features in segments devoted to individual platforms. The result was disjointed—oh, this Mac feature is also on the iPad! Surprise!—and didn’t really sell the advantage of a single feature being available simultaneously across most of Apple’s devices. I’ll be watching to see if the company has rethought how it introduces new features for every operating system that isn’t a mixed-reality headset.






by Jason Snell

Reddit has it in for third-party clients

Every now and then I type something into Spotlight on the iPad and I see that Twitterrific is still installed. I can’t bear to uninstall it, but since Twitter killed third-party client apps, I don’t use Twitter very often. Maybe I check one of my lists once a day. That’s it. And I used Twitter a lot—with my third-party client of choice.

The geniuses who own Reddit have apparently decided to walk the same path as Twitter. Here’s the report from Christian Selig, the developer of Apollo, a leading (and beloved) Reddit app:

Had a call with Reddit to discuss pricing. Bad news for third-party apps, their announced pricing is close to Twitter’s pricing, and Apollo would have to pay Reddit $20 million per year to keep running as-is.

Apollo made 7 billion requests last month, which would put it at about 1.7 million dollars per month, or 20 million US dollars per year. Even if I only kept subscription users, the average Apollo user uses 344 requests per day, which would cost $2.50 per month, which is over double what the subscription currently costs, so I’d be in the red every month.

Not only is the price ridiculous, but (as Selig shows with some back-of-the-envelope math) it’s far beyond what Reddit itself makes on its users. As with Twitter, there is a path for Reddit to walk that allows Selig to build a sustainable app business and for Reddit to be compensated for its service. But this isn’t it.
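Selig’s numbers hang together, and you can check them yourself. A quick sketch, using only the figures from his post (the per-1,000-request rate below is inferred from those figures, not taken from any published Reddit price sheet):

```python
# Back-of-the-envelope check of the Apollo API-cost figures Selig quotes.
# The per-1,000-request rate is derived from his own numbers, not from an
# official Reddit price list.

MONTHLY_REQUESTS = 7_000_000_000   # Apollo's API requests last month, per Selig
MONTHLY_COST = 1_700_000           # "...about 1.7 million dollars per month"

# Implied rate per 1,000 API requests
rate_per_1000 = MONTHLY_COST / (MONTHLY_REQUESTS / 1000)
print(f"Implied rate: ${rate_per_1000:.3f} per 1,000 requests")

# Annualized cost at that monthly figure
print(f"Yearly cost: ${MONTHLY_COST * 12 / 1e6:.1f} million")

# Cost of one average user: 344 requests/day over a ~30-day month
per_user_monthly = 344 * 30 / 1000 * rate_per_1000
print(f"Per-user cost: ${per_user_monthly:.2f} per month")
```

The implied rate works out to roughly a quarter of a cent per request block, and the per-user figure lands right around the $2.50 a month Selig cites—more than double what an Apollo subscription brings in.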

If Reddit continues on this path, it may discover that some of its most devoted users are devoted because they love Apollo. And if it vanishes, many of those users will too.


By Dan Moren

The Back Page: Release Notes for Apple Reality 1.0.1

We’re delighted you’ve chosen to embark upon Apple Reality. Today is the first day of a whole new world for you, and we hope that you enjoy living in it as much as we did creating it.

With Apple Reality, we’ve taken it upon ourselves to not only provide you with an immersive experience but to actually improve on reality itself.

We’re committed to making Apple Reality the best reality you can experience, and to that end we plan regular updates to add new features, improve existing capabilities, and fix any bugs that may arise. A major update coming later this year will add one of our most requested features: the ability to experience multiple realities.

Today, we’re releasing Reality 1.0.1. This launch-day update is recommended for all Reality users and includes the following enhancements, bug fixes, and security updates:

  • Corrected inconsistent rendering of sky that could make it appear white or gray and fixed issue where it could leak.

  • Users can now drive on driveways and park on parkways.

  • Developers can now retrieve both position and speed of electrons.

  • Improved chances to encounter Loch Ness Monster and Sasquatch.

  • Honesty is no longer the best policy. The best policy is now “eat dessert first.”

  • Includes several patches to English language. “Through” and “tough” now rhyme. “Flammable” and “inflammable” are no longer synonyms. “Combobulated” and “whelmed” are now perfectly cromulent words.

  • Fitted sheets can now be folded neatly.

  • Improved reliability of weather forecasts.

  • Keys and eyeglasses are now where you left them.

  • Appendices, which could on rare occasion become inflamed and burst, have now been removed for all users.

  • Fixed a bug where some dogs couldn’t be petted.

  • Fixed a bug where Earth’s temperature rose consistently over time.

  • Fixed a bug where, in certain cases, P could equal NP.

  • Fixed a bug where the San Francisco Bay Area experienced a disproportionate number of earthquakes.

  • Fixed a bug where rainbows appeared to contain a color called “indigo.”

  • Fixed a bug where the original Star Wars films could be modified via privilege escalation.

  • Fixed a bug where cilantro could taste like soap for some users.

  • Fixed an issue where a remote attacker could gain root privileges over Twitter via an out-of-bounds error and $44 billion.

  • Fixed an issue with Earth rotation that required the creation of an extra day every four years, but not every hundred years, except every four hundred years.1

  • Fixed an issue where Freaks & Geeks was canceled after one season.

  • Fixed an issue where the U2 album “Songs of Innocence” appeared on every single iTunes user’s account.

Some features may not be available for all regions, on all Apple devices, or in every reality.


  1. Seriously, what the hell, Dave? 

[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His next novel, the sci-fi adventure Eternity's Tomb, will be released in November 2026.]


by Jason Snell

How I edit podcasts (2023 update)

I’ve updated another eight-year-old article of mine, bringing my discussion of my podcast editing technique a bit more up to date.

Interestingly, my methodology hasn’t really changed much. I still use the Remove Silence command in Logic to separate sounds into visible blocks, and then edit left to right, looking for collisions and interruptions.

The big changes since 2015: I’ve abandoned Skype for Zoom, and Skype Call Recorder for Audio Hijack. And Ferrite Recording Studio on iPad is now in the mix. But my article also covers editing in GarageBand, since it’s free. It’s been literally a decade since I pointed out that GarageBand would be fantastic for podcast editing with a few very small feature additions that already exist in its big brother, Logic. Unfortunately, Apple has never bothered to add them.



by Jason Snell

‘MLB to produce and broadcast Padres games’

After the failure of bankrupt Diamond Sports to pay licensing fees, Major League Baseball will take over all San Diego Padres broadcasting starting Wednesday:

As a result of the new arrangement, Padres fans can now obtain a new direct-to-consumer streaming subscription for $19.99 per month or $74.99 for the rest of the season by registering at MLB.TV.  This offer is only for Padres fans in the Club’s Home Television Territory and is a separate service than the MLB.TV out of market package.  By offering a direct-to-consumer streaming option on MLB.TV in the Club’s territory for the first time, MLB is able to lift the blackout for Padres games previously distributed on Bally Sports San Diego.  Fans can also find more information about the availability of Padres games at Padres.com/tunein.

The oncoming failure of regional sports networks in the face of cord cutting is one of the more interesting media stories of our times. While some local cable channels have begun to sell games to cord cutters—Red Sox broadcaster NESN was the first—this is the first time that Major League Baseball itself has taken over all production for a team’s games, and is streaming them directly in the MLB app. (For continuity’s sake, the games will also be on local cable, satellite, and Internet TV providers in the Padres’ geographic territory.)

It sure feels like a milestone moment in the future of sports broadcasting—and the unwinding of the exclusivity of cable TV for sports broadcasting.


