Six Colors

Apple, technology, and other stuff


By Dan Moren for Macworld

iOS 17 is finally tapping into the iPhone’s full potential

Even for a reputedly smaller-than-usual annual update, iOS 17 still brings with it a host of new features. As the beta process begins, there’s plenty to investigate and try out ahead of the software’s full release this fall.

But as I perused the capabilities that are part of this latest upgrade, something interesting struck me: an older technology appears to be having its moment in the sun, as Apple embraces its utility in a bunch of new ways. That’s great not only because it means new features, but also because—as it’s something that’s been around for a while—those new features will be available on any iPhone that runs iOS 17.

I speak, of course, of the iPhone’s near-field communication (NFC) chip.

Continue reading on Macworld ↦


by Jason Snell

Google Assistant crosses openness off its list

Jason Marr, developer of the excellent AnyList app I’ve used for shopping lists for years:

Google is shutting down the Google Assistant Notes & Lists integration for non-Google apps on June 20, 2023. Unfortunately, this means that beginning June 20, it will no longer be possible to use Google Assistant to add items to AnyList. We know many of our customers rely on AnyList’s integration with Google Assistant and that the loss of this feature is frustrating and disappointing.

This decision by Google to kill third-party app integration in favor of Google Keep means that my Nest Home Hub just became a whole lot less useful. Going forward, I guess I’ll try to use Siri for this purpose. The key phrase is, “Hey Siri, add apples to my grocery list in AnyList.”

It’s a mouthful, but I don’t have a lot of great options here. Well done, Google, in wrecking the utility of your own little kitchen gadget.

Update: After talking to many AnyList users, it seems the best way forward is to connect a shared AnyList list to Reminders, and then share that Reminders list with the same participants via iCloud. At that point, you can add items to the shopping list by saying “Hey Siri, put x on my shopping list,” which will add it to Reminders. The next time anyone using the shared Reminders list launches AnyList (say, at the grocery store), all items in the list will sync.


Apple’s Fitness streaks need more humanity

Craig Grannell at Stuff argues that Apple’s approach to Fitness streaks ultimately does us a disservice:

The problem is the thin line between encouragement and assuming everyone is a robot. At best, Apple’s exercise system wants you to maintain a strict 100% record, forever. And it periodically nags you if you’re not improving your stats. Ran a marathon the previous day? “Your rings are usually further along by now, you slovenly disappointment!” 97% full of snot due to flu? “Get up, lazybones! Or I will hurl your streak into the sun!”

I am one thousand percent in agreement with Craig on this front. Many, including myself, have argued that Apple ought to come up with a system that’s more lenient: rest days, streak recovery1, streak pausing, whatever.

Ultimately, these systems are here to help encourage us to be better, which is great. But as Craig points out, that can backfire when a streak is broken, especially due to circumstances out of our control.2


  1. I absolutely love that Knotwords lets you recover a broken streak when you finish a new seven-day streak. 
  2. Personally, as someone who in this past month has had both COVID and a flare-up of a mobility issue, my fitness streaks are utterly destroyed. 

By Shelly Brisbin

Vision Pro accessibility: Great potential, but many unknowns

Note: This story has not been updated since 2023.

The Apple Vision Pro announcement was not focused on accessibility, but the product will definitely be accessible to people with disabilities. Existing technologies like VoiceOver and Dwell Control will be integral to the way people with disabilities use the product. Apple is bringing an astonishing number of accessibility features found on other platforms to the headset.

There. That’s sorted.

For most observers, Apple’s WWDC sessions about how to build accessible apps for the headset are as far as they feel the need to go. It’s accessible. Apple has once again considered the needs of users who interact with their tech differently than most do.

It’s true… but there is a lot more to say, even many months before the headset ships. And more to say about who is excited for the device and how it can actually enhance accessibility of the world in which it finds itself. There are also a lot of understandable unknowns about whether the user experience might tempt a specific person with a disability to part with $3500, come 2024.

Biggest potential

The interaction method at the heart of Vision Pro and visionOS is eye gaze—interacting with an item onscreen by looking at it, rather than touching or clicking it to gain focus. That’s a method of interaction already familiar to people with disabilities who don’t use touch gestures or handheld input devices and trackpads to interact with their phones or computers. In many cases, an eye blink or a mouth stick are used to act on the focused item, when users can’t touch a screen or input device.

Eye gaze access is available in some, but not all, contexts on Apple platforms. So Vision Pro potentially offers a better experience to someone who uses their eyes to scroll or select things onscreen. In this way Vision Pro could be the most accessible Apple platform yet for someone with motor disabilities like cerebral palsy or quadriplegia that prevent or limit the use of one’s hands. We’ll need to learn more about potential input methods for this community, but Vision Pro could be a game changer.

Head-mounted theater

Lots of people have assumed that I, a person with low vision, must be incredibly excited about using Vision Pro. After all, content will be close to my eyes, where it needs to be before I can see it fully. No need to sit super close to a TV or use magnification devices as I sometimes do today. The movie or show is right there, just in front of my eyeballs. And, as on Apple TV, audio description will be available to catch what I don’t, when the content has been described, as it is on Apple TV+. True enough. If there’s one thing I’ll personally benefit from when I eventually strap a Vision Pro to my head, it’s consuming entertainment.

But it’s unclear to me at the moment how using Vision Pro as a computer—gazing at a specific item in order to act on it—will work for me. Will the promised support for Dynamic Type, zooming, and bolded text be enough to make getting work done on the headset possible? I really don’t know. It depends on how zooming in on the screen changes or shrinks the amount of information available—how flexible the “Finder” of Vision Pro is, visually. And I’m doubtful I personally can rely on eye gaze to focus on particular items. I might need to rely on the VoiceOver screen reader to a greater degree than I do on iOS.

I’m sure there will be alternatives that allow me to use the Vision Pro, but the question for me, and for blind users, too—many of whom are already excited about the platform—is whether the device will give enough of an upgraded experience to make it worth seriously considering it as an alternative to a Mac or an iOS device when I’m doing anything beyond watching a movie or playing a game.

Dream scenarios

Like everyone who loves technology, those of us with disabilities dream about doing things with a brand-new product better than we can right now. When Apple’s wearable-device rumors centered on a pair of glasses, rather than a set of goggles, many of us hoped we’d be able to use the wearable as a navigation aid. We already have human- and AI-driven tools on iOS that allow us to navigate our environment, identify objects and people, and read text we find on signs or documents. And specialty accessibility devices from Orcam and Envision, with four-digit price tags, already fulfill some of this promise, as do a handful of really cool iOS apps. Vision Pro includes 12 cameras, LIDAR, text recognition, and a number of other features it would need to become a navigation aid for blind people.

But third-party apps won’t have access to the camera, and the size, weight, and battery requirements – not to mention the aesthetics – of version 1 of the headset seem to indicate it’s not intended to be used for travel.

Many have already imagined games and other immersive experiences they could enjoy on Vision Pro. And some people with disabilities seem to be taking for granted that developers will go that extra mile to support alternative display and input methods offered up by visionOS, not to mention building custom apps with audio-first content.

I’m cautiously hopeful here, because while many iOS and macOS developers have prioritized accessibility, others, particularly in the gaming and entertainment world, have not. And from what Apple has shared so far, providing accessibility in visionOS apps will need to be even more intentional. At a price point that sharply limits the headset’s initial user base, convincing developers to do the work of bringing full accessibility to their apps will be even more important than it has been to date. It takes time and intention, and users will need to convince developers that the learning and work involved in becoming accessible is worth it.

[Shelly Brisbin is a radio producer and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]


Developers are getting their hands on (simulated) visionOS for the first time, and we speculate about how spatial apps might work. And for the Summer of Fun, we gauge our excitement level for various Apple-related product rumors.


By Joe Rosensteel

Vision Pro and the challenge of 3D movies

Stereoscopic plane example (Apple)

No matter what you think of the Vision Pro headset or 3D movies, it’s become apparent over the last few weeks that a lot of people need a primer on 3D, stereoscopic movies. Love them or hate them, there’s no escaping that they’re going to be a subject of conversation again, just as they were more than a decade ago.

Captive audience

Back in the 2000s there was a push to increase movie ticket prices without making major alterations to seating. Stereoscopic movies were an interesting possibility. Sure, they were more difficult and expensive to make, but the advent of digital projectors meant that theaters could be adapted to show them relatively easily. And of course, a 3D blockbuster with impressive visual effects would give audiences a reason to pay a bit more.

Most 3D theaters are set up with a digital projector and a polarizer from a company called RealD mounted in front of it. Left and right images are projected onto the screen in rapid alternation, and special glasses worn by the audience filter the polarized light so that each eye sees only its intended image. It’s the same principle as polarized sunglasses or circular polarizer filters for cameras. (Extremely bright or contrasty parts of the image might bleed through from one eye to the other, creating “ghosting.”)

The problem with this approach is that the single projector can still only output an image at its maximum brightness, and much of that light is then discarded by the polarization process and the glasses. The result is that 3D movies often seem dim. There are also the gross plastic glasses, which would also have to fit over any prescription lenses you might need to wear.
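To see why the light loss adds up so badly, here’s a back-of-the-envelope sketch. Every number in it is an illustrative assumption (typical-sounding values, not measurements of any particular system):

```python
# Toy model of brightness loss in single-projector polarized 3D.
# All loss factors below are illustrative assumptions, not measured values.

PROJECTOR_OUTPUT_FL = 14.0   # assumed 2D screen brightness, in foot-lamberts
POLARIZATION_LOSS = 0.5      # polarizing the projector's light discards roughly half of it
EYE_SHARE = 0.5              # each eye only sees the screen during its own image's frames
GLASSES_TRANSMISSION = 0.9   # the glasses themselves absorb a bit more light

per_eye = PROJECTOR_OUTPUT_FL * POLARIZATION_LOSS * EYE_SHARE * GLASSES_TRANSMISSION
print(round(per_eye, 2))  # prints 3.15 -- far dimmer than the 14 fL 2D image
```

Multiply a few 50% losses together and the image each eye receives is a fraction of what a 2D presentation delivers, which is exactly the dimness complaint.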

So to recap: They wanted you to pay more in order to see a movie that wasn’t as bright or clear, and you’d need to wear some weird glasses for the privilege.

Continue reading “Vision Pro and the challenge of 3D movies”…


By John Moltz

This Week in Apple: The mode less traveled

Apple and Reddit’s relationship with those who deal with their users continues to be a rocky road. Meanwhile, visionOS hits the streets.

Toward a more perfect union

They say both that no news is good news and that there’s no such thing as bad publicity, but if Apple’s relationship with its labor force is any example, the former is more accurate than the latter. Sadly, instead of improving, Apple’s stance on unions seems to have entered what experts in spiraling destructive thought call “irascible Facebook uncle mode.”

Apple Inc “coercively interrogated” retail employees about their pro-union sympathies…

They strap an Apple Vision Pro to your head, play Zack Snyder’s Justice League: Justice is Gray and repeatedly ask “Is it safe?” Truly barbaric.

…and restricted the circulation of union flyers…

Ironic that throwing that “1984” ad back in Apple’s face got played out years ago over important stuff like non-replaceable batteries and not allowing Flash on iOS.

The company somehow still seems to think that this is a winning move. Indeed, an unnamed Apple TV+ executive even admitted that the company is in favor of a hard line against the striking writers’ union, partly because a good outcome for writers “encourages industries in some countries to unionize in order to make more money.”

It’s OK for us to make more money, we just don’t want you to make more money.

You can argue that Apple is a business and that’s what businesses do. They’re in the business of doing business. But Apple does a lot of good things that aren’t obviously tied to the bottom line—like sustainability, privacy protection, and accessibility—and it could likewise choose to have a better relationship with its workers. If Microsoft can do it, so can Apple.

Reddit is going great

Thanks for asking.

“Reddit’s average daily traffic fell during blackout, according to third-party data”

Well, that’s just a small decrease. Surely when this blows over all will be well and…

“Some subreddits are now filled with porn to protest Reddit”

Now when your mom finds you looking at porn on your computer you can just say it’s to protest costly APIs.

Reddit communities are switching to NSFW to create some friction…

I see what you did there.

…and rob Reddit of ad revenue.

Reddit has since cracked down on this practice, removing moderators of forums that had voted to switch to NSFW. Some mods have since stepped down.

The r/formula1 mod is stepping down, citing “the admin handling of the API changes, the tone Huffman elected to use in every single interview, and because of the whole approach of u/ModCodeOfConduct.”

Apart from forcing beloved apps like Apollo out of business, Huffman’s attitude is not exactly making things better.

“It is really demoralizing,” Bucky said. Being a Reddit moderator and dealing with users is already difficult, he said. “‘I take all this abuse for you, and keep your website clean, and this is how you repay us?’”

Maybe Reddit moderators need a union, too.

Moving to stay still

Apple released the software development kit for visionOS this week, allowing developers to begin to experiment with the new platform. The new operating system includes a number of different environments, so if you’ve ever wanted to do spreadsheets on the moon, you’re going to love this.

Notable features include Guest Mode, which is how you’ll deign to let others try using your Vision Pro—when you surely get one—without allowing them access to all that stuff you’ve been posting to Reddit forums lately. It also features a Travel Mode which, ironically, “ensures that you’re stationary while you’re wearing the Vision Pro.”

In order to be traveling, you must be motionless. This is some real Zen stuff.

The “frosted glass” aesthetic of recent releases of macOS starts to make more sense when you see it in visionOS. Seeing the hint of a Finder window behind the window in front might have been annoying, but seeing the outline of a kitchen countertop or desk or sinkhole can be a safety affordance.

We never knew doing spreadsheets could be so dangerous until we were walking around doing spreadsheets.

Developers will have their work cut out for them determining the best way to balance their user interfaces for this new environment, but as Commander Adama said in that Star Trek movie, this is where the fun begins.

[John Moltz is a Six Colors contributor. You can find him on Mastodon at Mastodon.social/@moltz and he sells items with references you might get on Cotton Bureau.]


Passkeys, voicemails, and discoverability

Finding the gaps in Apple’s passkeys for Apple IDs, the curious.case.of.iPhone.search.periods, some details about Live Voicemail, and Jason falls down an iOS discovery rabbit hole.



By Jason Snell for Macworld

Pro tip: The iPhone has a discoverability problem

I have spent the last couple of decades writing about new features in Apple’s operating systems. Tens of thousands of words about new items, large and small, that enhance the experience of using a Mac, iPhone, iPad, and other Apple devices. And yet this weekend, I was reminded that most users simply don’t notice new features, even when they’ve been available for years.

If you’re reading this column right now, you’re one of the most well-educated people on the planet about Apple stuff. But your friends, family, co-workers, and acquaintances? They might never know about flashy new operating-system features unless you personally show them off. It’s one of Apple’s most vexing problems: keeping devices relatively simple while also trying to make complex new features discoverable.

Continue reading on Macworld ↦


By Dan Moren

iOS 17’s Live Voicemail finally makes this iOS 13 feature useful for me

Note: This story has not been updated since 2023.

Some software features have a delayed impact, like the clap of thunder reaching you seconds after a lightning strike. They may not be useful right away, but at some point something comes along that makes you realize that they’re just what you needed.

For example, take Apple’s introduction of Live Voicemail in iOS 17. Not only is this technologically impressive feature—which uses your phone to answer incoming calls and transcribe the voicemail being left as it happens—handy in and of itself, but it actually reverberates backwards through time to make an older iOS feature more useful.

Because Live Voicemail finally means I can turn on another feature that I’ve been tempted to use since iOS 13: Silence Unknown Callers.1

I’ve always been reluctant to turn on Silence Unknown Callers because I worry too much about missing important calls. There are simply too many times that I get a call I don’t want to miss from, say, a doctor’s office, or a delivery person, or a contractor. Let’s be frank: I’m not going to add all these people to my contact list. And in some cases, even if I do have them in my contacts—the urgent care line in my child’s pediatrician’s office comes to mind—a call doesn’t always come from the same number.

But in iOS 17, if you have Silence Unknown Callers active, callers with unrecognized numbers will go straight to Live Voicemail, allowing you to decide whether or not to pick up. (Meanwhile, numbers that are already marked as spam by, say, your carrier, won’t even trigger this.) For those of us old enough to remember answering machines, it’s the equivalent of screening your calls. It helps ensure that you can still get the benefit of not having to answer every call while not ending up playing phone tag with that one person you’ve been trying to catch.


  1. Settings > Phone > Silence Unknown Callers. 

[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His next novel, the sci-fi adventure Eternity's Tomb, will be released in November 2026.]


How we save links for later, the app we’d make for Apple Vision Pro, how we listen to music and podcasts, and tips for tech to use when moving across the country.


Why.are.there.periods.in.my.iPhone.searches

Dan Kois, writing at Slate, bemoans one of my biggest annoyances with the iPhone keyboard:

No, no, I’m not searching for “luxury poisons for the rich.” But my Google searches, like hers, are lousy with periods. According to prosecutors’ filings this week as they urged the judge to deny bail, internet searches found on her phone included “what is a lethal.does .of.fetanayl” and “how to.permanently delete information from an iphone remotely.” I, too, somehow end up typing searches into my phone that are full of periods where I wanted there to be spaces, as if I’m William Shatner, emphasizing.each.word.I.type.

This has driven me bananas for many years now. Is it just my big thumbs missing the spacebar? Or is Apple overcorrecting to make sure the period is in there in case I want to type a web address? Unlike Kois, I’m not saying that I never want to put a URL in this field, but I definitely search much more often than I enter an address by hand, and this ends up more frustrating than useful.

The latest updates to Apple’s platforms have promised improved autocorrect and predictive text—is it too much to hope they might eliminate the dreaded period problem as well?



By Dan Moren

Apple adds passkeys to Apple ID, iCloud logins

Note: This story has not been updated since 2023.

As our resident passkey beat editor, I was glad to see that Apple has now added the ability to log in to your Apple ID or iCloud.com using a passkey instead of your password. The feature’s been rolling out today, and can be tested on devices running the iOS/iPadOS 17 or macOS Sonoma betas.1

Apple ID Passkey

Using this feature on iOS/iPadOS is pretty straightforward: when you go to an Apple website that requires you to log in with your Apple ID, including iCloud.com, the Apple Developer site, or the Apple ID management site, you’ll be asked if you want to sign in and authenticate with Face ID.

On the Mac side, when you enter your Apple ID in a browser, you’ll see a new option to Sign in with iPhone. Clicking this will bring up a QR code that you can scan with an iPhone or iPad, which will in turn authenticate you with Face ID on that device, and then log in on the Mac. I’ve confirmed that it works not only in Safari, but in Chrome on macOS Sonoma as well.

I do find it a little bit odd that the macOS implementation currently doesn’t seem to let you use Touch ID on your Mac to log in, instead kicking you over to verify via your mobile device. On the one hand, that does bestow the additional security of using a second factor—an item that you have—but that’s not required on iOS or iPadOS, which would seem to be at more risk of being lost or stolen.

Another interesting tidbit: I can’t locate the saved passkey in the Passwords section of System Settings on my MacBook Air running Sonoma. This suggests it’s not synced between your devices, but rather that a distinct passkey is generated on each iOS device. Nor is there an option right now to add such a passkey to a third-party password manager, like 1Password.

I also tested and confirmed that failing Face ID authentication multiple times2 reverts to the device’s passcode, so it doesn’t add any additional security for those worried about someone with their passcode gaining access to (or changing) their Apple ID details.

It’s certainly good that Apple is eating its own dog food here, given how much they’ve pushed passkeys, even if the implementation does seem a bit odd.3 While this may not provide as much additional security as the hardware security key support added earlier this year, it’s decidedly easier to use.


  1. I’ve verified that it also seems to work on macOS Ventura in some cases—specifically via third-party browsers like Chrome and Arc. 
  2. Which I achieved through the very scientific method of “putting my finger over the Face ID camera.” 
  3. Granted, every company seems to take a different approach to introducing users to passkeys at present, which is one thing that may slow adoption of the technology. 

[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His next novel, the sci-fi adventure Eternity's Tomb, will be released in November 2026.]


Myke and Jason answer your questions about Vision Pro, catch up with some new beta details, saddle up for a new Rumor Roundup, process the Reddit debacle, and detail our summer plans.


By Jason Snell

WWDC 2023: More responsive iOS camera apps

Note: This story has not been updated since 2023.

An Apple employee doing a skateboard jump in the session video.

Usually WWDC week is for new OS features. But this year, WWDC was stuffed full of new Macs and an entirely new Apple platform, so I’m slowly digging out of that and starting to dig into the new OS features announced last week. In the “Create a More Responsive Camera Experience” video, there are some exciting new features that should improve the experience of taking iPhone photos.

A lot of the secret sauce Apple uses to generate iPhone photos involves taking multiple images and then fusing them together. The most high-profile of these features is Deep Fusion (often jokingly referred to as “sweater mode,” after Apple’s demo images showing the detailed weave of a sweater). Deep Fusion can generate great results, but it takes time to run. If you take a Deep Fusion photo, you may end up having to wait before taking your next shot—and you might miss something great in the meantime.

In iOS 17, a new deferred photo processing feature allows camera apps to push off image processing until after your camera session is complete. The result is that the shutter button becomes active almost immediately, so you can take more pictures—which is a good thing. The system saves a temporary, unprocessed image to your photo library as a placeholder, and when your phone is no longer busily shooting photos, it will fuse the image captures in the background and then replace the proxy with a full-fledged Deep Fusion photo.
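The proxy-then-replace flow can be sketched with a toy model. This is not Apple’s actual API (which lives in AVFoundation); the names and data here are invented purely to illustrate the idea:

```python
# Toy model of deferred photo processing: store a fast "proxy" immediately so the
# shutter is free again, then swap in the fully processed image once the session is idle.

library = {}   # photo id -> (status, image data); stand-in for the photo library
pending = []   # photo ids still awaiting background processing

def take_photo(photo_id, raw_frames):
    # Save an unprocessed placeholder right away; the user can keep shooting.
    library[photo_id] = ("proxy", raw_frames[0])
    pending.append((photo_id, raw_frames))

def process_backlog():
    # Runs later, when the camera is no longer busy: fuse the frames and
    # replace each proxy with the finished photo.
    while pending:
        photo_id, raw_frames = pending.pop(0)
        fused = "+".join(raw_frames)  # stand-in for the expensive Deep Fusion step
        library[photo_id] = ("final", fused)

take_photo("IMG_0001", ["a", "b", "c"])
assert library["IMG_0001"][0] == "proxy"  # visible in the library immediately
process_backlog()
print(library["IMG_0001"])  # ('final', 'a+b+c')
```

The key design point is that capture and processing are decoupled: the expensive fusion work is queued rather than blocking the shutter button.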

The other update to camera-capture features in iOS 17 is designed to reduce shutter lag, the unfortunate effect where your camera captures an image a fraction of a second after you pressed the shutter button. The iPhone camera captures images at 30 frames per second and can fuse multiple images into something nice, but it can’t go back in time—or can it? In iOS 17, the camera continuously buffers frames during a shooting session, so when you press the shutter, it can capture the moment you intended—and use previous frames, not just subsequent ones, to help generate the final image. The result, according to Apple, is true “zero shutter lag.”
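The “going back in time” trick is essentially a ring buffer of recent frames. Here’s a toy Python sketch of the concept (again, not Apple’s actual API; the class and frame format are invented for illustration):

```python
from collections import deque

class ZeroShutterLagCamera:
    """Toy model: continuously buffer recent frames so the shutter can
    reach back to the frame closest to the moment of the press."""

    def __init__(self, fps=30, buffer_seconds=0.5):
        # A bounded deque acts as a ring buffer: old frames fall off the back.
        self.frames = deque(maxlen=int(fps * buffer_seconds))

    def on_new_frame(self, timestamp, image):
        # Called ~30 times per second while the camera session is running.
        self.frames.append((timestamp, image))

    def capture(self, press_time):
        # Pick the buffered frame nearest the shutter press; it may predate
        # the press, which is what eliminates perceived shutter lag.
        return min(self.frames, key=lambda frame: abs(frame[0] - press_time))

cam = ZeroShutterLagCamera()
for i in range(15):
    cam.on_new_frame(i / 30, f"frame-{i}")

# The user pressed the shutter at t=0.30s; any processing delay after that
# no longer matters, because the matching frame is already in the buffer.
ts, img = cam.capture(0.30)
print(img)  # frame-9
```

Because the buffer always holds the last half-second of frames, the capture pipeline’s latency is hidden from the user: the selected frame corresponds to the press, not to whenever processing finishes.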

The iOS 17 responsive capture and fast capture features will work on iPhones with an A12 Bionic chip or newer.


By John Moltz

This Week in Apple: Fractionally cheaper

Get out your notebooks, because we’re about to learn a valuable lesson in the relationship between output and cost. It will be on the final. Speaking of textbook cases, Reddit provides another while Apple continues to struggle with greasing the wheels of the game business.

Startling revelations about economies of scale

Wait! Don’t buy the Apple Vision Pro yet!

First of all, you can’t, so if you think you’re buying one right now you’re probably being scammed. Is it Carl? It’s probably Carl. Tell him I told you to tell him to call me because he has my Dead Milkmen CD.

Second, WERE YOU AWARE that a cheaper version will be coming later?!

Huge. If true.

Yes, according to Mark Gurman (why haven’t I already set up a shortcut for that attribution?), a cheaper Apple Vision something will be coming before 2026. Which leads to the question: what could they take out to make it cheaper? Removing the eyeballs (the virtual ones, not your real ones, that would probably be even more expensive) would seem like the obvious choice, and one that a lot of people might actually want. But Apple seems to consider that a key feature. A super creepy key feature.

Gurman believes Apple could “knock several hundred dollars off the price” just by making more of them, removing automatic lens adjustment and the 3D camera, and using a cheaper frame.

Is “several hundred dollars” enough to make a dent in a device costing $3,499? If the Apple Vision Air is $2,999, it’s still [checks bank account] $2,987 more than I have to spend on it.

As someone who gets queasy in VR, I hope Apple doesn’t skimp on resolution. If I do throw up from using one, at least I’ll be paying “several hundred dollars” less for the privilege.

Love it or Reddit

They say that if you find yourself in a hole you should stop digging. What they apparently didn’t tell Reddit CEO Steve Huffman is that if you’re in a hole you should also stop blasting with dynamite.

Facing criticism for imposing high fees and short timelines on apps using Reddit’s API—changes that have caused several third-party app makers to stop development—Huffman has doubled down on the company’s actions.

“It is essential for us to be a sustainable business, whether or not we go public,” Huffman said. “Now, we would like to be a public company.”

“Would I like to win the Wall Street lottery? Sure. Enough to destroy the community we built? Same answer.”

Huffman said in an interview that he plans to institute rules changes that would allow Reddit users to vote out moderators who have overseen the protest, comparing them to a “landed gentry.”

Yes, there is nothing that says “huge passive income generation” like having the highly lucrative job of… let me just put on the ol’ reading glasses here… “Reddit forum moderator”. Certainly they are the ones to compare to foppish, handkerchief-waving 17th century land barons, as opposed to the guy who’s actually doing all this to make gobs of money, whether it’s from going public or forcing AI companies to pony up for Reddit’s grist for their large language model mills.

Reddit went on to send a threatening note to moderators of forums that closed to protest the company’s API changes, suggesting that lower-ranked moderators who wanted to reopen could have the holdouts removed. Can we all agree that as a CEO it just makes sense to get into a huge argument with… let me just put another pair of reading glasses on top of the previous ones… “the people who love your platform the most”? Definitely the smart takeaway from the last 12 months.

Speaking of which, no one is probably more thankful for Huffman’s continued public tantrum than new Twitter CEO Linda Yaccarino. Her vapid introductory tweet thread paying lip service to free speech just days before banning a prominent Tesla critic would probably have been bigger news if not for Huffman’s burning desire to be bean CEO. She should send him a nice fruit basket.

Hunting big game

One of the less-discussed pieces of news coming out of WWDC was Apple’s Game Porting Toolkit. By providing DirectX 12 support, the company is allowing developers to more easily test and port their games to macOS. This is terrific news and, as Christina Warren notes, could be—yes, literally and figuratively—a real game-changer for the Mac’s position in the game market.

Of course, despite the fact that Apple has made it much, much easier for developers to port games to the Mac, it does still require work. Worse, it requires work in Xcode, which is like telling a pour-over nut they now have to make their coffee in an AeroPress.

Like an animal.

Will this move the needle for games on macOS? Probably some, but maybe not as much as you’d hope.

But as long as we’re on games, if Bungie could bring back Myth, that’d be great. TIA.

[John Moltz is a Six Colors contributor. You can find him on Mastodon at Mastodon.social/@moltz and he sells items with references you might get on Cotton Bureau.]


