Six Colors

Apple, technology, and other stuff


Netflix adapts to selling ads, and your letters! [Downstream+ subscribers also got: Apple TV+ cancelling Jon Stewart and raising prices, YouTube’s news challenge and the value of YouTube culture, and Jason Kilar’s big idea.]


Apple’s event was shot on the iPhone 15, and that’s still impressive

Frequent contributor Joe Rosensteel, writing at his own blog about how Apple’s “Scary Fast” event was shot on the iPhone 15 Pro Max, and how some people have taken issue with Apple still using professional lighting, rigs, and crews to accomplish that:

This whole kerfuffle is similar to something from only a couple months ago, where people got all worked up about The Creator being shot on the Sony FX-3. The camera, in and of itself, didn’t shoot that movie. The workflows enabled by having a smaller camera were complemented by the nimble, resourceful team shooting the project. If someone ran out and bought an FX-3 they wouldn’t have The Creator any more than running out and buying an iPhone 15 Pro means you’re going to make an Apple video presentation by yourself.

Unsurprisingly, Joe’s take on this is smart and on the money. The iPhone 15 has an amazing camera, and being able to swap it in to a professional setup is pretty incredible. It doesn’t mean you’ll immediately be able to duplicate the results at home, but think about all the things you could do.


Apple’s missing Macs, our last Lightning peripherals, our Touch Bar experiences, and whether we have ever used burner accounts.


We discuss Apple’s “Scary Fast” event and critique the company’s commitment to the bit.


By Dan Moren for Macworld

Three ‘Scary’ details Apple didn’t want to tell you during its Mac event

Apple’s goal with its product events is manifold: it wants to introduce new devices to its customers on its own terms, while simultaneously putting a stake in the ground to both fire shots at its competitors and signal to investors that it’s continuing to come out with products that are in line with the Apple brand.

The Scary Fast event that Apple held on Monday night to introduce its new M3-based Macs was unusually short and to the point for a company whose iPhone and WWDC videos usually run an hour or more. There were just a few announcements of new Macs: 14-inch and 16-inch MacBook Pros in a variety of flavors, as well as a slightly revamped iMac using the new processor.

While the event itself might have seemed largely pro forma, it wasn’t without some significant details—even if you did have to root around a bit to find them.

Continue reading on Macworld ↦


By Dan Moren

The Back Page: The horror, the horror

Dan writes the Back Page. Art by Shafer Brown.

Halloween! The night when ghouls and goblins roam the streets, demanding treats of innocent residents. When fear haunts our every waking moment, and the line between the worlds of the living and the dead is blurred.

Also, when Apple (roughly) hosts an event to announce…*lightning, crash of thunder*, new Macs, mwahahahahahaha…

What seemed like it might be a one-off event from the year 2023 is destined to become an eerie tradition, as every year, Apple will announce a terrifying line-up of new Mac technology on the evening before Halloween.

With each year, however, the announcements grow more and more terrifying, until customers can hardly tune in to watch, lest their deepest, darkest fears be realized by the grinning reaper that is Tim Cook.

To alleviate these most frightful designs, we have cast the bones and consulted the omens of the two-headed oracle known as “Johnjohny” and can now exclusively reveal to you the spine-chilling announcements coming over the next several years.

2024: After having concluded its transition to USB-C, Apple immediately abandons the protocol for a new port, completely incompatible with all previous connectors, called C-BSU. It features a star-shaped connector that provides amazing throughput and unprecedented fast charging, but only if you put the correct point of the star facing upward. In all other directions, it will fry your devices and wipe all the data on them.

2025: A brand new model of Mac is unveiled, the Mac Quadformis 6300CD/AV. It features a 1080p webcam, one port, first-generation MagSafe, and a Touch Bar. It also includes dongles for backward compatibility with Apple Desktop Bus and Token Ring, and comes standard with a 14.4kbps modem, hockey puck mouse, and butterfly keyboard without inverted-T arrow keys. Apple’s stock price soars, despite—and perhaps fueled by—the wailing of Mac users everywhere.

2026: On the eve of Apple’s half-century milestone, Tim Cook concludes the annual event by announcing that he is stepping down as CEO of Apple. In retiring, he will take up the mantle of Chairman of Apple’s Board and High Priest of the Coming Revelation. He does not explain what that means, but says that he is very excited for Apple’s future pipeline and says that the company continues to be dedicated to surprising its customers. His laughter echoes throughout the cavernous confines of Apple Park.

2027: The veil between this world and the next is pierced, and the dread god Glog-Raggopth (all praise his name) emerges, ushered into our plane via the conduit of his maniacal High Priest and strengthened by the years of despair from Mac users. He proceeds to devour the souls of those unfortunate enough to tune in to watch. Minds are sucked dry, people left as barely living shells, cursed to stare vapidly into glowing screens—if such a meager existence can even be called living.

2028: Glog-Raggopth is pleased to announce the release of the revolutionary M8 MacBook Air, lighter than ever, with unprecedented performance and multi-day battery life. It’ll be available for just $999 and is manufactured with a completely carbon neutral process, exemplifying Apple’s commitment to the environment. It’s a product that only Apple, with its unique combination of hardware, software, services, and eldritch energy, could make, and Glog-Raggopth proclaims it the best Mac the company has ever made. And here, they’ve made a video!

[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His next novel, the sci-fi adventure Eternity's Tomb, will be released in November 2026.]


Jason got to spend some time with the new M3 iMac and MacBook Pro, and returns to spill the details! We delve into the differences between chips in the M3 family, the curious case of the new low-end 14-inch MacBook Pro, and much more.


By Jason Snell

A magic number: New MacBook Pros and iMacs usher in the M3 era


For the first time in the Apple silicon era, Apple isn’t using its lowest-end chip to usher in a new generation of processors. On Monday, Apple announced not just the M3 chip but its beefier siblings, the M3 Pro and M3 Max. The M3 chip powers a revised iMac, and all three chips—yes, that’s right—power updates to the 14- and 16-inch MacBook Pro.

Meet the M3 generation

Three new M3 chips.

While it’s exciting that there are new Macs (available to order today and shipping next week), the new M3 chip generation has ramifications that go beyond those who are in the market for a new MacBook Pro or iMac today. That’s because these are the chips that will presumably be arriving in updates to every Mac model over the next year. If we’ve learned anything in the Apple silicon era, it’s that Apple designs a few chips and then rolls them out to more or less every model in the line-up. And an M3 in an iMac more or less performs identically to an M3 in, say, a MacBook Air.

Apple is claiming some speed increases across the board with these new chips, which use a three-nanometer process for the first time. While I was able to get a sneak peek at the first wave of Macs using these new chips, there was no way to independently judge performance. In many contexts, Apple is using the M1 processor as a baseline to compare speeds, which somewhat obscures the generation-to-generation improvements. My back-of-the-envelope calculations of Apple’s claims suggest a 10 to 15 percent overall boost from M2 to M3, but that’s just a guess. The proof will come in the testing, not the press releases.
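That back-of-the-envelope math is just a ratio of ratios. Here’s a minimal sketch of it, with made-up claim numbers standing in for Apple’s (which vary by workload and weren’t broken out generation to generation):

    # If both chips' speedups are quoted against the same M1 baseline,
    # the implied M2-to-M3 gain is the ratio of the two claims.
    # These inputs are hypothetical placeholders, not Apple's figures.

    def implied_gain(m3_vs_m1: float, m2_vs_m1: float) -> float:
        """M2->M3 speedup implied by two M1-relative performance claims."""
        return (1 + m3_vs_m1) / (1 + m2_vs_m1) - 1

    # Example: a 35% M3-over-M1 claim versus a 20% M2-over-M1 claim.
    print(f"{implied_gain(0.35, 0.20):.1%}")  # 12.5%, inside that 10-15% band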

Apple says the M3 is a huge leap forward in its graphics architecture, with speed gains that go far beyond just making a GPU core execute a little bit faster. These additional graphics gains come in a few ways. First, mesh shading and ray tracing are now both hardware accelerated, allowing them to run dramatically faster than they would just in software. (If you’ve heard this before, it’s because Apple made the same claims about the A17 Pro chip in the iPhone 15 Pro, which is based on a similar architecture.)

There’s also a big new feature Apple is calling Dynamic Caching. Put very simply, Apple’s chip engineers were extremely motivated to eke out even more performance from their graphics subsystem—and found that the way memory was traditionally allocated was inefficient. Memory is usually allocated to different threads at compile time, meaning that some threads allocate a larger amount of memory in order to handle peak need, while other threads might choose a smaller amount of memory but risk a bottleneck.

The M3’s graphics system dynamically allocates the memory per thread in a way that’s completely transparent to software developers. Apps don’t need to be rewritten to take advantage of the new system, which Apple says makes some huge gains by wringing a lot of memory efficiency out of the system. Memory that was previously reserved for a specific thread can be given to a different thread instead. A thread that’s in a bottleneck can be given more space. It’s all to the goal of increasing overall throughput.
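As a loose mental model of that trade-off (Apple hasn’t published how Dynamic Caching is actually implemented, so the numbers and structure here are invented purely for illustration):

    # Toy model of static vs. dynamic per-thread GPU memory allocation.
    # Purely illustrative; not a description of Apple's actual design.

    # (typical_need, peak_need) for four hypothetical threads,
    # in arbitrary memory units.
    threads = [(2, 8), (3, 4), (1, 6), (5, 5)]

    # Compile-time allocation: every thread reserves its peak need up
    # front, so much of that memory sits idle most of the time.
    static_total = sum(peak for _, peak in threads)

    # Dynamic allocation: threads hold only what they currently use,
    # plus shared headroom for whichever thread spikes to its peak.
    typical_total = sum(typical for typical, _ in threads)
    headroom = max(peak - typical for typical, peak in threads)
    dynamic_total = typical_total + headroom

    print(static_total, dynamic_total)  # 23 vs. 17: same work, less memory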

If you look at the three levels of chips, you’ll see some small changes to their specs here and there. The M3 processor’s core specs are unchanged: like the M2, it’s got an eight-core CPU with four performance and four efficiency cores, has a maximum of 10 GPU cores, and maxes out at 24GB of RAM. (And I’m sad to report that it still only supports two displays, so any M3-derived systems with a built-in monitor—like the iMac and laptops—will only support a single external monitor.)

The M3 Pro sees some curious architectural changes. It’s still a 12-core CPU, but the core balance has shifted. The M2 Pro had a maximum of eight performance cores and four efficiency cores, but the M3 Pro has six of each. It also maxes out at 18 GPU cores, down one from the 19 offered in the M2 Pro. Maximum RAM goes up to 36GB from the 32GB in the M2 Pro. Apple’s efficiency cores are pretty beefy in their own right, but offering fewer performance cores is an interesting trade-off. I’m looking forward to seeing how CPU performance compares.

On the M3 Max, the brakes are off. The chip’s got a 16-core CPU with 12 performance cores and four efficiency cores, up from eight and four on the last generation’s top-of-the-line M2 Max chip. GPU core count is up to 40 from 38. The RAM ceiling has been lifted from 96GB to 128GB.
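For easier side-by-side reading, here are those maximum configurations restated as a small table (nothing here beyond the figures above):

    # Max configurations, per the figures above.
    # Format: (performance cores, efficiency cores, GPU cores, max RAM in GB)
    specs = {
        "M2":     (4, 4, 10, 24),   "M3":     (4, 4, 10, 24),
        "M2 Pro": (8, 4, 19, 32),   "M3 Pro": (6, 6, 18, 36),
        "M2 Max": (8, 4, 38, 96),   "M3 Max": (12, 4, 40, 128),
    }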

Keeping in mind that many (if not most?) users don’t buy models with maxed-out core counts and RAM, many of these differences might be academic. But I do wonder if Apple is subtly shifting the positioning of the Pro and the Max chip to make them more clearly differentiated. More on this later.

New MacBook Pros… so soon?

If it seems like Apple just introduced new MacBook Pros, it’s because it did, back in January. With the announcement of new models on Monday, it’s the rare moment when Apple has revised the same model twice within a calendar year. (And it makes it feel even more like those M2 MacBook Pro models were late to the party.)

Still, here we are with some cutting-edge Mac laptops that take advantage of Apple’s latest chip advances. Apple has done essentially nothing to change the exterior of the 14- and 16-inch MacBook Pros, which still look identical to the design introduced in 2021. But there are still interesting differences that go beyond just adding in new chips.

With these updates, Apple has finally simplified the MacBook Pro line. The 13-inch MacBook Pro with Touch Bar, which was presumably updated for both the M1 and M2 generations so that there was a sub-$2000 laptop in the MacBook Pro line, has been discontinued. It is an ex-laptop.

In its place is a new base-model 14-inch MacBook Pro that uses the M3 (not Pro, not Max) processor. At $1599, it’s $300 more expensive than the old 13-inch model, but it’s a real 14-inch MacBook Pro, meaning that it’s got the spectacularly good Liquid Retina XDR display (now showing SDR content 20% more brightly!), MagSafe charging, and the usual complement of USB-C ports. (Apple says it’s 40 percent faster than the old 13-inch M2 MacBook Pro.) Users who want more performance and functionality will want to spend more for higher-end models, but this seems like a pretty good trade-off to create a base model that feels like it was designed in the 2020s.

The $1599 M3 MacBook Pro comes in two color options, Silver and Space Gray. But the rest of the line-up—the models with M3 Pro and M3 Max processors—come in Silver and Space Black, a new color that features a new anodization seal process designed to reduce the visibility of fingerprints. I got my greasy monkey paws on a Space Black laptop and can report that Apple’s as good as its word in the sense that it seems generally more resistant to fingerprints and other smudges.

But I don’t want to exaggerate this feature: you can still see fingerprints. They just aren’t as prominent. It’s a noticeable improvement over something like the Midnight M2 MacBook Air, but it’s not a cure-all.

Similarly, I need to warn you not to get too excited about Apple finally making a black MacBook Pro. Space Black is not actually as black as space. It’s a dark gray. Yes, it’s appreciably darker than the Space Gray on the current MacBook Pros (and the new base model), but it’s still a shimmery metallic gray. Fans of Darth Vader, stand down.

Of course, you can spec up the 14- and 16-inch MacBook Pros (which start at the same base prices as the last generation) as high as you want to go, including up to the ultimate configuration of the M3 Max chip. (You’ll just pay for the privilege.) Overall, Apple says that the M3 Max is twice as fast as the M2 Max, which is a pretty impressive claim.

These new laptops are available to order now, with the M3 and M3 Pro models shipping next week and the M3 Max models shipping later in November.

It’s the iMac, but… with M3

The new iMac is familiar but now powered by M3. (The accessories, while still color-matched, are unchanged.)

It’s been more than two years since the 24-inch M1 iMac arrived. It skipped the entire M2 generation, but it’s been revised here at the outset of the M3.

As far as I can tell, the new M3 iMac is essentially the same computer as the M1 model, which was a completely new design that’s got plenty of life left in it. Everything else, from color options to the base price, seems the same. The base model still has an 8-core GPU, with pricier models getting the 10-core version. It’s very familiar.

Of course, the really big change is the M3 processor itself, which should make this iMac about twice as fast as the previous model… and much, much faster than the last generation of Intel-based iMacs. (I get the impression that the iMac might be a computer that is replaced on very long cycles, meaning there are still plenty of Intel iMacs still in service. This new model makes a compelling case to replace them.)

For the record, Apple says the M3 iMac is capable of editing 12 simultaneous 4K video streams. That’s a lot. The M1 model only claimed four. The M1 feels like a long time ago now.

If you were hoping that Apple might use this update to the iMac to continue its slow eradication of the Lightning port from its accessory line, I have bad news. Despite it seeming like the perfect time for Apple to fix the charging port on the Magic Mouse and the arrow keys on the Magic Keyboard and add a Touch ID surface to the Magic Trackpad, none of those things happened. They all still charge via Lightning. Same as it ever was.

The M3 iMac also doesn’t come in an optional M3 Pro configuration, which I admit surprises me a little bit, given that the Mac mini supports it. Best I can figure, Apple thinks that the Mac mini is used in applications that require a little more processor power, but that iMacs aren’t—and that the Studio Display and a Mac mini can fill the needs of those who want something like a larger, faster iMac. (Apple has the right to change its mind and introduce an iMac Pro at any point, of course. It just hasn’t done so recently.)

The third generation

Space Black is dark gray. But with fewer fingerprints!

I get the sense from this announcement that now that the Apple silicon era is in full swing, the company is beginning to tweak things here and there to better fit its overall product strategy.

As the M-series chips get even more powerful, Apple seems more comfortable bragging about the remarkable power of the base model chip. The M3 is shaping up to be powerful enough to fulfill the needs of iMac users and low-end MacBook Pro users, and presumably at a relatively low price. It’s the chip for the masses.

The high-end M3 Max chip also seems to have a pretty clear remit: keep going faster. The M3 Max lives up to its name by offering more cores, more RAM, and more performance… at a high cost. It’s the chip for the most demanding pros—high-end 3D work, medical imaging, that sort of thing—who need everything they can get from their computer and are willing to pay to get it.

Then there’s the M3 Pro, which has rejiggered its CPU core configuration, reduced its max GPU cores, and slightly increased maximum RAM. Call it a hunch, but it feels like Apple’s recognizing that the M3 Pro is going to be the chip of choice for most pros—and is refining the mixture in order to combine power and (relative) affordability. It will keep getting faster, of course, but maybe the Pro chips will improve a little more incrementally going forward while the Max chips will be further out on the cutting edge.

I don’t know. We’ve only seen three iterations of the Apple silicon approach, and it’s possible that Apple will revise its approach next time or try something even wilder. But from some of the subtle changes this time, I feel like the M3 Pro chip is the most interesting one. When the new MacBook Pros ship next week, we’ll start to get a sense of what pro customers think.

The removal of the 13-inch MacBook Pro and its replacement with a low-end 14-inch MacBook Pro model is a fantastic move. I realize that it raises the base price to get into the MacBook Pro line, but let’s be honest: that old 13-inch model wasn’t really a MacBook Pro; it was a MacBook Air with a better name and worse design. The new model has the display and ports that make it a true MacBook Pro, and potential buyers who can’t justify the price are probably better off buying a MacBook Air, which remains a remarkably great value.

The iMac didn’t really need a redesign, so it didn’t get one, but it’s a relief to see that Apple’s all-in-one—the best-selling all-in-one in the world, apparently!—is more powerful than ever. I sure wish those accessories had been updated to support USB-C, though.

Overall, this was a pretty good day for the Mac. Apple has launched the next generation of Apple silicon chips, and it’s done the whole family (barring the Ultra, which in the past has just been two Max chips attached to one another) at once. If you’re thinking of buying a new MacBook Air or Mac mini next year, look to the M3 iMac to get an idea about how it will perform. If you’re hoping for a Mac Studio, the profile of the M3 Max MacBook Pro will be pretty close to what you’ll get.

That’s the great thing about Apple silicon: The chips really tell their own story. Earlier, I almost referred to the MacBook Pro as “The M3 Pro with MacBook Pro,” not the other way around. A silly but telling typo: In a way, these first models are also vessels to carry the message about Apple’s latest chip designs. If you’re in the market for a new iMac or MacBook Pro, this is a big announcement. But it’s really just as big for everyone else who is wondering what the Mac line-up will look like over the next year.



By John Moltz

This Week in Apple: Jump Scares

John Moltz and his conspiracy board. Art by Shafer Brown.

Apple attempts to shock us with a surprise event, but the scarier thing is the new prices of its services. Don’t worry, though—it’s totally going to ship a cheap MacBook! Uh-huh.

Event horizon

Hey, remember how Apple was done with announcements for the year, and if you wanted to buy an Apple product, you should just go ahead and do that? Haha, wellllll…

Surprise! We are now expected to get new Macs, most likely with M3 processors, next Monday at the bone-chilling hour of, uh, 5 p.m. Pacific time. WhoooooOOOOOOOhhhhhh! If you hurry, there’s just enough time to switch your existing costume out for “sexy M3-based Mac.”

Mark Gurman and Ming-Chi Kuo believe new iMacs and MacBook Pros could be introduced but are mum on the big question:

Will there be another skit?

And will it be Halloween-themed?

Would it kill Tim Cook to switch up his patented “Goood morrrning!” with a spooky “Goood eeeevening!”? Seems to me that if you’re going to do an event on Mischief Night and not make the best of it, you’re just leaving money on the table. And leaving money on the table isn’t exactly Apple’s style. Segue!

The god of services revenue must be appeased

Good news! Apple is raising prices on its services!

Sorry. I mean that it’s good news if you’re Apple. I should have specified. It’s terrible news for everyone else.

Yes, following in the footsteps of every other streaming service, Apple is raising prices in order to squeeze customers for every cent they’re worth. How long can this go on? They’re all determined to find out!

Well, you may be paying more, but at least you’re also getting more conten—

Oh. Huh.

Look, you don’t get to be the richest company on Earth in the history of money and the Earth by giving things away. You also don’t get there by charging a reasonable rate for things and then leaving it at that rate.

Apparently.

Return of the Mac(Book)

An interesting ontological thought experiment is to consider that the existence of the MacBook Air and the MacBook Pro implies the existence of a MacBook. (Actually, I think the existence of the MacBook Air more implies the existence of a MacBook Land and a MacBook Sea, but let’s not take this too far.)

For years, Apple made a MacBook with no second name, most recently the 12-inch device with few enough ports to make Jony Ive blush. Now it could be coming back.

$700 seems awfully low (let alone “less” than that), but that could be the educational price, meaning it would retail for $800. Or it could just be wrong.

This rumor was floating around early last month, too, and now has the backing of Ming-Chi Kuo, who says Apple is “considering” it. What more proof do you need that it is absolutely, 100 percent happening? Seems like a done deal to me. Stop buying MacBooks right now.

The rumor mill claims Apple is concerned about the proliferation of Chromebooks in education and wants to claw back some of the market. I’m old enough to remember when the iPad was going to be Apple’s cheap laptop replacement. By which I mean I am older than about ten years old.

Apple being concerned about Chromebooks sounds a lot like Apple being concerned about netbooks. And what was the company’s response to cheap, underpowered, lightweight laptops? Yes, the iPad to some degree, but also the original MacBook Air, which sold for $1800. Don’t count your cheap MacBooks before they’re hatched.

[John Moltz is a Six Colors contributor. You can find him on Mastodon at Mastodon.social/@moltz and he sells items with references you might get on Cotton Bureau.]



Apple has announced a “scary fast” new product announcement, and so Myke and Jason have convened an emergency session to draft what they think will happen at next Monday’s mysterious event.


M3 vessels and sticker shocks

We theorize about Apple’s Monday night announcement, and are baffled by a feature of iOS 17.2.


By Jason Snell

iOS 17.2 beta sticker reactions: 👎🏻


This message bubble has two stickers obscuring text, and a real Tapback that doesn’t.

When Apple first announced iOS 17, one of the new features got me really excited. After years of wanting an expansion of iMessage Tapbacks, Apple was granting our wish—well, sort of.

Rather than letting users choose to tap back with any emoji, Apple was bringing a new “sticker reaction” to Messages, and since any emoji could now be a sticker, you can put two and two together and get pretty much any reaction you want to quickly tag to a blue bubble.

On Thursday Apple released the first developer beta of iOS 17.2, and after four months of anticipation we’ve finally gotten our first glimpse at Apple’s new sticker reactions.

I know this is just a developer beta, so there’s time for Apple to fix the problem… but as of right now, I am thoroughly disappointed.

There’s not even a shortcut for stickers in the Tapback menu.

This new feature has no connection at all with the fun double-tap gesture that’s synonymous with Tapbacks. I didn’t expect stickers to be a peer to Apple’s classic1 collection of six Tapback icons, but I did sort of assume that at the very least, performing the Tapback gesture would also give you the option of choosing a sticker. (And the right thing for Apple to do would be to display recently used stickers alongside the Tapback icons.)

Instead, to send a sticker response you have to tap and hold on a message and then choose Add Sticker from the resulting contextual menu, then choose a sticker or emoji. It’s an extra step that really shouldn’t be necessary and makes stickers feel like an afterthought, which they apparently are.

It gets worse. When you add a sticker reaction, it’s placed on top of the message you’re reacting to, obscuring part of the text! Why in the world would Apple choose a placement that makes it impossible to read the text being responded to? The right placement for these reactions is… wait for it… the same place that Tapbacks appear, in a little bubble snuggled up against the message that’s being reacted to.

It was always clear that Apple was just modifying its existing sticker system to create this feature, but I expected a bit more care in how reactions were deployed. If you drag out a sticker, you can at least try to drop it somewhere that doesn’t obscure the message you’re commenting on. But if you just tap to insert a sticker in iOS 17.2, the emoji just pops down on top of the text. That’s it.

I don’t know if there’s any chance this feature can be improved before it’s unleashed on the general public later this year. There are lots of ways it could be improved, but just offering a shortcut from the Tapback menu and choosing a more appropriate default placement would be enough to upgrade it from “what were they thinking?” status to simply “needs improvement.”

I can’t believe I spent four months anticipating this feature.


  1. That’s sarcasm, because they haven’t ever changed them. 

By Joe Rosensteel

Is there in communication no beauty?

Oh no! By removing the green dots above the Louvre (left), the very fabric of reality has been rent asunder! (right)

With the new iteration of flagship smartphone cameras comes the new iteration of arguments about reality—and not the “fun” kind that you strap to your face. Google’s Pixel 8 and Pixel 8 Pro offer a new generation of AI editing tools, and Apple’s iPhone 15 and 15 Pro integrate the latest round of Apple’s Photonic Engine1 and Deep Fusion to process every last detail.

What we should really be talking about when we talk about these cameras isn’t some representation of “reality,” but communication. What a person shoots with any camera is not truth, but a timed amount of light from a selected angle shining through a lens focused onto a recording medium. Even without any editing tools, you can produce a photo that is real but isn’t true entirely in camera at the time it was shot.

Take a selfie outside the entrance gates to Disneyland, posted to Instagram. The image communicates the narrative that the subject spent the day at Disneyland—but that’s not what the photo shows; it’s an inference communicated by the context. There are no fancy generative AI tools changing the background. It was all recorded in camera. But whether or not the inference is true isn’t recorded in the photo at all—it’s one viewers make based on trust.

A person at a landmark wants their picture taken, so they hand the phone to someone, who takes two photos with slightly different framing. One has just the person with the landmark, and the other has the person and a rando with the landmark. The rando isn’t what anyone wanted to communicate to the viewer, and choosing the photo without the rando isn’t deceit. The two photos, one framed with the rando and one without, are both real, and both true, but only one of them clearly communicates what the person wanted. If they only had the one photo with the rando, is it deceit to crop, or to use a generative fill tool to remove the stranger?

The ultimate computer

Now that we’ve established that we can have real photos that aren’t true, let’s discuss fancy computer-assisted editing tools.

A group of three friends is taking selfies with their smartphones. They take three shots. A different friend is blinking in each shot. Each of those photos is true, in that the light was recorded and that person blinked. But it is also true that there were moments when the friends were there with the camera and none of them were blinking; the camera just didn’t record those moments. Is it a lie to combine two of the photos to communicate something true that simply wasn’t recorded?

A person with an iPhone takes a shot of city lights at night. Green dots from the internal reflections of the lens elements dance across the image. It is true that the camera recorded those green dots, and they physically happened in the lens elements, but is that what this photo should communicate? Are the internal reflections what the photo is about?

Google swapping heads in a group photo isn’t quite the problem that people seem to think it is. Apple having no proper retouching tools at all in Photos on the iPhone is a problem. If it’s based on a philosophical argument about the nature of things “looking real,” then it’s misguided, as iPhone photos are an idealized result of heavy computational work anyway.

Apple’s goal should be to help people communicate clearly.

Even when we’re not making some big artistic statement with our iPhones, we might want to retouch something. I’ve taken photos of things on my desk that, after Photonic Engine and Deep Fusion did their work, showed dramatic contrast and detail… of the specks of dust on the table I hadn’t noticed before taking the shot.

Carefully wiping off all the dust and retaking the photo wouldn’t communicate something more pure about that photo, but the dust is a weird distraction. I’ve used third-party apps on iOS for many years just to remove those specks of dust, or those green dots. It’s silly that there’s no tool in the iOS Photos app to do it.

We’ll always have Paris

My issue with Google is that it has gone so far into generative editing tools that they produce strange technical artifacts in the final image, which distract from what the image is supposed to be communicating. In Marques Brownlee’s video on the Pixel 8 and 8 Pro, he does some very unconvincing sky replacements and other edits with the Magic Editor. The same sorts of artifacts are present in DPReview’s article, where pieces of hair get chewed up, and in one case a woman’s shoulder is replaced/caved in.

This is very similar to the problems introduced by Google and Apple’s first forays into post-processed blur, like floating liquid in wine glasses, or chunks of ears or eyeglasses missing. Those errors have become less glaring over time, but there are still errors. The problem here isn’t that people can alter their images, but that the alterations are sloppy. The photographer doesn’t have the skill or the eye to know there’s a problem in the final result they’re sharing.

The people who will really be able to take advantage of the tools are the ones who understand the situations in which the tools do and don’t work. Like framing a shot so a green dot from a bright light falls over a solid-colored area of the frame, and not in someone’s hair or an intricate pattern or edge. Or framing a portrait mode shot to minimize tangents between background objects and foreground objects. Then it really is an editing tool, and not just generative mush.

There is a concern that the general public will heavily edit all of their photos if the tools are easy enough to access. This is perhaps the least concerning part. People want to remember things that actually happened in their lives, and the idea of heavily editing everything is unappealing. They want to remember the ups and downs of their vacation to Paris. They want to remember loved ones as they were, or a nighttime stroll by the Louvre.

Context is for kings

There is a long, long history of altering photos and videos that goes back to the beginning of celluloid, even before we get to computers, digital cameras, or smartphones. Democratizing these tools, and making technical errors less obvious, consistently scares people who wish to assume everything they see is some kind of legal promise. We need to consider the context: the sources of the photos we see, and who’s sharing them with us.

Not all deception is malicious, either. As a visual effects artist, I’m a professional deceiver for entertainment purposes (which is a completely acceptable form of communication). The context is that you’re watching something categorized as entertainment. We don’t treat TV and films like historical documents, and we shouldn’t treat social media posts that way either.

The flood of malicious misinformation, with real consequences, is perpetrated by people who don’t need to use a Pixel 8 Pro, and who aren’t stymied by the iPhone 15 Pro’s lack of retouching tools. If people are more aware of the tools that can edit photos, because they have them in their own hands, perhaps they will also be less likely to fall victim to deception.

We should all be receptive to what someone wants to communicate while knowing the context for that person, that medium, etc. This is especially true when something is inflammatory, and not just “look how great my life is” posts on Instagram. Not everyone is lying, but if you’re getting very emotional about something, seek out more information from trusted sources. Sharing something because of an emotional reaction to a photo without understanding if it’s true isn’t the fault of the image—that’s on the person sharing it. The sharer’s act of communication is spreading a lie. The editing tools didn’t lie.

A photo can communicate something that really happened, but the context for deciding whether or not it does exists beyond the photograph itself.


  1. The Photonic Engine always makes me think of the purposefully absurd photonic cannon from the excellent Star Trek: Voyager episode “Tinker, Tenor, Doctor, Spy”. Coincidentally, an alien spy gains access to The Doctor’s daydreams and believes everything he sees is real. 

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Dan Moren

A Mac event is the perfect time for Apple to show off its peripheral vision


Next week’s Apple event may not feature Tim Cook dressed as a mummy—more’s the pity—but Apple’s teaser video has all but confirmed what many had expected: this October event is all about the Mac.

The rumor mill continues to debate what exactly might be in store: will new iMacs be M2-based or usher in the era of the M3? Will there be new MacBook Pros with new high-end Apple silicon chips? Will that lonely 13-inch MacBook Pro be refreshed or banished into the night?

Amongst all those possible updates, I’m personally looking for a dark horse announcement here, something that’s a bit…ancillary to the main event.

Mac peripherals

In 2023, Apple’s been on a mission to seek out and eliminate Lightning ports with extreme prejudice. We’ve already seen the introduction of USB-C iPhones, USB-C AirPods Pro, and even a USB-C Apple Pencil. But there remain a few stragglers in the line-up, and the ones that seem most ripe for replacement at this upcoming event are the Mac peripherals: the Magic Keyboard, the Magic Mouse, and the Magic Trackpad.

Now, Apple could just swap out the Lightning ports for USB-C and call it a day, and the smart money is on that being the case. After all, why mess with what seems to be basically a winning formula? But it’s also an opportunity to give each of these devices a more thorough refresh—and maybe it’s about time.

The Magic Keyboard is the most recently revamped of the three, updated alongside the introduction of the 2021 Apple silicon iMac. At the time it gained a Touch ID sensor, rearranged function keys, and multiple colors to match the respective iMacs. But this update also made it an outlier in a couple of ways: for one, rounded corners that give a bizarre shape to the keys in those corners (Escape, Function, Right Arrow, and the Touch ID sensor) and, far more egregiously, the lack of an inverted-T layout for the arrow keys.1 The Magic Keyboard inexplicably stuck with full-height left- and right-arrow keys and half-height up- and down-arrow keys in an era when Apple had abandoned that layout across the rest of its devices, including MacBook keyboards and the Magic Keyboard for iPad. This would be an ideal time for the company to return to the layout, which many touch typists find much easier to navigate.

The Magic Mouse has remained unchanged since its introduction in 2015. Over the years, it’s achieved a certain degree of notoriety for the location of its Lightning port on the underside, which makes it impossible to use while it’s charging. While the charging is relatively fast, it’s still an awkward design that Apple could take the opportunity to update.

Finally, the Magic Trackpad, long my pointing device of choice, which also dates back to 2015. It’s hard for me to criticize it too much, since it does what it needs to do with aplomb, but it would be interesting to see Apple explore other options and capabilities, whether that means building in Touch ID or adding support for the Apple Pencil to turn it into a sort of mini graphics tablet.

I also want to call out two places where Apple could improve all of these devices: one aesthetic, one functional.

The first is color. While you can get very slick color-matched versions of any of these with an M1 iMac, those buying them on their own are relegated to just two options: white/silver and black/gray. And the compact Magic Keyboard only comes in the first of those. That’s a real shame, given that the iMac versions—complete with color-matched cables!—exist. I can understand Apple not wanting to manage all the various SKUs, but frankly, let people choose the color peripherals they want! It’s not too much to ask, especially when the rest of your products are woefully skimpy on colors.

On the functional side, Apple either needs to improve or replace its Bluetooth support. My Magic Keyboard and Magic Trackpad frequently disconnect from my Mac mini, which is all of about eighteen inches away, and the keyboard in particular is egregious in terms of the amount of time it takes for typed characters to show up on screen. This certainly seems like a place that Apple could use its vaunted engineering prowess to offer something that’s a little extra in much the same way that it does with AirPods. Not only would a custom wireless chip potentially allow for more robust and reliable connections, but it could also simplify switching peripherals between devices, a process that Bluetooth makes fairly painful.2

With eight years gone since the Magic Trackpad and Magic Mouse were last revamped, it’s about time for Apple to take a closer look at its Mac accessories. To be honest, I’ll probably replace my Magic Keyboard and Trackpad even if Apple just switches them to USB-C, but if the company’s looking to liven up what might otherwise be a ho-hum event, this could bring just a little bit of treat to a spooky occasion.


  1. The exception being for the extended Magic Keyboard, which features full height arrow keys in an inverted-T layout. 
  2. Many third-party devices have improved on this by building in the ability to pair with multiple devices at once, though even there it often requires a disconnecting/reconnecting dance. 

[Dan Moren is the East Coast Bureau Chief of Six Colors, as well as an author, podcaster, and two-time Jeopardy! champion. You can find him on Mastodon at @dmoren@zeppelin.flights or reach him by email at dan@sixcolors.com. His next novel, the sci-fi adventure Eternity's Tomb, will be released in November 2026.]


Finally we discuss Lex’s problems, just before the podcast goes off the rails.


Using an all-in-one messaging platform, using an all-in-one calendaring app, using — and telling others about — Passkeys, and what we hope to see at Apple’s “Scary Fast” October event.


By Shelly Brisbin

The iPhone 15 Pro brings tangible accessibility benefits


What makes an iPhone accessible? Mostly, it’s the software: the operating system and apps that follow guidelines Apple has set for that purpose. If new hardware plays a role, it’s often in the opposite direction. For some with hearing or vision disabilities, the loss of the Home button has made hanging onto an iPhone SE feel worth the struggle.

But this year, there’s another hardware story (and I’m mighty surprised to be writing this): What’s inside Apple’s Pro iPhones is giving an important boost to usability for people with disabilities.

The number and breadth of accessibility features you can quickly enable with the Action button are significant, as are the brief descriptions of each.

The Action Button

Turning the tactile, easy-to-use ring/silent switch into a multifunction button you can use to launch a shortcut, open the camera, or fire up Voice Memos is a fair way of giving back where something’s been taken away. There’s also a whole screen full of accessibility features you can choose to assign to the Action button: everything from turning on the VoiceOver screen reader or Live Captions to adding a color filter or starting AssistiveTouch.

Quick access to most of these options isn’t new: you can use Back Tap or Accessibility Shortcut (a triple-click of the Side button) to summon a lot of them. But particularly if you have a motor disability, the choice between invoking your preferred feature by tapping, triple-clicking, or a press and hold of the Action Button is just one more level of flexibility, not to mention the chance to program quick access to various tools at the same time.

Since you can launch a shortcut with the Action Button (also available via Back Tap), there’s no end to the ways you can customize your own accessibility by doing things more quickly. I have a blind friend who’s using the Action Button to quickly toggle the speed of podcast playback between two favorite settings. Using a shortcut means she need not open Overcast using VoiceOver and then swipe to the speed slider every time she wants to make a change. Sometimes, accessibility means saving steps.

The accessibility options available for the Action Button also come with brief descriptions that could introduce or explain these features to people who have never dived levels deep into Accessibility settings.

Ultra Wideband and Precision Finding

Precision finding is great for anyone who’s looking for their phone or for a friend in a crowd. But how great is it when one or more of the parties doing the looking is blind?

Find My was never precise enough to lead anyone to the exact restaurant table or funnel cake booth. But precision finding, powered by the second-generation Ultra Wideband chip, gives guidance with sound and VoiceOver speech.

Intelligent Portrait Mode and Other Photo Magic

If taking photos isn’t a big part of the way you use your phone, an iPhone 15 Pro or Pro Max could seem like overkill. For accessibility-related uses of the camera, like magnification, scanning text, and using AI or human assistants to describe or analyze a scene, an older camera system works just as well as this year’s best. But as cameras get better, so do the machine-learning tools these phones offer to improve your photos.

A visually impaired photographer can use any of the iPhone 15 cameras with more confidence, knowing that after-the-fact portrait mode and other enhancements to an already-captured image are available. That’s a kind of accessibility of opportunity masquerading as a mainstream camera feature.

LIDAR

The Magnifier app shows the various detection modes available, on the left. With Door Detection enabled, there’s visual and spoken access to information about what the camera and LIDAR sensor find.

Pro phones have included a LIDAR sensor since the iPhone 12 range. It’s a camera system thing. But like the Ultra Wideband chip, LIDAR can help detect what’s in your environment.

In each generation of the Pro phones since the 12, LIDAR has given Pro users access to new detection features. First, you could have what the camera saw described to you: a red car on a dirt road, a wooden table with keys, and a pair of glasses. Then came People Detection, arriving in the middle of the pandemic. LIDAR and the Magnifier app let you identify the presence of a person and how far that person is from you. Door Detection was next. Aim your phone at a building or down a hallway and find out not only where doors are located but whether they’re open or closed, wood or glass, have signage or not, and what that signage says.

This year, we got the unfortunately named Point and Speak. It’s not a child’s toy but a tool for reading text labels. Hold your phone up to a microwave, washing machine, or other gadget, and point to where you think a label is located. The phone helps you aim and reads text it finds over or under (you choose) your pointing finger. Great for a microwave, dishwasher, or any device with buttons to be pressed.

None of the detection features is perfect. Point and Speak, in particular, could use some seasoning. But these features give some insight into how Apple teams working on accessibility have been able to weave interesting features with relatively small potential user bases into the way iPhone hardware develops over time.

Is it also proof of concept for greater things to come? You bet. But for at least a few people for whom accessibility is a main requirement, it’s also reason enough to splurge on an iPhone 15 Pro or Pro Max right now.

[Shelly Brisbin is a radio producer and author of the book iOS Access for All. She's the host of Lions, Towers & Shields, a podcast about classic movies, on The Incomparable network.]


