Six Colors

Apple, technology, and other stuff



By Dan Moren for Macworld

If Apple Intelligence is so great, why doesn’t Apple trust us to turn it on?

I know the road is well-trod by now, but yes, we’re back to talk about Apple Intelligence once again. Why? Well, for better or worse, it seems to be pretty much all that Apple wants to talk about these days, and when the company has put this much time, energy, and, yes, marketing attention onto a single feature, then scrutiny is, also for better or worse, what you get.

While the features under this Apple Intelligence banner have had their fair share of problems so far—everything from inaccurate news summaries to misidentifying spouses—none of that seems to have slowed Apple’s adoption of the technology. With the news that the upcoming iOS 18.3 and macOS 15.3 updates will activate Apple Intelligence by default, the company continues to plow full speed ahead, directly into a minefield that’s also somehow replete with both asteroids and icebergs.

Continue reading on Macworld ↦


Apple enabling Apple Intelligence features by default, the AI summaries that work for us, Samsung’s boring new phones, and our cable management predilections.



By Jason Snell

Two desks, but a single M4 Max MacBook Pro

Two office views from 2014. Left: My final docked work setup at IDG. Right: My first setup in my garage.

When I started Six Colors more than a decade ago, I maintained continuity with my corporate job in a bunch of ways. I had been using a MacBook Air as my primary computer, taking it back and forth on the bus to San Francisco in my backpack, and had set up a work-from-home space in my garage, with an external monitor that replicated my setup at work, so I just kept using that.

But that fall, Apple introduced the 5K iMac, and I was so smitten with the Retina display that I decided to become an iMac user full-time, leaving behind the laptop-only lifestyle. (Later on I went from iMac to iMac Pro to M1 Max Mac Studio with Studio Display.)

But things have changed.

It’s cold out there

A little background: Even though I live in California and our weather is mild, my garage workspace has a garage door that leaks cold air efficiently, and my sturdy electric oil heater can only do so much.1 As a result, my garage is always cold in the winter, and I often find myself shivering by late morning until I stand under a hot shower for half an hour.

Also, my daughter is now an actual adult with her own job and life a state away, and her bedroom is well-heated and sitting right there, largely unused. Last winter, I created an outpost in that room, moving in my first work-from-home desk and attaching my MacBook Air to a Studio Display. I was a lot warmer, but it meant using different computers in different rooms.

While cloud-based syncing makes everything much easier than it ever used to be—I can access most of the files I’m working on from anywhere—there are a lot of Mac features and functions that don’t sync, or don’t sync easily. When I switch between the two rooms, all context is lost.

One thing leads to another

This fall’s release of the new M4 Mac mini made me consider whether I wanted to replace my M1 Max Mac Studio with a Mac mini. While I decided against it, I had broken the seal on reconsidering my entire computing life. And I realized that maybe the way I used to work, back when I commuted between Mill Valley and San Francisco, might also work for commuting between the garage and my daughter’s old bedroom.

With a laptop, whether I’m just carrying it down the hallway or putting it in a backpack and riding a bus across the Golden Gate Bridge, I’m always maintaining context. All my documents are in one place, all my settings remain the same at all times, and I don’t need to update apps that were already updated elsewhere before I can get back to work. That sounds pretty good.

And then there’s the performance of the M4 Max chip on the MacBook Pro, which is a big boost from the M1 Max Mac Studio. The only real downside is that this decision required me to give up my beloved M2 MacBook Air, which doesn’t have enough horsepower to satisfy everything I need to do across video, podcasts, and more. When I do travel, I’ll be carrying almost a full pound (0.38 kg) more than I was before. But at the same time, when I do travel, I will now be traveling with the same computer I use all day at my desk(s), not one that’s sporadically used (especially in the warmer months).

Benchmark chart showing the new laptop is really fast.

The desktop/laptop lifestyle

So, after a few weeks of living with a laptop as my main computer—with nearly all of that time spent in lid-closed mode attached to a Studio Display—how has it gone?

Generally, it’s been amazing. When I update an app, it stays updated. If I download obscure command-line apps or update Homebrew or Python, I don’t have to worry about a mismatch on another device. All the right software is installed. Wherever I go, there’s just a single Mac, and it’s mine. And when I do pull the laptop out and use it somewhere else, I get to stare at that spectacularly good 14-inch MacBook Pro display.

What’s not so great is the extra pound, obviously, when traveling. I’ve had to deal with the fact that my two different docking locations have different inputs and peripherals, which can lead to some confusion with settings that don’t switch automatically.

Since setting up the laptop, I’ve noticed that how I use my computer has changed. Having my “real” computer available in the main part of my house, just a keypress away from waking up, has meant that I will sometimes pop in on an evening or over a weekend and do something quick that I previously would’ve deferred until the next workday. I’ll need to watch this—I don’t want to get sucked into work when I don’t want to be.

I’ve also, on a few occasions, pulled my laptop out of the dock and brought it into the living room. Again, I’m trying to keep up some barriers—this is why I mostly limit my in-home device use to my iPad—but these were unusual occasions where I probably would’ve been running back and forth between two rooms, and instead, I could just bring my MacBook Pro into the living room and make it all happen in one place.

A surprising side effect, at least thus far, is that I haven’t written on my iPad at the bar in my kitchen, a favorite winter change of pace. Maybe part of the reason I sometimes liked to work inside is that I was feeling cold out in the garage? I wouldn’t be surprised if I still do it from time to time, but it’s worth pondering whether the ability to take my main computer literally anywhere will reduce the amount of time I use my iPad Pro with the Magic Keyboard attached.

Even in the winter, I generally spend Mondays and Tuesdays out in the garage because that’s where I prefer to record Upgrade and MacBreak Weekly. (It’s a more photogenic space, and both podcasts offer video versions.) In the past, I’d spend the rest of the day working out there, but since switching to the MacBook Pro I’m only staying out there for the recording. I started writing this article in the back bedroom, transferred to the garage to record MacBreak Weekly, and then returned to the bedroom.

Now, many long-time laptop users will read my descriptions of my behavior and find themselves utterly boggled. “Yes,” they’ll say, “that is what happens when you use a laptop. You get to move the computer around.” And I know that, but for me, the big difference is that I’m really shuttling between two separate desktop setups. It’s something I haven’t experienced in more than a decade.

What’s on my desks

With that all said, let me detail my two new setups, since I’ve had to buy some new hardware to make it all work the way I wanted. A key to this entire approach is that I wanted to be able to plug into either desk with a single Thunderbolt cable to supply all data connectivity and power. One cable, one plug, and I’m in or out.

The big addition was the CalDigit TS4 Thunderbolt Dock in the garage. The Mac Studio had an awful lot of external ports that a MacBook Pro lacks, and of course I also wanted to reduce everything to that single cable! While the CalDigit dock is not cheap, it’s loaded with ports. Even better, I was able to set my power-saving auto-switching power strip to use the CalDigit dock as the device that determines whether the rest of the strip is switched on or off. By itself the dock doesn’t draw a lot of power, but when I attach the MacBook Pro, the dock has to supply power to the laptop—and power use surges, crossing the outlet’s threshold and sending power to the Studio Display. I’ve got a few devices plugged into the back of the Studio Display, so when my laptop isn’t present, none of those devices are powered. That’s how I want it.

Garage:

Jason's garage office

Apple Studio Display on a Right Angle Hover Series 2 arm, Keychron Q1 keyboard (Kiwi switches, MT3 Dasher keycaps), CalDigit TS4 Thunderbolt Dock, Magic Trackpad, fully deconstructed Touch ID button, Twelve South BookArc, Elgato Stream Deck, Elgato Key Light, Elgato Wave Mic Arm LP, Shure SM7B microphone, Sound Devices USBPre2 audio interface, Uplift Standing Desk, Herman Miller Aeron chair, Smart Strip switching outlet.

Back bedroom:

Jason's back room office

Apple Studio Display, Vortex Race 3 keyboard (brown switches, MT3 Camillo keycaps), Magic Trackpad, Magic Keyboard with Touch ID, Twelve South Curve Riser, Anker 555 USB-C Hub (attached to Studio Display), Elgato Stream Deck, Heil PL2T boom arm, Shure MV7 microphone, HON Ignition 2.0 office chair, Beyond the Office Door VertDesk.

That’s it! Two rooms, one appreciably warmer than the other—at least, for now. The next question is: what happens when the temperature in the garage returns to normal? I’m assuming I’ll stay out there all the time, but you never know. (Also, what happens if my daughter decides to move home? She’s welcome to, but… let’s cross that bridge if we ever get to it.)


  1. We have opted not to convert the garage and attach it to our central heating system for several mundane reasons involving spending lots of money. 

“Jaws” myths exploded by Spielberg, via Vaziri


VFX artist and movie fan (and Friend of Six Colors) Todd Vaziri, in a post from 2018 that he updated Monday with new information debunking the myth that there was an “actual shooting star” in a couple of shots in the movie “Jaws”:

I reached out to film historian Jamie Benning about this issue. He said, “let me ask Joe Alves.” Alves was the “Jaws” production designer and also worked with Spielberg on “The Sugarland Express” and “Close Encounters of the Third Kind”, and has spoken extensively about his experience on “Jaws”. Paraphrasing, this is the response he got from Alves in August 2023: yes, the shooting stars in the movie were animated. Yes, they were added in post-production….

Another source has access to Steven Spielberg. So this person asked Steven Spielberg in September 2023… Paraphrasing from Spielberg: Yep, it’s an animated shooting star, animated by Albert Whitlock.

This was a bombshell for me. No, not that Spielberg confirmed that it was animated, but that it was supervised by none other than Al Whitlock, the veteran visual effects artist who passed away in 1999 and contributed to some of the most amazing visual effects of all time. Not to mention that really terrific illusion in “The Blues Brothers” (1980) that I documented on Twitter.

Todd’s site FXRant seems to be slowly emerging from hibernation, and just as he does on social media, he’s using it to combat misinformation about how special effects are used in movies. (If you didn’t know, Todd is a longtime VFX artist for Industrial Light and Magic.) Just yesterday, he slapped down that meme about how it took five months to film one scene in “Severance”, which of course it didn’t.

But still… going to Spielberg to debunk misinformation about how he made “Jaws.” That’s next-level stuff, even for Todd.


Summing up Apple’s devil’s bargain when it comes to AI features; what TikTok says about the chaotic future of tech regulation; Apple TV+ in the era of “Severance” and “Silo.”


By John Moltz

This Week in Apple: Getting work/life balance the hard way

John Moltz and his conspiracy board. Art by Shafer Brown.

Sonos jettisons its CEO, Severance returns, and Goldman Sachs is still trying to wriggle out of the Apple Card deal.

“Apple CEO Tim Cook Will Attend Trump Inauguration”

What? How did that get in there?

Regardless, we are not talking about that.

Nope.

Nuh-uh.

I need a week off from that garbage. I have to take care of me.

Engage parachute

Big happenings at Sonos this week as CEO Patrick Spence stepped down in the wake of the company’s widely panned new app launch and the lackluster sales performance of its new headphones. New “interim” CEO Tom Conrad then axed Chief Product Officer Maxime Bouvat-Merlin for good measure.

Heads will continue to roll until apps and sales improve.

It is apparently worth asking, would Steve Jobs have approved this app? He might have had the chance if he had listened to Tony Fadell.

After a report by The Information indicating that an unnamed former Apple executive had encouraged Jobs to buy Sonos back in the day, when kids were still saying things like “back in the day”, John Gruber reached out to Tony Fadell, who confirmed it was he who made the suggestion to Jobs. The suggestion fell upon dismissive if not deaf ears, as Jobs said in typically Jobsian fashion: “No one wants what they are selling.”

Well, no one wants that app, that’s for sure.

“Apple CEO Tim Cook Will Attend Trump Inauguration”

Or that! Stop it!

Putting the “fun” in “dysto… pi… an”?

Season two of the critically acclaimed Apple TV+ show “Severance” begins today, so… happy “Severance”? That seems wrong for a show so bleak. Also, how am I supposed to start season two of this dystopian Apple TV+ show about people working in an office building when I haven’t even finished watching season two of the dystopian Apple TV+ show about people working in a silo?!

At least they’re currently filming another season of “Ted Lasso”. That’s not dystopian.

If you couldn’t wait until today for that hot, dystopian content that’s so popular with the young people these days, maybe you made it to Grand Central Station (motto: “It’s literally like Grand Central Station in here!”) where Apple staged a pop-up Lumon Industries set, complete with actual actors from the series and a visit from executive producer and director Ben Stiller.

[snort] Of course this makes no sense, as we know that it is the elevator ride down to the offices that converts outies into innies. Therefore, they could not be in their innie personas in the middle of Grand Central Station. Thank you, but I shall consider this non-canon. Good day to you.

I SAID “GOOD DAY”!

“Apple CEO Tim Cook Will Attend Trump Inauguration”

I would rather be stuck in a glass box in Grand Central Station moving numbers around than discuss, watch or even think about that.

I was very clear about this.

Breaking up is hard to do

Particularly when there are legally binding contracts in place and expensive escape clauses. Still, Goldman Sachs wants out of this relationship!

“Goldman Sachs CEO says Apple card partnership may end before 2030”

Goldman isn’t sure that it can wait until the kids go off to college anymore. And it’s kind of understandable.

The business is housed within Goldman’s platform solutions unit, which posted an $859 million annual net loss in 2024.

More like platform problems unit, amirite? Because $859 million here, $859 million there… big money.

Meanwhile, Apple is looking for its next victim.

“Exclusive: Apple in talks with Barclays, Synchrony to replace Goldman in credit card deal, sources say”

Any potential new partner is probably going to be a bit more careful reading the terms of service than Goldman Sachs apparently was.

“Apple CEO Tim Cook Will Attend Trump Inauguration”

Oh, my god, will you stop?

[John Moltz is a Six Colors contributor. You can find him on Mastodon at Mastodon.social/@moltz and he sells items with references you might get on Cotton Bureau.]


Video

January Backstage Zoom: Did an LLM summarize this?

We got together with Backstage pass members live on Zoom earlier today to discuss all sorts of stuff related to Apple Intelligence, Home tech, and more.

We’ve embedded the video below, or you can watch it on YouTube.

Thanks for being a Six Colors subscriber!


Apple begins to tweak iOS AI summaries

Chance Miller of 9to5Mac reports that in the latest iOS 18.3 betas, Apple has changed how it handles summaries, presumably in response to criticism of inaccurate headline summaries:

  • When you enable notification summaries, iOS 18.3 will make it clearer that the feature – like all Apple Intelligence features – is a beta.
  • You can now disable notification summaries for an app directly from the Lock Screen or Notification Center by swiping, tapping “Options,” then choosing the “Turn Off Summaries” option.
  • On the Lock Screen, notification summaries now use italicized text to better distinguish them from normal notifications.
  • In the Settings app, Apple now warns users that notification summaries “may contain errors.”

Additionally, notification summaries have been temporarily disabled entirely for the News & Entertainment category of apps. 

Turning off summaries for an entire category is a quick fix, though it doesn’t address the larger long-term problem of inaccuracy. Empowering users to turn things off more easily is good, and styling them differently from regular notifications is important.

In short, this seems like a good bit of damage control, but there’s much more work to be done here—and these features should all have been in iOS 18 to begin with.


Compress folders into separate archives

Sometimes a solution is simpler than you think it’s going to be. I was trying to figure out a way to write a shortcut to take a folder and have all of its sub-folders compressed into individual archives—i.e., one archive per folder. Alas, selecting them in the Finder and choosing Compress turns all your folders into one big archive.1

As easy as this feels like it should be, Shortcuts was thwarting me. I wrote a workflow that seemed like it ought to be fine, and it did nothing. I could have fallen back to a shell script, but I’ll be honest, I was feeling stubborn.

So I hit the old search engine and found this post by Lukas Polak, which from its description seemed like it should work, even if I didn’t really understand why.

Sure enough, it does. The secret is in opening the Archive Utility app, the macOS program that handles compressing and expanding files—yes, there’s actually a whole app with an interface! Drag the folders onto that app icon in the Dock and it will, by default, compress them into individual archives. You may also want to change the compression format to ZIP—the default “compressed archive” format produces a .cpgz file—though that’s more a matter of personal preference; the behavior is the same.

While I wish this was easier to do in Shortcuts, I’m gratified that there was a pretty simple way to do it with no scripting at all. Always a nice discovery.
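For what it’s worth, the scripting fallback I avoided really is only a few lines. Here’s a minimal sketch in Python (not a Shortcuts workflow, and the target path is a hypothetical example) that compresses each subfolder of a folder into its own ZIP archive:

```python
# Minimal sketch: compress each subfolder of a target folder into its
# own ZIP archive, one archive per folder. The path is a hypothetical
# example; point it at whatever folder you're working with.
from pathlib import Path
import shutil

target = Path("~/Documents/Projects").expanduser()

for sub in sorted(target.iterdir()):
    if sub.is_dir():
        # Creates Projects/<name>.zip alongside each subfolder.
        shutil.make_archive(str(sub), "zip", root_dir=sub.parent, base_dir=sub.name)
```

Still, a no-scripting answer that amounts to “drag the folders onto Archive Utility” is hard to beat.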


  1. Weirdly enough, if you hold down Option while bringing up the contextual menu, it changes the text from “Compress” to “Compress X items”…but the end result is the same: one archive for everything you select. I’m not even sure why it changes the text—perhaps it’s a bug? 

Calcification dilemmas

The danger of following Apple’s successful playbook; Dan and Jason get ready to solve some puzzles.


By Jason Snell

Counting almost-duplicates in very long lists

Feuding Families and the Upgradies

In the past couple of weeks, two different projects of mine have been released that were powered, at least in part, by a Python script that eliminated enormous amounts of labor from a process that used to take hours of drudgery.

Both The Upgradies and Feuding Families rely on compiling a list of the most common answers from hundreds of submissions typed into a free-entry box in a Google form. As you might expect, this leads to some pretty inconsistent data entry. Poll people about their favorite Apple product of the year and you’ll get Mini, Mac mini, M4 Mac mini, The Mini, The Mac Mini, The New Mac Mini, and even things like Macmini and Mca mini and Macini.

I started down the path of automating because I just thought computers would do a better job of counting identical input than humans would. And that’s true, but the more I thought about it, the more I wanted the tool to go beyond counting identical entries—I wanted it to throw all the similar entries into the count as well. Why not?
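To give a flavor of the approach—this is a minimal sketch, not the actual script, and it assumes simple fuzzy matching via Python’s standard-library difflib—the idea is to count normalized entries first, then fold each entry into the closest existing group:

```python
# Minimal sketch of grouping near-identical free-text answers.
# Not the actual script; assumes difflib's similarity ratio is enough.
from collections import Counter
import difflib

answers = ["Mac mini", "M4 Mac mini", "The Mac Mini", "Macmini",
           "Mca mini", "iPhone 16 Pro", "iphone 16 pro!"]

# Normalize trivially (case, whitespace) and count exact duplicates first.
counts = Counter(a.strip().lower() for a in answers)

groups: dict[str, int] = {}
for answer, n in counts.most_common():
    # Fold this answer into an existing group if one is at least
    # 80% similar; otherwise it starts a new group of its own.
    match = difflib.get_close_matches(answer, list(groups), n=1, cutoff=0.8)
    if match:
        groups[match[0]] += n
    else:
        groups[answer] = n

for name, n in sorted(groups.items(), key=lambda kv: -kv[1]):
    print(f"{n:3d}  {name}")
```

Because the loop processes answers in descending order of count, the most popular spelling becomes each group’s label, which is usually the one you’d want to publish.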

Continue reading “Counting almost-duplicates in very long lists”…



Our use of CarPlay, Android Auto, or other systems for navigation; creative AI problem-solving cases we’ve encountered; thoughts on Sonos’s struggles and future as a customer; and whether we use TikTok amid its potential US shutdown.


By Joe Rosensteel

Apple wrote checks Camera Control can’t cash

Visual Intelligence disclaimer.

The Camera Control button on the iPhone 16 family seemed like a good idea, but the devil’s always in the details, isn’t it? Apple made too many promises, all of them in conflict with one another because they all rely on the same tiny hardware feature to function. And as Apple ships more features, things aren’t getting better.

A half-baked half press

It was definitely confusing that Camera Control was introduced as a shutter button that could also be half-pressed—but the half-press gesture didn’t do the thing it does on every normal camera, which is to lock focus and/or exposure. Apple shipped Camera Control with a complex swipe-and-press interface to move among different functions but said that the most basic exposure/focus function would be coming later.

The new half-press feature is in direct conflict with the original overloaded half-press feature. To enable it, you need to go to Settings, then Camera -> Camera Control, where there’s a toggle for AE/AF Lock.

In hindsight, it’s absolutely the right move to have this feature disabled by default. Not only because most ordinary people wouldn’t want to use it, not just because it is in such deep conflict with the tiny half-press menu overlay for the slider functions, but because it is terribly executed.

First of all, you don’t always want to do both AE and AF lock. Sometimes you do, but not always. We’ll set that aside for now. The way that the iPhone handled AE/AF before is that you could tap on something, and it would set focus and exposure for that region you tapped. If that subject, or your camera, moved, then the temporary lock would go away. If you tapped and held, then you’d get an actual AE/AF lock, in which the subject or the camera could move, and the AE/AF would stay in place.

A way to get around the lack of independent exposure controls in the Camera app is to tap the sun icon overlay next to your single-tapped region and drag the exposure up or down to perform exposure compensation relative to the exposure setting the Camera app picked for you. This comes in handy when I take photos of neon signs. You can also get exposure compensation in one of the overlay submenus revealed by tapping the top arrow to expose the bottom row, as you do, with the plus and minus in a circle. Not a sun. (It’s as perfectly logical and consistent as the rest of the interface.)

The problem with the AE/AF lock feature triggered by Camera Control is that it activates a large region in the center of the screen. With a dedicated camera, you can set this to be a significantly smaller center area—basically a crosshair, or just a single phase-detection point in the center. Even if you tap and hold on the screen for AE/AF Lock, the region of the screen is much smaller.

If a “subject” is in frame, like a person’s face, the Camera app draws a box specifically around the bounds of their face instead of the larger region box it draws for a landscape or other wide shot. It’s still not a tiny box you’re sticking to a person’s eye, but it does not cast the wide net that the oversized region box does.

The reason the region size matters is that if your subject is layered in depth—let’s say a foreground, middle ground, and background—then you’ll capture some of another layer in what you’re trying to lock instead of just the center-most point. It’s a lack of precision. That’s for both metering for exposure and focus together. Again, for some reason, you can tap, or tap and hold, to get a finer level of control than you can with the thing that has “control” in the name.

A side by side series of two screenshots. They both show the same scene of a living room with a Christmas tree in the background, and MacBook Pro screen in the foreground, with a couch in between. The Christmas lights are warm, and the display is cool blue. The first image shows the AE/AF lock region from the Camera Control. The second image shows the AE/AF lock region, which is 2.5 times smaller.
In these two screenshots, you can see that the Camera Control is going to grab and lock on to a region that is 2.5 times larger than the region you get from tapping. The overlays are highlighted in red to read them more easily against the warm environment.

You can still get to the layered object you want to lock to by moving more broadly to capture only that subject in the large center region, but that’s more effort than tapping and more movement than you’d have to expend using a real camera with a smaller center region since you need to get what you want in that large box.

There are no deep menus to go into to refine the region size or lock only exposure or focus. This is the entirety of the feature enabled by the buried toggle. On or off. Press the button gently, but not too gently. Also move a lot, maybe.

Otherwise, you can simply give up and tap the screen, which anyone with any model of iPhone can do. What a selling point for Camera Control!

This is absolutely where third-party camera apps can fill a void, but then what was the point of doing all this not-so-useful work for the official top-dog Camera app?

Lacking in Visual Intelligence

Apple also included Visual Intelligence in iOS 18.2, and it’s a huge disappointment. The two on-screen buttons always divert you to two different third-party services. If you select Ask, the image will be sent to ChatGPT. If you select Search, it will be sent to Google. There are appropriate warnings for both services, but again, Apple’s vaunted new feature is primarily a quick image upload to a partner.

Two screenshots side by side of the prompts for 'Ask' and 'Search'. The two buttons for each are 'Continue' and 'Not Now'.
We’ve got both kinds. Country and Western.

Other options can be triggered if Visual Intelligence detects certain criteria, but it’s pretty picky about it, and it doesn’t tip you off that it can do more until you press the shutter—unlike “Ask” and “Search,” which are always shown.

In one case, I held my phone up to a yellow warning sign in Spanish, and it offered up a Translate button—but only after I hit the “shutter” button, which doesn’t save a photo but instead pauses the input so the software can examine it more thoroughly. Google’s apps and Apple’s own Translate app offer live translations without needing to hit a shutter to pause, but Visual Intelligence doesn’t have that option.

A yellow, diamond-shaped warning sign with the image of a hand being punctured. There is text below that has been translated as 'Danger'.

There is also the option to summarize the text you took a photo of with the shutter button. It’s probably the least likely thing I would want to do, but hey, it’s something the software can do, so why not?

Apple has many other machine learning models for all kinds of image recognition, but only the ones that use optical character recognition are present. I can’t use this to identify a plant, for example. I have to take a photo of a plant, go to the photo in my Camera Roll, expand the photo, thumb the whole thing upwards to reveal the info panel, and then tap on the plant identification option there. The same goes for animal and landmark identification.

Conversely, you can’t use the Visual Intelligence “Ask” and “Search” features on a photo you’ve already taken from inside the Photos app, the way you can use those other recognition features. You can certainly send those images off to ChatGPT or drop them in the Google app. What gives? Why not put the “Ask” and “Search” buttons under every photo? Why not put them in context menus?

Maybe, someday, all of those things will be true, and Visual Intelligence will act as an umbrella for all the image-based models Apple has. Why promise and ship this right now, when it’s really not terribly beneficial to anyone—including Apple?

If the point was to appear like they weren’t behind (Google Lens shipped a million years ago), then unfortunately the shipping product reveals them to be further behind than they would have seemed if this were still something being promised from a lab. There’s also a danger that this trains customers to conclude that Visual Intelligence isn’t worth using, especially since it’s so hard to get to.

Dialing back the dial

Speaking of training customers, I’ve reached the point where Camera Control has trained me to turn off the features I keep accidentally triggering: Settings -> Camera -> Camera Control -> Accessibility, then toggle off both Light-Press and Swipe. I’m not interested in accidentally triggering them, and there’s no reward for trying to use them on purpose.

Apple has not addressed any critiques of Camera Control other than the “we are totally shipping a half-press focus lock” promise from the launch. Anecdotally, most people use it as a Camera app launcher or shutter button that’s easier to reach than the volume-up button. Yay?

I’ll leave AE/AF Lock on for the time being, but the truth is that, the way it’s implemented, it’ll likely go back to the default too, and all of that will be for naught. I currently regret that so many of us asked Apple to give us this, because it was only ever going to be one more thing on top of the complicated stack of decisions the company had already made about what Camera Control is. They can’t take these features away, but maybe they can make profiles, or group them into modes that make the button do less in certain circumstances, for people who don’t want to mess around with it. Perhaps it’s time to exercise some control.

[Joe Rosensteel is a VFX artist and writer based in Los Angeles.]


By Jason Snell for Macworld

Apple’s historic successes have bred its recent failures

Apple Intelligence

Truly, Apple’s rise from near-bankruptcy to being one of the most valuable companies in the world is a story for the business history books. But as legendary football coach John Madden frequently said, “winning is the best deodorant.” When you’re riding high—and have been riding high for a couple of decades—it’s very hard to notice the parts of your business that have begun to emit a bit of an odor.

To take another page from football, Apple has a winning playbook, and it keeps using it. But if you keep calling the same plays without adapting and reacting, a winning playbook can become something much worse. In the past year, two major Apple product launches show just how calcified the company’s strategy has become—and how much it needs to change.

Continue reading on Macworld ↦


Bonnets off to the BBC, the trouble with headlines, Socrates on mountain skis, some slight existential dread about Mac software, irrational love of old computers, Apple’s smart home strategy for 2025, and the long wait for some AI features.


Watch Duty, the crucial wildfire tracking app

The Verge’s Abigail Bassett profiles Watch Duty, the remarkable nonprofit app that’s become a must-download utility during the Los Angeles fires:

Watch Duty is unique in the tech world in that it doesn’t care about user engagement, time spent, or ad sales. The 501(c)(3) nonprofit behind it only cares about the accuracy of the information it provides and the speed with which the service can deliver that information. The app itself has taken off, rocketing to the top of Apple’s and Google’s app stores. Over 1 million people have downloaded it over the last few days alone.

The reasons we’re appreciating Watch Duty right now are tragic, but how great is it that this app exists?


