I’m torn on the iPhone 16’s Camera Control – it’s handy but unfinished

If you’ve read my previous thoughts on iPhones here at TechRadar and its sibling site Tom’s Guide, you’ll know I have fairly firm opinions on Apple’s smartphones.

Since moving from Android to iPhone at the end of 2021, I’ve not gone back to the platform Google built, despite trying some of the best Android phones. The ease of iOS has won me over: I love the titanium construction, I’ve found Ceramic Shield glass to be a minor game changer, I enjoy the Action button, and the iPhone’s cameras almost never let me down.

But for once, I’m on the fence.

What’s got me pondering is the Camera Control ‘button.’ In some ways, it’s a cool new feature that uses haptics well. In other ways, it’s superfluous and not fully featured.

I’ve been trying out the iPhone 16 Pro Max for a couple of weeks now, and when it comes to capturing a photo, I try to use Camera Control as much as possible. As a 37-year-old millennial, I still like snapping photos on my phone in landscape orientation, so having a physical button where my finger naturally sits helps me capture a shot without messing up the framing by tapping on the screen or reaching for the Action button – I have that mapped to trigger the torch anyway, which is surprisingly helpful.

I also like flicking through zoom ranges with a swipe on Camera Control rather than tapping on small icons. The exposure control is kind of cool too, though switching between the functions Camera Control can adjust doesn’t quite feel intuitive to me yet, and my taps often cause me to lose the precise framing of a scene.

So yeah, Camera Control is interesting. But…

Did anyone really ask for it? It feels like a feature that exists so Apple’s mobile execs had something new to talk about at the September Apple event. It just about qualifies as a ‘nice to have’, but it’s hardly a phone photography game changer.

Not my tempo

Maybe I’ll warm to it over time. But the biggest issue is the lack of AI tools for Camera Control at launch. Apple actively touts AI features for Camera Control that can smartly identify whatever the cameras are pointed at and serve up all manner of information. That hasn’t happened yet; the rollout is coming post-launch, when Apple Intelligence fully arrives. There’s a beta option, but I’m not willing to try that on my main phone.

I still don’t understand that decision. Sure, other phone makers have touted AI features that arrive after their phones are released and may be limited to certain regions to begin with, but at least those phones launch with some of the promised AI suite. The iPhone 16 range launched without any Apple Intelligence features at all.

This isn’t what I expected from Apple, a company that famously doesn’t adopt new tech until it’s refined and ready for prime time, so launching smartphones without their headline next-generation smarts is baffling to me. It’s also the primary reason I feel torn about Camera Control; if it had shipped with Google Lens-like abilities baked into a hardware control, I could see myself being a lot more positive about it.

Of course, Apple’s use of such a camera button will undoubtedly cause other phone makers to follow suit. I only hope they don’t skimp on features when their phones launch.

As for Camera Control in the here and now, I’ll keep an open mind and keep using it; I’ll just cross my fingers that it’ll become seriously handy once it gets its prescribed dose of AI smarts.
