“Why haven’t the camera makers incorporated a cellular capability in the camera bodies?”
Simple answer: it’s a cost. A big cost. You’d have to buy both the communications chips and antennas from someone like Qualcomm, but you’d be buying in far lower volume than the smartphone makers. Then you have to certify the result with dozens of regulatory agencies worldwide, and further certify that it’s compliant with hundreds of cellular carriers. Let’s not forget that you’ve got a lot of software integration to do to make it feel and work right. The time and energy to do all that becomes yet another cost.
That said, I think it’s time that Canon/Nikon/Sony decide to do that on their flagship cameras. They’re trying to skirt around the issue with apps (e.g. Nikon’s NX MobileAir app), but that’s a kludge, and cumbersome to use in reality (a USB cable between camera and phone, too). For a Z9 II, for instance, it would make a lot of sense to incorporate cellular communications in the top-end camera. The photojournalists and sports photographers using that level of camera would certainly benefit from having cellular as one of the output options, even if they have to have their organizations code something at the other end, and even though they often find themselves in places that take huge hits on cellular performance (e.g. packed stadiums). As it is, we’re all using bigger, clumsier kludges just to get our images out to news services as fast as the smartphone user standing next to us. Moreover, having cellular and H.265 built in opens up the ability to live stream from anywhere. We do that today by pushing HDMI to our phones and having the phone do the dirty work, but again: an awkward cable is involved, and sometimes an extra device as well.
It doesn’t help that the Internet sharing world is a mess. Is it worth trying to code for Twitter anymore? Musk has once again proven that he shouldn’t be involved in software. Not that Meta (Facebook, Instagram, et al.) is all that much better. The current Internet social networks all seem to be trying to figure out how to keep others away from their customers, and they’re closing off API access in the process. There are always email and FTP servers, though ;~). And, despite the clear opportunity, I’m surprised no one has taken on building the right software to reliably use Wi-Fi, which the cameras already have. (Yes, I know there are some simple projects here, such as AirNEF, but I’m talking about something more than just a bucket to move an image from A to B.)
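To illustrate what I mean by “just a bucket,” here’s a minimal sketch of that A-to-B move, assuming the image is already on local storage (say, pulled off the camera over Wi-Fi or from the card); the server address and credentials are placeholders, not anything real. The point is that this part is trivial; the software I want is everything that would happen around it.

```python
# Minimal "bucket" sketch: push one image file to an FTP server.
# Assumptions: the image is already on local storage, and ftp.example.com
# plus the credentials are placeholders for whatever your news desk or
# personal server actually uses.
from ftplib import FTP_TLS
from pathlib import Path

def upload_image(path: str, host: str, user: str, password: str) -> None:
    """Upload a single file to the server over FTPS."""
    file = Path(path)
    with FTP_TLS(host) as ftp:
        ftp.login(user=user, passwd=password)
        ftp.prot_p()  # encrypt the data channel
        with file.open("rb") as fh:
            ftp.storbinary(f"STOR {file.name}", fh)

if __name__ == "__main__":
    upload_image("DSC_0001.NEF", "ftp.example.com", "photog", "secret")
```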
Bottom line is this: cameras still haven’t started truly using 21st century communications. The camera companies keep forgetting that they’re not just building a box with a hole in its side; they should be creating a whole ecosystem, with emphasis on the “system” part of that word. Moreover, it needs to be a system that fits well into the other systems people already know and use.
“You wrote about upgrading from smartphone to camera. What about the opposite, downgrading from camera to smartphone?”
You beat me to the punch. Five years ago almost no one said they’d give up their dedicated camera for a smartphone. But one thing I noticed in those five years is that more and more people were effectively doing that, particularly in what we call the mid-range zoom focal lengths (e.g. 24-70mm).
Consider this reader comment: "For the past 10 years, my Nikon gear has been in the closet. I've not been taking photos. But, over this past year, my cell phone camera has found its way into my hands -- more and more. Given the technical limitations, using it has challenged me to concentrate on COMPOSITION once again. As my wife and I go for our walks along the forest trails, my cell phone camera (a Pixel 6 Pro) pops out of my pocket as my eye catches a possible composition. And my photography today is satisfying me more than ever, with photos that I can enjoy printing and sharing."
Since the pandemic eased, I’ve noted more and more people on safari carrying one dedicated camera and long lens, but using a smartphone for everything else, including video. In the last year, I’ve gotten a growing number of “my smartphone has me using my camera less” messages. Recently, I’ve gotten a few “I give up on cameras and am downgrading” messages.
Downgrading is absolutely a trend, and it should be a disturbing one for the camera companies. Only Canon seems to want to keep building the convenience consumer camera anymore; most of the camera companies are trying to move away from the <US$1000 market and build only specialty or high-end cameras, though they don’t seem to be all that good at doing anything but the highest-end ones.
“I look at the Apple Vision Pro and wonder how people will actually use it. Apple says it is their first true 3D camera as well as media viewer. What does that mean? Is it like a Viewmaster? Has the Matrix finally arrived? I imagine this headset on vacations, taking panoramic views in 3D. I can see it having a game changing effect in the same way the iPhone affected the camera industry. But I have no idea what this effect will be! Is it too expensive for people to use? It seems people will buy an Apple product no matter what it costs. What will photo editing on it be like? Questions, questions, questions.”
Yes, a lot of questions. Some of them have early answers, at least from those who have actually had a chance to try the headset. I’ll give you my take on a few of your questions.
First, that 3D capture and playback loop that the Apple Vision Pro enables is a bit like Live Photos (and Nikon’s old Motion Snapshot), but very much enhanced; 3D done right is more “involving” to view than simple video clips. What people aren’t quite understanding yet is that “photography” has been moving from a stationary 2D thing mounted on a wall towards a 3D video experience that’s better than pretty much any 3D theater/TV experience you’ve ever encountered, and the Vision Pro probably gets us most of the way there. The unknown here is just how Apple is going to manage the urge to create huge data sets this way. After all, most Apple devices seem to ship with 8GB of RAM and 256GB of storage these days. That’s going to be limiting for people wanting to do that “truly capture my vacation in 3D” thing. You’re also not likely to be walking around on your vacation with the Vision Pro on all the time, so it will take a unique experience for you to want to put it on and drain its battery. Maybe even a bodyguard to make sure you don’t fall over ;~).
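To put rough numbers on that storage concern (these are my own back-of-envelope assumptions for illustration, not anything Apple has published), a quick calculation shows how fast continuous 3D capture would fill a 256GB device:

```python
# Back-of-envelope sketch of the "capture my vacation in 3D" storage problem.
# The bitrate is an assumption for illustration only; actual spatial video
# bitrates depend on resolution, frame rate, and codec choices Apple makes.
ASSUMED_BITRATE_MBPS = 60    # assumed two-eye HEVC stream, in megabits/second
USABLE_STORAGE_GB = 200      # assumed free space on a 256GB device after OS/apps

gb_per_hour = ASSUMED_BITRATE_MBPS * 3600 / 8 / 1000   # Mbit/s -> GB per hour
hours_until_full = USABLE_STORAGE_GB / gb_per_hour

print(f"~{gb_per_hour:.0f} GB per hour of capture")      # ~27 GB/hour
print(f"~{hours_until_full:.1f} hours before it's full")  # ~7.4 hours
```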
Expense should be ignored for the moment (at least in terms of whether the Vision Pro will change things or not). Apple’s long history says that pretty much every new device they created after the Apple II was accused of being overpriced at introduction (and sometimes beyond). Doing complex technology things “first” generally means you aren’t cost efficient, so you either take a loss on the first units (the old TI management system), or you try to at least break even while you convince the world it needs your new gizmo and work to drive costs out so the price comes down. I do think Apple made a bit of a mistake here: US$2999 is psychologically different than US$3499, and choosing the higher number increases the marketing friction of getting the Vision Pro adopted.
In terms of game-changing impact, there’s one area that I have only seen given minor lip service so far: privacy. In essence, the Apple Vision Pro can be your monitor(s). Ever sat next to someone on a plane doing work? I can say that in the million+ miles I’ve flown, I’ve seen far too much “secret information” that shouldn’t have been disclosed (I actually don’t look to see the information; I’m UX curious to see how people interact with technology, i.e. what controls they use, how they type, what apps they’ve chosen, how they copy between apps, etc.).
The other area that isn’t getting talked about is erotica. Every new technology has been locked onto very early by that industry, including mundane ones such as payment methods over the Internet. Coupled with the privacy thing and the 3D thing, the Apple Vision Pro has OnlyFans written all over it. At both ends.
But these aren’t photographic things in the sense those of you reading this site would be interested in. The closest thing to what we’re doing with cameras would be that 3D vlogging capability, and I know 90%+ of you are going to respond with “but I don’t need video in my camera.” (Hey Apple, I’ve got two different African safari experiences coming up. Need a beta tester to prove them wrong? ;~)
It seems to me that the only thing intersecting closely with photography right now that some might consider using a Vision Pro for is some form of post-processing (e.g. Photoshop VP). A properly designed software product might allow you to use your hands and eyes to do selection/masking/painting in ways that feel more intuitive. Call it a natural step beyond the Wacom tablet. As Adam Engst at TidBITS put it: “the Vision Pro is just another way to do what you already do.” But not many of you would pay Apple’s (and Adobe’s) price for that unless it had a very clear advantage you couldn’t get otherwise. (I take that back. I can think of one clearly photographic thing that a Vision Pro might be tuned to help with. But I’m going to keep that to myself for the moment.)
Thing is, Apple isn’t going to have to sell millions of these before we start seeing what it can really do and what it’s really used for. The Apple developer ecosystem is going to sort that out very fast.