LEDE ON
Had you listened to me late last year when I first warned about the coming storage shortage and simply bought stock in the key storage companies, you would have doubled your money twice with SanDisk, doubled it once each with Micron and Western Digital, and nearly doubled it with Seagate. That's despite the war with Iran having everyone now betting against semiconductor and tech companies on the theory that a helium shortage will keep the fabs from running. Here's my update: storage constraints are likely to continue all this year and next, and that's assuming the semiconductor plants can still get gassed up.
That said, don't invest in the storage companies at this point. While they still have upside growth potential, they're also riskier than before. The new bet, assuming you're a gambler, is on the short side: against companies that rely upon buying storage (the exception being a couple of players such as Apple, which has been buying up long-term commitments). You can't, for example, tell someone to buy a US$1000 camera and put a US$1000 card in it, or to get a new computer that suddenly costs twice as much as it used to because of storage component costs (or that stays the same price but with reduced RAM and storage capacities).
Bottom line: SSD, card, RAM, and even hard drive supplies are short and getting shorter. Don't delay purchasing these if you can find stock at a good price. However, the shortage will ultimately create another problem: counterfeiting. The guy on the corner who says "psst, dude, you need some memory?" as you pass is not your friend. Let's be careful out there...
——————————
News
Nikon in space
The crew of Artemis II was given extra photographic instruction by National Geographic and is using Nikon cameras on its mission to circle the moon. Unfortunately, most of the NASA images you're seeing on other sites have gone through processing (Lightroom Classic tags appear in the EXIF, and there are a few small changes when compared with the originals, though NASA does not appear to have used Dehaze or noise reduction; see pixels view, below), so be careful about assessing them from what a Web site posts. Apparently NatGeo's "training" didn't have the astronauts populating IPTC data, either, as I'm seeing none in the originals. Maybe if NASA had actually asked a Nikon expert for help... ;~)
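If you download the originals yourself, you can do a quick sanity check for processing without a full EXIF parser. This is a crude sketch (the function name and editor list are my own, and a real workflow would use a proper tool such as exiftool): because EXIF stores the Software tag as plain ASCII, simply scanning the raw bytes for known editor strings is often enough to flag a processed file.

```python
# Crude check for signs of post-processing in a JPEG.
# Hypothetical helper: real EXIF parsing needs a proper tool
# (e.g. exiftool); here we just scan the raw bytes for known
# editor names, which works because the EXIF Software tag is
# stored as plain ASCII. More specific strings come first so
# they match before their shorter prefixes.
KNOWN_EDITORS = (
    b"Adobe Photoshop Lightroom",
    b"Adobe Photoshop",
    b"GIMP",
)

def editing_software(jpeg_bytes):
    """Return the first known editor string found in the file, or None."""
    for marker in KNOWN_EDITORS:
        if marker in jpeg_bytes:
            return marker.decode("ascii")
    return None

# Synthetic example; a real file would be read with open(path, "rb").
sample = b"\xff\xd8\xff\xe1...Adobe Photoshop Lightroom Classic 13.0..."
print(editing_software(sample))  # -> Adobe Photoshop Lightroom
```

A straight-out-of-camera file would report the camera firmware (e.g. a Nikon model name) in that tag instead, so a hit on an editor string is a reasonable tell that the file was reprocessed.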
Almost all of the first images I've seen from the mission were taken with a Nikon D5 (DSLR) and Nikkor 14-24mm f/2.8G (F-mount). The aurora image shown above, for example, is D5, 22mm, f/4, 1/4 second, ISO 51200, manual exposure, matrix metering, +1EV exposure compensation.
I took the original and ran some simple processing on it to emphasize the "little blue marble" idea, and came up with this:
The crew has other Nikon gear with it, including a Z9. If you want to see the images from the mission, you can do so at images.nasa.gov. When you go there, you'll see a Show EXIF Data button underneath each individually viewed image that allows you to see a fair amount of the camera data. As I write this, all of the images have been either from a Nikon D5 or a GoPro Hero 4 (too bad Nikon dropped the KeyMission, right?).
——————————
Tip
Staying on top of security
A higher percentage of photographers use Macs than the general public does. The myth that the Mac, because of its lower market share, was less targeted by malicious hackers and thus more secure than a PC is just that, a myth. The entire Apple ecosystem is the reason: macOS (Mac), iOS (iPhone), iPadOS (iPad), and the rest comprise a huge user base while sharing much of the same code base, so targeting Macs is pretty much the norm now. When I update this site, I'll be updating some of my Mac security advice (current advice is here).
Basically, the change to my advice is this: you need to be on the most recent version of macOS now. That, coupled with a firewall/antivirus package and something like SilentKnight to verify that all the security updates are getting installed and run, is your best protection if your computer connects to the Internet. If you don't connect to the Internet, how are you reading this? ;~)
Recent events, most notably the DarkSword exploit, have made what I wrote in the last paragraph the safest way to run a Mac. That's because the current OSes get fixes first (it took over a week, plus Apple going back on a previous policy, before some older versions got fixes). Put simply, DarkSword lives off vulnerabilities in a number of key areas from the kernel up through the app layers, and just visiting a malicious Web site can lead to a full device compromise that you'll never see and that requires no action on your part. We now know that DarkSword has been in the wild since November 2025, which is one reason why you need a security package that can scan your system for it (Apple's XProtect, in theory, also does that, but I've seen XProtect fail to scan for days at a time).
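As a quick illustration of the kind of check a tool like SilentKnight automates: XProtect's installed definition version is recorded in its bundle's Info.plist, which Python's standard plistlib can read. This is a hedged sketch; the function name is mine, and the on-disk path in the comment is an assumption based on recent macOS layouts.

```python
import plistlib

# Sketch: read the version out of a bundle's Info.plist, the way you
# might check which XProtect definitions are installed. On a real Mac
# the file lives at (path is an assumption based on current macOS):
#   /Library/Apple/System/Library/CoreServices/XProtect.bundle/Contents/Info.plist

def bundle_version(plist_bytes):
    """Return the CFBundleShortVersionString from raw Info.plist bytes."""
    info = plistlib.loads(plist_bytes)
    return info.get("CFBundleShortVersionString")

# Synthetic Info.plist for illustration; a real one would be read
# with open(path, "rb").
sample = plistlib.dumps({"CFBundleShortVersionString": "5287"})
print(bundle_version(sample))  # -> 5287
```

If the version you read hasn't changed in days while Apple has shipped new definitions, that's the "XProtect fail to scan/update" situation I mentioned above.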
The problem, of course, is that Apple is aggressive about deprecating older things as it updates its operating systems. This means that some of your older software and devices won't work with the new macOS when it launches. The way around this is to have a machine with enough RAM and internal storage that you can virtualize an older version of macOS within the new one. For instance, I'm running macOS Monterey inside of macOS Tahoe (I even put the Monterey dock on the left and the Tahoe dock on the right so I know which one I'm using). This isn't a trivial process, but it's not particularly complex, either: it's a bunch of simple steps using a virtualization host (such as Parallels). I make the point about RAM and storage because you're probably going to consume double the RAM and storage to do this right, perhaps more depending upon your older apps and data.
I wish Apple did this right (they already have a virtualizer that developers use to test across multiple versions of macOS). When a completely new named version of macOS installs, it should offer to preserve your older setup intact in a new virtual machine. Until then, you and I have to do this manually. It's something that can and should be automated, not something that wastes our time learning the steps and then carefully following them.
——————————
Commentary
Behind the scenes can be brutal
One thing you might have noticed about my sites recently is that they load faster. My first redesign, filmbodies.com, essentially snaps onto your display (assuming you've got a reasonably fast Internet connection). Even though I'm now using larger images, the average load time is down to 1.2 seconds for a first-time (no cache) hit. I've seen it go as low as 906ms, and that was with a Google font API lookup I still need to resolve and remove; without it, the load would have been under a second. The site I use for this analysis grades that as a B (by contrast, the photography site I show an example of below gets a D). I'd like to get that to an A, but I still have some work to do on filmbodies. (Meanwhile, bythom is currently graded an A despite loading twice as slowly, so it isn't all about speed.)
The more background services you use (Google fonts, analytics, ad tracking, user tracking, database calls, etc.), the more a site is likely to sometimes load slowly or seem to sputter as it loads, as the demands on each of those services running behind the scenes can produce slower responses at times. Here's an example of how much is going on in the background for a site you probably know and how one weak link in the chain can make it sputter into ridiculous load times:
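To see why one weak link dominates, here's a toy model (the numbers and service names are made up, not measurements): render-blocking resources fetched in parallel finish when the slowest one does, so a single sputtering ad or tracking server sets the floor for the whole page.

```python
# Toy model of page load: render-blocking resources fetched in
# parallel finish when the slowest one does, so a single weak link
# sets the floor. Times below are illustrative, not measurements.

def parallel_load(times_ms):
    """Time until every parallel fetch completes."""
    return max(times_ms)

def serial_load(times_ms):
    """Time if the same fetches happened one after another."""
    return sum(times_ms)

deps = {
    "html": 120,
    "google_fonts": 90,
    "analytics": 80,
    "ad_network": 2500,   # the sputtering weak link
}

print(parallel_load(deps.values()))  # -> 2500
print(serial_load(deps.values()))    # -> 2790
```

Drop the ad network from that list and the parallel load collapses to 120ms, which is essentially what removing third-party services did for my redesigned sites.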
One of the things I've noticed since the US attacked Iran is that Web servers are clearly being attacked more frequently, which raises the chance that a service a site is using responds slowly. While I positioned the removal of ads and tracking mechanisms from my sites as a change from making you the product to making my sites' information the product, one thing I was clearly thinking about as I started the redesign process was bringing as much as possible back to the server I control, whose performance I can monitor and maintain.
——————————
Commentary
Overextending
Yes, I know that the dearth of new cameras has all the photography Web sites in a tizzy, but a headline of "Two Legends Return" to describe what executives said at CP+ about considering development of an LX100III or an OM Pen F was just pure clickbait. I can add six characters to their headline and keep it clickbaity but more accurate: "Will Two Legends Return?" And the answer to that question is that we still don't know, but at least now the two companies in question, OM Digital Solutions and Panasonic, have expressed that they're clearly considering it.
While we're at it, I'll tell you the likely reason that the kimono is being opened slightly about future developments (the typical Japanese executive response is something along the lines of "we don't talk about potential future products"). It's resource planning, basically. With the supply chain cutting off access to so many parts and delaying introductions of cameras already in progress (and that will worsen over the coming year), having a better idea of what resonates most with customers is going to be absolutely necessary to stay in business.
For instance, OMDS's R&D has enough resources for one major release in the foreseeable future, so should it be an OM-10 or an OM Pen? It really should be both, as they fill different needs in an overall lineup, but I don't think OMDS has the capital and resources to do both near simultaneously, let alone the ability to market and sell two lower-level models at the same time. So by saying they're considering a Pen replacement, they can better gauge fan response to it. If I'm reading between the lines correctly, they either already have an OM-10 ready and are trying to figure out the model after that, or they doubt whether an OM-10 is the right "next camera" for them.
The way Panasonic is responding to the "will there be an LX100III" question is more amusing than functional, in my assessment. It's as if they're waking up to the "sudden" popularity of compact cameras. I put "sudden" in quotes because it was clear to me that the camera companies were cutting off compact production arbitrarily starting in 2018 (and later to deal with parts shortages and supply chain problems during the pandemic), not because people didn't want to buy them. I believe the Japanese camera industry completely missed the mark by aggressively eradicating compacts from their lineups. Okay, they saw that as a way of raising per-unit value, which is a form of optimizing component acquisition to produce better gross product margin, and the pandemic simply exacerbated that. But customer demand for compacts remained at a much higher level than the Japanese were delivering, and only recently have we seen them acknowledge that.
I have to knock Nikon for this, too. The camera they should have introduced by now is the Coolpix Z: essentially an update of the Coolpix A to compete with the likes of the Fujifilm X100VI and the Ricoh GR IV. This would fit well with their stated (but not always followed) attempt to cater to prosumer and pro users. To this day, the A is a usable, competent camera that produces excellent imagery. Imagine it updated to Z System levels, with Flexible Picture Controls at the press of a button. However, Nikon basically didn't know how to market the Coolpix A when it was introduced (alongside the forgettable P330), particularly since they were also trying to push Nikon 1 models at the time. The A was just one of 12 Nikon compacts (and 3 Nikon 1 models) introduced that year, and Nikon marketing had trouble keeping up with them all. Meanwhile, the D610 that year was an emergency response to the shutter splatter problem, and the Df got all the rest of the marketing attention. Nikon thinks the Coolpix A didn't succeed because no one wanted it. Well, they're right that no one wanted it, but that's because they never told anyone why they'd want it. Shimatta!
One problem with everyone giving up on compacts for so long is that we're once again on the cusp of smartphones making another advance up the lower end of photographic capability. That said, all three potential cameras I just mentioned (Coolpix Z, OM Pen, and LX100III) would likely be high enough in capability and status to be hits. But have the legends returned? Masaka.
——————————
Wrapping Up
And in other news
▶︎ Atomos Shinobi controls ZR. Firmware update 11.07.00 to AtomOS now allows a Shinobi II to control the ZR with current firmware, including doing touch focus on the remote monitor.