Last Update: 6/7/2010

All About VR


Nikon's VR system explained

Original: 4/28/2010

After talking with several design engineers I've made a few modifications to explanations in this article. However, despite changing some of the technical description, there are no changes to my recommendations. Even after being online for some time, this article continues to generate responses and controversy. Because there are conflicting posts on other sites, people tend to disbelieve what I've written. All I say is: do so at your own risk. This article is based upon years of experience with Nikon's VR system, very close analysis of testing results, talks with many other professionals, and even information from Nikon insiders. At this point, the only thing about Rule #2 that isn't fully understood is the "why." For that we'd need more access to the design engineers.

The first and most important rule of VR is this: never turn VR on unless it's actually needed.

Yes, this rule flies in the face of what almost everyone in the world seems to do and what Nikon implies with their advertising and marketing. The simple fact is that VR is a solution to a problem, and if you don't have that problem, using VR can become a problem of its own.

To understand that, you have to understand how VR works. In the Nikon system, VR is essentially an element group in the lens that is moved to compensate for any detected camera motion. Because this element group usually sits deep in the middle of the lens, near the aperture opening but not exactly at it, you have to think about what is happening to the optical path when VR is active. Are there times when shifting it imparts a change to image quality beyond pure stabilization? I believe there are, though the impact is visually subtle. Some of the mid-range distance bokeh of certain VR lenses appears to be impacted by VR being on. Put another way, the background in the scene moves through the optical path slightly differently than the focus point does. This results in what I call "busy bokeh," or bokeh that doesn't have the simple shape and regularity we expect out of the highest quality glass.

Most people using VR don't question the mechanics of the system. They simply believe it's some special form of magic. It's not. Physics is involved, not magic. And one of the physics issues is the sampling frequency. The sampling frequency of the motion detection mechanism determines what kind of movement, and how much of it, can be removed. Care to guess what the sampling frequency might be? 1000Hz, according to Nikon. That sounds pretty good, doesn't it? Nope. 1000Hz means one sample every 1/1000 of a second. Nyquist tells us that we can only resolve data accurately below half the sampling frequency, so the system can accurately take out only motions occurring at up to 500Hz (in other words, nothing faster than about 1/500 second). While this sampling frequency describes the camera motion, it is not completely uncorrelated with shutter speed. For example, at shutter speeds above 1/250 the shutter curtains form a slit that travels across the sensor, exposing only a portion of the image at any one moment. Another aspect of the VR system is that it "recenters" the moving element(s) just prior to the shutter opening. Simply put, there's a lot that has to be right at very short shutter speeds in order for there not to be a small visual impact, especially with long lenses.
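If you want to see the Nyquist limit in action, here's a minimal sketch in Python (my own illustration; nothing here comes from Nikon, and the numbers are only for demonstration): a shake component faster than half the sampling rate produces exactly the same samples as a slower one, so any correction built from those samples can end up chasing motion that isn't there.

    import math

    sample_rate = 1000.0          # Hz, Nikon's stated VR sampling frequency
    nyquist = sample_rate / 2.0   # 500Hz: the fastest motion the samples can resolve

    def sampled(freq_hz, n_samples=8):
        # values of a unit-amplitude shake component as seen by the sampler
        return [round(math.sin(2 * math.pi * freq_hz * i / sample_rate), 3)
                for i in range(n_samples)]

    # A 700Hz shake (above Nyquist) yields the same samples as a 300Hz shake,
    # just inverted -- the classic alias. A correction computed from these
    # samples chases a motion that isn't the one actually happening.
    print(sampled(700))
    print(sampled(300))

The exact numbers don't matter; the point is that above 500Hz the samples no longer tell the truth about the motion.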

But that's not all: when you have VR turned on, your composition isn't going to be exactly what you framed. Yes, the viewfinder shows the VR impact, but Nikon's VR system re-centers the VR elements just prior to the shutter opening. This means that you can get slightly different framing than you saw.

Rule #2: VR should normally be off if your shutter speed is over 1/500.

Indeed, if you go down to the sidelines of a football game and check how all those photographers have their lenses set, you can tell the ones that are really pros: VR is usually off (unless they're on a portion of the stadium that is vibrating from fan action). Those pros have all encountered the same thing you will some day: if your shutter speed is faster than the rate at which corrections can be made, sometimes the system is applying a correction that's out of sync with the exposure. The results look a bit like the lens being run with the wrong AF Fine Tune: slightly off.

The interesting thing is that pros demanded VR (IS in the case of Canon) in the long lenses, yet it turns out that they very rarely use it! I'd say that less than 10% of the shooting I do with my 400mm f/2.8 has VR turned on (and by the way, I hate the rotating VR switch on some of these lenses--it's so easy not to notice what position it's in). A word of advice: some of those previous generation non-VR exotics are relative bargains now. Consider it the VR bubble. Some day people will stop paying such silly premiums for VR over non-VR. At least they should. I know of several photographers who blew US$3000+ making the switch from non-VR to VR versions of lenses. That's too much of a premium, I think.

Anecdotal evidence continues to pile up about VR and high shutter speeds. In the hundreds of cases I've examined now, the results are the same: the lens seems to have more acuity with VR off above 1/500. That's my own experience, as well. A small handful of people have presented me with evidence of the opposite (VR improves their results above 1/500). In most of those cases I've been able to determine that it's not VR itself that's helping remove camera motion; rather, their handholding or tripod technique is such that they're not getting consistent autofocus without VR, but they are with it. My contention is that they'd see even more improvement by dealing with the handling and focus consistency issue and turning VR back off above 1/500.

However, as with virtually everything in photography, there's a caveat to the above. For instance, if you're sitting in a helicopter shooting at 1/1000, should you use VR? One of the things that Nikon just doesn't explain well enough is the concept of "moving camera" versus "camera on moving platform." If the source of motion is your own handholding, then what I wrote above about turning VR off above 1/500 is absolutely true. Ditto for semi-steady situations, such as shooting off a monopod. However, if there's a platform underneath you causing vibrations (car, boat, train, plane, helicopter, etc.), things are a bit different. This is what Active VR versus Normal VR is all about, by the way. Active VR should be used when you're on one of those moving platforms. Normal VR should be used when you're on solid ground and it's just you that's shaking. Basically, if you're vibrating due to an outside source, Active VR should be On. If you're the only source of camera movement, use Normal.

Rule #3: If something is moving you, use Active. If it's just you moving the camera, use Normal.

The difference between Active and Normal has to do with the types of movements the system expects and will attempt to correct. Platform vibrations tend to be frequent, constant, and random in direction. Handholding motion tends to be slower and to move in a predictable path (e.g. when you press the shutter release hard, the right side of the camera moves downward--it's Newton's Law, not mine). Knowing which type of motion the VR needs to deal with lets the system optimize its response.
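Here's a hypothetical way to picture that distinction in code (a toy illustration only; Nikon hasn't published how the firmware actually classifies motion, and the classify function and its 0.5 threshold are made up for the example): slow handholding drift rarely changes direction between samples, while platform vibration flips direction constantly.

    def classify(angular_rates):
        # angular_rates: recent motion samples on one axis (sign = direction)
        flips = sum(1 for a, b in zip(angular_rates, angular_rates[1:])
                    if (a < 0) != (b < 0))
        flip_ratio = flips / max(len(angular_rates) - 1, 1)
        # vibration reverses direction sample to sample; drift doesn't
        return "Active-like vibration" if flip_ratio > 0.5 else "Normal-like drift"

    print(classify([0.2, 0.3, 0.4, 0.5, 0.6, 0.7]))     # slow, one-way handholding drift
    print(classify([0.4, -0.5, 0.6, -0.4, 0.5, -0.6]))  # fast, back-and-forth platform buzz

However the real system does it, the design point is the same: knowing the character of the motion in advance lets the correction be tuned to it.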

So, getting back to our 1/1000 example while shooting on a helicopter, we have a conflict. The motion that you impart by your handholding may not get corrected right by the VR system because your shutter speed is faster than the frequency with which corrections are done. But the platform you're sitting on is imparting small, frequent, and random motions that might actually be corrected (but probably not fully) by having VR on. The question here is whether the improvements due to removing some of the platform movement are better than the possible degradation due to the shutter closing faster than the VR is working. There's no clear answer to that, as every situation is going to be a little different, but my tendency is to experiment with Active VR being On versus VR being totally off when shooting from platforms at high shutter speeds. I closely examine my initial results, and make my final decision based upon that. Of course, that in and of itself can be a problem for some, as examining a small screen in a moving vehicle isn't exactly easy and precise. Still, I sometimes see an improvement with VR as opposed to without it when I'm shooting at high shutter speeds from a vehicle. At the same time, that's not as much improvement as you'd see using a dedicated gyroscope instead of VR. If you regularly shoot out of helicopters, a gyro is a better investment than a more expensive VR lens.

Aside: There are a number of photographers that say that using VR above 1/250 (or the flash sync speed, if slower) should be avoided. Some explain that shutter speeds above that are done by moving an opening across the image rather than having the full image exposed simultaneously (this is a simplification, but it's good enough for this discussion). Thus, VR corrections done on shutter speeds above 1/250 are correcting only a portion of the image at a time. In practice, I believe I can sometimes see very small changes at 1/500 shutter speeds versus 1/250 when VR is On. Not enough change, however, for me to alter my 1/500 limit. Above 1/500 I can much more clearly see visual changes, and changes I don't like in my images.

At the other end of the movement spectrum, we have subject motion. If the subject is moving, using VR with longer shutter speeds is problematic. I've seen people use 1/15 with VR on for moving subjects. Well, even a slow-walking human has enough movement in 1/15 to cause edge blur.

Rule #4: If your subject is moving, you still need a shutter speed that will stop that movement.

This is a tough thing to learn, and it's usually learned the hard way. Because camera makers essentially tout VR by making assertions like "allows a four-stop improvement over hand holding," users start thinking like this: "if I can handhold my 100mm lens at 1/100, then VR would allow me to hand hold it at 1/6." Well, maybe. But the only motion being removed is camera motion. If your subject moves during that 1/6, it's still going to produce subject blur. Looking back at my Nikon Field Guide (page 51 for those of you following along), we get 1/125 for the minimum shutter speed necessary to freeze a person walking across the frame (1/30 if they're walking towards you). This is, of course, a generalization. There's a more detailed table below the one I just referenced that shows how distance impacts the shutter speed, too. Plus the size of the subject in the overall frame makes a difference. Expecting VR to remove ALL motion is something everyone has to get over:

Rule #5: VR doesn't remove all motion, it only removes camera motion.
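For those who want to check it, here's the simple arithmetic behind that "four stops" claim in the paragraph above, assuming the usual 1/focal-length handholding baseline (a sketch of the math, not a Nikon formula):

    # each stop of claimed improvement doubles the usable shutter duration
    baseline = 1 / 100            # handholding limit for a 100mm lens, in seconds
    stops = 4
    vr_limit = baseline * (2 ** stops)
    print(vr_limit)               # 0.16s, i.e. roughly 1/6 second

The catch, per Rule #5, is that the 1/6 only applies to camera motion; a subject walking through the frame during that 1/6 still blurs.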

Another type of motion comes with panning the camera, and VR has impacts there, too. I've seen people say that they think you should turn VR off when you pan with a subject. There may be times when that's true, but my experience is that VR should be on while panning. That's because the Nikon VR system is very good at detecting constant camera movement. If you're doing a smooth pan in one direction, the VR system will concentrate on removing only motion on the other axis. That's the way it's designed to operate. The trick is to make sure that your pan is relatively smooth, not jerky. Most people start to jerk when they press the shutter release during pans. You need to practice NOT doing that and to continue the pan while the shutter is open rather than stopping. Indeed, try practicing this at your local track (or other place with some runners present). Pan with the runner and take a picture. When the mirror returns and the viewfinder view is restored after the shot, is the runner still in the same spot in the frame? No? Then you didn't continue panning through the shot. Tsk tsk. Try again. Practice until you can take a series of shots and the runner stays in the same spot through the entire sequence, both in the shots and while you're panning between shots. You shouldn't have to catch up to the runner.
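Here's a rough sketch of that "ignore the pan axis" idea (again, my own illustration; Nikon doesn't publish its detection logic, and axes_to_correct and its pan_threshold are invented names for the example): if one axis shows a large, steady rate, treat it as a deliberate pan and keep correcting only the other axis.

    def axes_to_correct(yaw_rates, pitch_rates, pan_threshold=5.0):
        # return which axes VR should still correct, given recent motion samples
        def is_pan(rates):
            mean = sum(rates) / len(rates)
            steady = all(abs(r - mean) < abs(mean) * 0.3 for r in rates)
            return abs(mean) > pan_threshold and steady
        return [axis for axis, rates in (("yaw", yaw_rates), ("pitch", pitch_rates))
                if not is_pan(rates)]

    # a smooth left-to-right pan with a little vertical wobble:
    # only the vertical (pitch) axis gets corrected
    print(axes_to_correct([20.1, 19.8, 20.3, 20.0], [0.4, -0.3, 0.5, -0.2]))

Notice that a jerky pan fails the "steady" test in this sketch, which is one more reason smooth panning technique matters: give the system an erratic pan and it may try to fight the pan itself.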

Aside: Back in high school my photography mentor at the time broke me of the habit of stopping during pans in a brutally sadistic way: he sent me to track meets with a TLR (twin lens reflex). You look down into the viewfinder of a TLR. But here's the thing: left to right is reversed. So if the subject is moving right to left in front of you, they appear to move left to right in the viewfinder. You don't have a chance of following motion with a TLR unless you can relax your brain and simply mirror the subject's motion with your own body. You can't look and react, look and react.

Rule #6: If you're panning correctly, VR should probably be On.

Yet another aspect of VR that confuses people is activation. Nikon's manuals don't make this very clear, but it really is quite simple: only the shutter release activates the VR system. A partial press of the shutter release engages it and allows it to begin a sequence of corrections. A quick full press of the shutter release also engages it, but the system doesn't get much in the way of samples to rely upon and make predictions from (on the pro bodies we're talking 1/33 of a second or so between the press and the shot being taken). Basically, if you engage VR prior to the shot, you tend to get slightly better and more consistent results. That doesn't mean you should always wait for VR to engage before fully pressing the shutter release. If it's time to take the picture, take the picture! VR will take its best shot at fixing your motion even when you just punch the shutter release. But there are two factors that tend to make early VR engagement a better choice if you can do it: first, the VR system gets a stream of data it can predict from; and second, it's harder to jab the release and move the camera if you've already partially pressed it!

The usual issue that comes up with the preceding is the line in Nikon's manuals about "VR doesn't function when the AF-ON button is pressed." This is one of those places where the translation in Nikon's manuals is out-and-out misleading. If the line had read "VR doesn't engage when the AF-ON button is pressed" it would be more correct. In my testing VR is not "turned off" by using the AF-ON button; it simply isn't engaged by that button press. Only the shutter release button engages VR. Thus, if you use AF-ON to focus instead of a partial shutter release press, VR is not engaged during the pre-shot focusing. But it is during the shot.

This, of course, creates a slight issue. Optimally, we want VR to have a stream of data just prior to pressing the shutter release fully. If we're using AF-ON to focus, our fingers usually aren't pushing the shutter release partially down, too. But you should practice doing just that. Sigh. That right hand is starting to do a pretty complicated dance: AF-ON up and down for focus, shutter release partially down for VR, right thumb dialing in shutter or aperture or exposure adjustments, maybe right middle finger dialing in aperture adjustments, shutter release fully down with the index finger for the shot. This, by the way, is one of the reasons why I prefer Nikon's ergonomics to Canon's: at least when I'm doing all that hand juggling, my hand and finger positions aren't really moving, especially my shutter release finger. With Canon the tendency is to move the index finger between the top control wheel and shutter release. You can react with the shutter release faster if you're not moving that finger.

There are a few more caveats. If you've got a built-in flash on your camera (basically everything but the D1, D2, and D3 series), the VR system is inactive while the flash is recharging. That's because VR takes power to perform, and the assumption is that you want the flash recharged as fast as possible. Thus, the camera turns off the power to the VR system while it's charging up the flash capacitor. If you're shooting flash near full power and doing a lot of consecutive flashes, the flash recharge time can start taking a few seconds. How do you know if power is restored to the VR system? Well, you can't know exactly, but the flash-ready indicator in the viewfinder is a fairly reliable clue: if it's not present with the flash up and active, VR is probably Off, too.

Rule #7: If you rely upon VR and use flash, use an external flash instead of the internal one if you can.

I've been holding off on the tripod issue until the end of this article, partly because it's not as clear cut as Nikon seems to think it is. But by now you've probably turned VR off, anyway ;~). Part of the problem is that Nikon hasn't clearly labeled and distinguished their various VR system iterations. Technically, the VR II system in some of the modern lenses should detect when the camera is on a stable platform and not try to jump in and correct. But not all modern lenses have what most of us regard as the full VR II. The recently introduced 16-35mm, for example, comes long after the intro of VR II, yet it does not appear to have tripod recognition. Thus, we have another rule before we get to the real rule:

Rule #8: You MUST read your lens manual and see what it says about use on tripods.

Two basic possibilities exist:

1. The manual says turn VR off when on a tripod (sometimes adding "unless the head is unsecured")
2. The manual specifically says that the VR system detects when the camera is on a tripod

Okay, I lied. Forget what the manual says.

Rule #8 For Real: If your camera is on a tripod, even if you're using something like a Wimberley head where it is almost always loose, turn VR off. If your tripod is on a moving platform or one that has vibrations in it, strongly consider turning VR on, but test it to be sure you need it.

So why do I disagree with Nikon? Even with a loose head on a tripod, motion should be fairly easy to control, and you've already removed one possible source of motion almost completely (ditto with monopods). The problem I have, and which many other pros have noticed, is that the VR tripod detection system sometimes has "false negatives." In other words, the tripod detection mode of the VR II system should detect when the system is "quiet enough" to turn off corrections. Most of the time it does just that (Nikon says that the system is smart enough to detect as many as three different types of motion--handholding, platform vibration, and support system movement--because the "vibrations" caused by each of these are recognizably different in waveform). Every now and then, though, VR thinks it needs to correct when it doesn't (or perhaps it is still correcting for a previously detected motion that is no longer present in the next sampling). When that happens, the VR element(s) are moving when they shouldn't be. Usually not a lot, but enough to make for less than optimal results.
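A crude way to picture that tripod-detection idea (purely my illustration; Nikon's actual detection is undocumented, and correction_for and its quiet_threshold are invented for the example): if the recent motion is small enough, stop moving the element entirely; a "false negative" is simply a quiet moment the detector fails to recognize as quiet.

    def correction_for(sample, recent, quiet_threshold=0.05):
        # sample: current motion reading; recent: the last few readings
        on_tripod = all(abs(r) < quiet_threshold for r in recent)
        return 0.0 if on_tripod else -sample

    print(correction_for(0.01, [0.01, 0.02, 0.00]))   # quiet: no correction applied
    print(correction_for(0.01, [0.01, 0.02, 0.06]))   # one stray reading: the element
                                                      # moves even though the camera is still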

Indeed, this is the very same problem as using VR over 1/500: sometimes it works, sometimes it doesn't. The problem is that you won't like it when it doesn't, and you won't know in advance which shots those will be. If I were to tell you that out of 100 shots you take, 10 were going to be bad due to VR doing the wrong thing, would you still use VR? Remember, when you're on a tripod, all 100 shots should be good without VR (otherwise you have the wrong tripod and head, see this article, or you're using poor technique). I'm not a gambler: I prefer the known to the unknown, so I don't like having random shots spoiled by VR.

Which brings up a whole different topic: what does a spoiled-by-VR shot look like? Well, "spoiled" is perhaps too harsh a term. Sub-optimal is probably a better one. An optimal shot has very clean and well defined edge acuity. Assuming a "perfect lens," edges should be recorded about as well as the anti-aliasing filter, sensor, and Bayer demosaic allow. What a lot of us find when VR is not quite correcting as well as it can/should is that edges get a little bit of "growth" to them, and sometimes there's a directionality to that growth. It's sort of like camera movement, only much more subtle. I tend to say that the detail "looks busy" when VR isn't fully doing its job or is on when it shouldn't be. And when you apply sharpening to busy edges, that busyness gets busier. Without VR active at all while on a stable tripod, it's like a veil gets lifted and you suddenly see how sharp your lens really is (assuming you correctly obtained focus on your subject and had a stable platform, that is ;~).

Yes, there's some nitpicking going on here. VR not correcting right is a bit like tripod mount slop (fixed with a Really Right Stuff Long Lens Support) or ringing vibrations in the tripod legs (fixed by using the right legs for your equipment): you don't see it until it's gone, and even then usually only if you're pixel peeping. But someone using a 400mm f/2.8G VR lens on a D3x spent a lot of money on equipment to get the best results. They expect to be able to catch every bit of detail and blow it up into a large print. As always on this site, you need to understand that I always write about the search for optimal bits. If you're shooting with a 16-85mm on a D300 and putting 640x480 images on the Web from that, well, whether the VR missed doing its job by a little bit probably isn't so important.

If there are more questions on VR I'll address them in the Discussion at the bottom of this page. Until then, here's your motto: VR stays off unless I specifically need it. VSOUISNI and prosper.

Discussion:

  • "Which lenses are VR I and which VR II, and what's the difference?" The difference is vague, as Nikon hasn't really released enough information to say much more than VR I claimed to give a three-stop advantage while VR II claims a four-stop advantage. Yes, in practice, the new VR seems to do a slightly better job, but it's unclear as to why it does a better job. Lenses that are still VR I include: 18-55mm, 18-105mm, 24-120mm, 55-200mm, 80-400mm, and the 200mm f/2. Lenses that are VR II include 16-35mm f/4, 16-85mm DX, 18-200mm II, 70-200mm II, 200-400mm II, and 300mm II.
  • "Does VR make a lens more likely to fail and need repair?" Possibly. I've had one VR failure that needed repair and I know of others who've had similar failures. Still, it's rare that a lens has a mechanical failure, though adding the complexity of the VR mechanics certainly must increase the likelihood of encountering a problem.
  • "My VR is on occasion very jumpy." Check your camera's battery level when that happens. I'll bet that it is low. When you run batteries way down and activate VR it appears that the VR system can sometimes demand more power than the camera can supply instantaneously. The result is "jumpy VR" as the VR circuitry cuts in and out. I consider it just another "low battery" warning ;~). But see the "jumps after a shot" comment, below.
  • "Don't you get some effect from VR even if your shutter speed is above 1/500? After all, the VR elements are probably moving between samples." Yes, sometimes you get a VR-like effect above 1/500, and it's probably because the elements are in near constant motion and the designers have picked a movement frequency and smoothing curve that takes advantage of the known sampling frequency. But the problem with using VR above 1/500 is that you will get clear image degradation often enough that you'll get burned by it. And I believe you get burned by it more often than you'd get burned by having VR off. Again, Nyquist tells us that when we sample something, we can only be "precise" about our data below one-half the sampling frequency. Above that you don't get useful data, and a mechanical system can induce ringing effects as it tries to adjust. Let's see if I can explain it simply (a very gross generalization and simplification coming up): Pretend we're moving enough to impart a different motion that needs new correction ten times a second. Assume we're sampling five times a second. So what if we move Left, Right, Down, Left, Up, Right, Right, Left, Down, Up? The sampling sees Left/Right, Down/Left, Up/Right, Right/Left!, Down/Up! See the problems? (There's a short sketch of this example just after this list.) If we take images at ten frames a second, the system is lagging us in the first few samples and may settle down while we're still moving in the last two. The problem with exceeding the Nyquist limit is that there's a strong chance the system is going the opposite direction you want it to. But, yes, there's a chance that it's going the right direction, too. Not a good enough chance to use VR, in my opinion. Moreover, I don't know of a working sports or wildlife pro using the long lenses who hasn't discovered the same thing by practice: VR tends to degrade shots above 1/500.
  • "Does VR stabilize the autofocus system?" Yes. And this can be important in a few instances. It's one of the reasons why I argued that not putting VR into the 24-70mm lens was one of Nikon's bigger mistakes in the last decade. If you're moving the camera enough that the autofocus sensor(s) you're using isn't staying stable on the point you want focused, there's a chance focus will shift to someplace you don't want it. Your Lock-On and other autofocus settings interact here, so it's not a 100% certainty that VR will improve your autofocus results, but it does just enough that I find it useful to have the option. At wide angles, the AF sensors can easily get distracted by backgrounds. Nikon vaguely warns about this in their manuals (fifth example, D700 manual page 80). So if you're moving the camera enough that the background is getting onto that autofocus sensor with regularity, that can be a problem, and VR might help.
  • "The viewfinder jumps after a shot." This is normal. Note that the Nikon VR system operates differently for pre-release focusing: the viewfinder image is stabilized, which means the VR elements may have moved off center to provide a stable view (also impacts focusing, see question just above). But during the exposure the system does a few different things. First, it recenters the VR elements. Second, it uses a different algorithm for doing its correcting. It very well may be the recentering action that causes some of the above 1/500 issues, by the way.
  • "What about monopods or beanbags?" Nikon tends to recommend having VR on with monopods in most of their manuals. Personally, I think this really gets down to a handling issue, though. One of the primary camera motions that VR is often correcting is the "shutter release stab," which tends to impart a forward or backwards tilt in the camera. Proper use of a monopod tends to (mostly) remove that component, leaving side-to-side as the primary camera movement needing correction. So it starts to depend upon what's causing that side-to-side motion. Following action that moves in one direction? That's panning (see above). Following action that moves back and forth? Be careful of the shutter speed. At low shutter speeds (which would need VR) subject motion is going to be your biggest issue. At high shutter speeds, you're turning VR off anyway. This gets back to my "VR should be off unless needed" rule: there's actually a very narrow window of shutter speeds that make sense to have VR on with moving subjects, perhaps as narrow as 1/125 to 1/500. Subject isn't moving but you can't hold the camera still on the monopod? This would be a case where VR probably should be on.
  • "Do sensor-based stabilization systems have the same rules?" This article is specifically about Nikon VR. It also seems to apply fairly well for Canon IS, which is a very similar system. I can't say that the sensor-based systems do or don't act the same. I have a suspicion that they do, which means that burying the on/off for it in menus is the wrong approach for optimal results. That's because it encourages people to just leave it on all the time. Nevertheless, I don't know enough and haven't tested enough to know for sure.
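As promised in the Nyquist answer above, here's that Left/Right toy example as a few lines of Python (a gross simplification, exactly like the prose version): ten direction changes per second sampled five times per second means each sample lumps two moves together, and opposite moves simply cancel.

    moves = ["Left", "Right", "Down", "Left", "Up", "Right", "Right", "Left", "Down", "Up"]
    opposites = {("Left", "Right"), ("Right", "Left"), ("Up", "Down"), ("Down", "Up")}
    # 10 moves per second, 5 samples per second: each sample spans two moves
    for pair in zip(moves[0::2], moves[1::2]):
        note = "cancels out -- looks like no motion" if pair in opposites else "only a partial picture"
        print(pair, note)

In this made-up sequence the sampler sees either nothing or only part of the story most of the time, which is the Nyquist problem in miniature.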

Something that comes up in many of the email questions is more basic than the specific questions themselves. It appears most people just want to be told "use it for X, don't use it for Y." While I can broadly suggest probable use patterns, VR is just another one of those decisions that photographers have to make when evaluating each situation they encounter. Taking shortcuts with decisions ultimately leads to less-than-optimal results. For casual shooting, shortcuts perhaps work just fine for most people, and I've suggested a bunch in this article. But for serious shooting where quality matters, a good photographer is always evaluating, always testing. In some ways, digital is great for that, as we have an immediate feedback loop and can test a setting assumption almost immediately, plus we have the ultimate loupe in our large computer monitors.

Thus, one other point I'll make is that I can't tell you every possible time you need to use VR and every possible time you shouldn't. What I do know is that when VR has been on when it shouldn't be, my images suffer. And yes, when I shoot with VR off when it should be on, my images suffer, too. However, I generally know when I'm imparting substantive motion to the camera during shooting. Thus, VR is off unless I know that I'm imparting motion, and then I only turn it on if I can guess--and verify with a field test--that it will remove that motion.

One thing I've noticed is that those of us who shot with long lenses back in the film days prior to VR aren't quite so fast to turn it on as someone picking up a camera today. Part of that is the marketing message ("up to four stops better!"), but the real reason why the old-timers tend to use VR only on occasion and mostly correctly is that we already had to figure out when we were imparting camera movement prior to VR being available. We either had to correct the underlying problem or not shoot. Thus, we tend to know when we're on the margin where VR might be helpful. I'd argue that leaving VR on and turning it off only when you see a degradation (which may be too late if you're seeing it when you get home and looking at images on your monitor) isn't easily learned. Leaving VR off and turning it on only when you see a degradation is much more easily learned.

 


