HOWTO: Enable LightBoost in Linux (NVidia closed source)

Z-Man
God & Project Admin
Posts: 11706
Joined: Sun Jan 23, 2005 6:01 pm
Location: Cologne
Contact:

HOWTO: Enable LightBoost in Linux (NVidia closed source)

Post by Z-Man »

This may send you on your way if you are using a different video card or the open source driver; I cannot test those. Basically, the information I am about to give you is already available, just not in ready-to-use-in-Linux form. At least, I could not find it.

First, since you probably stumbled into here, what is LightBoost and why would you want it? You would want it because it gives LCD displays that support it a CRT-like feel (for better and worse). Under the right circumstances, almost all of the typical LCD motion blur is gone, even for the fastest motions.
What it actually is: a feature some newer 120(+) Hz monitors have for better shutter glasses 3D support. See, both LCD panels and shutter glasses are SLOW, so in order to avoid ghost images, the glasses turn each side transparent only for a very short time, resulting in a darkened image compared to regular viewing. LightBoost fixes that by exploiting the fact that while LCD panels are slow, the backlight of the monitor can be very fast; if LB is active, the backlight only turns itself on for 1-2 ms per frame, just when the shutter glasses are open and the LCD image on screen has stabilized.
This post is about ABUSING that. You don't need 3D glasses to benefit from it.
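To put some numbers on it (my arithmetic, not from the original post; the ~1.5 ms flash is an assumed value from the 1-2 ms range above), the backlight is dark for most of each frame:

```shell
# At 120 Hz, one frame lasts 1000/120 ms; with a ~1.5 ms backlight flash
# (assumed, middle of the 1-2 ms range), the duty cycle is tiny.
awk 'BEGIN {
    frame = 1000 / 120    # frame period in ms
    flash = 1.5           # assumed flash length in ms
    printf "frame: %.2f ms, flash: %.1f ms, duty cycle: %.0f%%\n",
           frame, flash, 100 * flash / frame
}'
```

So the backlight is on less than a fifth of the time, which is also why LB mode costs you some brightness.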

Preemptive question: "But Z-Man, the human eye will perceive anything from 24 frames per second as smooth motion, why would one need more? Surely 60 Hz/fps is plenty."
Answer: Get out. Firstly, you have been brainwashed by the movie industry; secondly, that figure is only in the right ballpark if you assume your eye stays focused on the same spot. But it does not. It moves around, following objects as they move. If you truly cannot tell the difference between 30 and 60 fps, good for you, but still: Get out, there is nothing of benefit for you to find here.

Still here? Good. Step One:
WARNING: We are going to edit xorg.conf, add a new ModeLine or two, and tell the driver to ignore what the monitor says it can and cannot do. If we make a mistake, or you apply this without checking whether your monitor supports LightBoost in the first place, this may fry your monitor and set your house on fire. Though monitors, especially the digital variety, have become very good at not letting themselves get fried by illegal input lately.

Step Two: read this. It gives you more details on what is being done and what the benefits are; it is a little overenthusiastic (there still are differences between LB and a CRT that you can spot with the naked eye, but yeah, they are pretty insignificant), but otherwise a good read.

Step Three: Activate LightBoost once, somehow. This can be the official way with glasses and IR emitter, but if you do not have those, the link from step two explains how to do it without them. You will need a Windows machine with an NVidia card for that. Maybe a Quadro NVidia card using the Linux support for 3DVision works too, but I don't have one, so I can't test.

Step Three and a Half: Between step three and actually using LightBoost mode, do not unplug the monitor from power. You can turn it off, but if you unplug it, it will (probably) forget it is allowed to use LightBoost. Annoying. You can unplug and re-plug the signal cable, so the PC used in step three can be a different one than the one you want to use Linux with. And of course, you can always just repeat step three. It's easier the second time when the system is already prepared.

Step Four: Now it's xorg.conf edit time. LightBoost modes are almost the same as regular modes, they just have a wider vertical back porch. Presumably, that is to allow the LCD to settle completely, even in the lower section, before the backlight flash makes the image visible. The source for the 120 Hz mode is in the first comment; the 100 Hz mode was derived from it and the regular 100 Hz mode. To the Monitor section, add the following (the ... lines stand for whatever is already there):

Code:

Section "Monitor"
    ...
    Modeline "1920x1080_120lb" 286.7 1920 1968 2000 2080 1080 1083 1088 1149 +HSync -VSync
    Modeline "1920x1080_100lb" 236.7 1920 1968 2000 2080 1080 1083 1088 1138 +HSync -VSync
EndSection
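As a quick sanity check (my addition, not from the original sources): the refresh rate implied by a modeline is pixel clock divided by horizontal total times vertical total, and the two lines above do land at 120 and 100 Hz:

```shell
# Verify the refresh rates of the two LightBoost modelines:
#   refresh = clock (MHz) * 1e6 / (htotal * vtotal)
# Field layout: name clock hdisp hss hse htotal vdisp vss vse vtotal
awk '{ printf "%s: %.2f Hz\n", $1, $2 * 1e6 / ($6 * $10) }' <<'EOF'
1920x1080_120lb 286.7 1920 1968 2000 2080 1080 1083 1088 1149
1920x1080_100lb 236.7 1920 1968 2000 2080 1080 1083 1088 1138
EOF
```

The same one-liner works for any modeline you pull out of a log or generator, which is handy before feeding it to a monitor.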
That would be it, but the modes get rejected by validation. We need to tell Xorg not to do that. Once we do, the list of valid modes is going to fill with many modes from the standard lists that the monitor really does not support, so we need to disable those lists as valid sources. All in all, edit the Screen section and add this (this is the NVidia-only part; I am sure similar options exist for other drivers):

Code:

Section "Screen"
    ...
    Option "ModeValidation" "DFP-1: NoXServerModes, NoVesaModes, AllowNonEdidModes"
    ...
EndSection
Replace "DFP-1" with the connection name of your monitor, or leave out the "DFP-1:" part if you only have one monitor or all your monitors support LightBoost.

Finally, you need to actually select the modes. For that, there are several ways. Just adding them to the Display subsection should do it:

Code:

Section "Screen"
    ...
      SubSection "Display"
        ...
        Modes "1920x1080_120lb" ... 
    EndSubSection
EndSection
Or use the metamodes option for a multiscreen setup (the second screen here is my old 1280x1024 one; adapt to your situation):

Code:

Section "Screen"
    ...
    Option "metamodes" "DFP-1: 1920x1080_120lb +0+0, DFP-0: 1280x1024_60 +1920+0"
EndSection
Or they simply appear in the NVidia Settings utility in Advanced Mode and can be picked at runtime (there will be two entries for 120 Hz, one with and one without LB):
[Screenshot: NVidia Settings]
What does not work is selecting the new modes via their resolution/refresh combination, because there are now two modes in the same slot. What I do to avoid potential problems is simply disable the non-LB modes. The only way known to me is to disable ALL pre-defined modes and re-add the ones I want, like so:

Code:

Section "Monitor"
    ...
    Modeline "1920x1080_120lb" 286.7 1920 1968 2000 2080 1080 1083 1088 1149 +HSync -VSync
    Modeline "1920x1080_100lb" 236.7 1920 1968 2000 2080 1080 1083 1088 1138 +HSync -VSync

    Modeline "1920x1080_144" 317.49  1920 1944 1975 2008  1080 1083 1088 1098 +hsync -vsync
    Modeline "1920x1080_85"  253.25  1920 2064 2272 2624  1080 1083 1088 1137 +hsync -vsync
    Modeline "1920x1080_60"  148.50  1920 2008 2052 2220  1080 1084 1089 1125 +hsync -vsync

    Modeline "1280x720_60"   74.25  1280 1390 1430 1650 720 725 730 750 +hsync -vsync

    Modeline "960x540_60"   40.75  960 992 1088 1216  540 543 548 562 +hsync -vsync
EndSection

Section "Screen"
    ...
    Option         "ModeValidation" "DFP-1: NoEdidModes, NoXServerModes, NoVesaModes, AllowNonEdidModes"
EndSection
Sources for the additional modelines are mostly the verbose X logs from previous runs (launched with "startx -- :1 -verbose 10 -logverbose 10 2>&1 | tee xlog"); the quarter-HD one is from one of the many modeline generators, forgot which. They may or may not work for you. Definitely only put in the 144 Hz one if you know it will work.

Step Five: Restart X and enjoy! To test whether LightBoost is indeed active, check your monitor's menu, there should be a LB item that is only active if that is the case. Or wave a thin object in front of the screen, such as a pen or your finger. Without LB, it should more or less produce a continuous washed out shadow; with LB, you will see a series of rather distinct individual shadows.
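The spacing of those distinct shadows is simply hand speed divided by refresh rate. A rough worked example (the 1 m/s waving speed is my assumption, not from the post):

```shell
# A hand waving at ~1 m/s (1000 mm/s) in front of a 120 Hz strobed
# backlight casts one sharp shadow per flash, spaced speed/refresh apart.
awk 'BEGIN { printf "shadow spacing: %.1f mm\n", 1000 / 120 }'
```

With a continuous backlight the shadow smears over that whole distance instead, which is exactly the blur LB removes.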

Preemptive Question 2: "Is it worth it?"
Depends. First, yes, the benefits if it all comes together are huge. However, for that to happen in a game:
1. You need to be able to select the refresh rate, the game must not simply pick 60 Hz modes on its own (in Linux, you could just remove those)
2. The game needs to support 120 Hz updates. Some games will happily render 120 FPS, but two consecutive ones will be identical (Bit.Trip Runner does that). Others will play some animations at twice the intended speed (Super Meat Boy does that).
3. Your PC needs to be fast enough to run that game at 120 FPS. If it is, power consumption and noise are likely to be higher.
(All of that works fine for Arma, of course. While you can't explicitly select the refresh rate, it will stick to the OS default.)
Then there are the costs:
1. The monitor itself. Not really an issue money-wise any more; they're just dropping below 300 Euros.
2. 120Hz monitors are only available as TN panels. That's a big one. If you like moving around or have a second onlooker from time to time, their viewing angle sucks. Color reproduction is also a big issue. I personally don't mind either one, the chair in my PC cave can move about 10 cm left or right at most.
3. To you, 60 FPS will turn into the new 30 FPS. Seriously. Well, half seriously. It won't be that bad. But you will start to see some choppiness at 60 FPS, and you will be just a little bit annoyed. Maybe you will turn into that annoying snob complaining on forums how this otherwise fine game is totally ruined, RUINED I say, by the fact it has no proper 120 Hz support.
4. LB mode flickers. A bit. The frequency is way above the perception threshold, but again, that only applies while your eyes are not moving. Glancing around the screen will make you see the flashes on hard contrast edges, kind of like what happens with those tiny LED displays.
5. Most adjustments you can make to your monitor's image are disabled in LB mode.

And yeah, this screenshot was just me fishing for comments. "Gee, isn't 144 FPS a little low?", you'd say. "Not with VSync enabled!" I'd answer smugly. Bah.
Jonathan
A Brave Victim
Posts: 3391
Joined: Thu Feb 03, 2005 12:50 am
Location: Not really lurking anymore

Re: HOWTO: Enable LightBoost in Linux (NVidia closed source)

Post by Jonathan »

Could shutter glasses be made to open both shutters during each strobe? That would be an interesting way to reduce glare.
ˌɑrməˈɡɛˌtrɑn

Re: HOWTO: Enable LightBoost in Linux (NVidia closed source)

Post by Z-Man »

In theory, yes; Sony has a patent and sells a monitor/glasses combination that uses it for single screen multiuser gaming. That one triggers each pair of glasses at 60 Hz, though. And I doubt you could convince the NVidia glasses to do the same. It would probably be a better solution (for you, not the people around you) to just crank up the brightness and wear dark sunglasses :)

Better yet, get those glasses and keep your head straight. And hope the glare is not polarized in the same direction.

Re: HOWTO: Enable LightBoost in Linux (NVidia closed source)

Post by Jonathan »

I was thinking more about working in broad daylight, where max brightness alone is not enough (not enough to get a decent contrast ratio, anyway). Improving efficiency like this would be quite clever. Concentrate backlight energy, then block out everything else to lower the noise floor. All without having to darken the room or hope for beneficial polarization or the like.