motion blur explained and solved for OLEDs

cgp44
Binocular Vision CONFIRMED!
Posts: 281
Joined: Tue Oct 29, 2013 10:21 pm
Location: christchurch NZ

motion blur explained and solved for OLEDs

Post by cgp44 »

Just found an excellent explanation and link list:

http://www.blurbusters.com/faq/oled-motion-blur/

To solve the sample-and-hold problem (the image held on screen), we have to blank the screen during rapid head rotation. This requires a signal from the motion sensor fed directly to the MIPI display controller, hence a customized MIPI solution. I presume Oculus has hired MIPI IP providers right now to do this. It still calls for open-source FPGA hardware designs that can do this and more for the DIY scene. Oculus will not open-source this hardware, but will of course be putting it in $5 metal-gate chips for their consumer version's < 15 ms lag time and no screen-door effect.
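
The blanking decision described above can be sketched in a few lines. This is an illustrative sketch only: thresholding on gyro magnitude, and the 120 deg/s threshold, are my assumptions rather than anything stated in the post, and a real implementation would live in the FPGA/MIPI path, not in Python.

```python
import math

def should_blank(gyro_dps, threshold_dps=120.0):
    """Return True if the display should be blanked this frame.

    gyro_dps: (x, y, z) angular velocity from the head tracker, in deg/s.
    threshold_dps: hypothetical rotation speed above which sample-and-hold
    smear would be objectionable (an assumed value, not from the thread).
    """
    speed = math.sqrt(sum(a * a for a in gyro_dps))
    return speed > threshold_dps
```

In hardware this would reduce to a magnitude comparison feeding the panel's blanking input once per frame.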


PS: build your scenes with a primary green colour and you will get the full 441 ppi resolution of the AMS499QP01 (Samsung Galaxy S4 screen), since its PenTile layout has green subpixels at full density.
Last edited by cgp44 on Sun Jan 05, 2014 6:46 pm, edited 2 times in total.
VRus
One Eyed Hopeful
Posts: 29
Joined: Thu Oct 18, 2012 10:45 am
Location: Lithuania

Re: motion blur explained and solved for OLEDs

Post by VRus »

Wow, very nice find! For the lazy ones, here's the amazing LCD-blur test page from that article:

http://www.testufo.com/#test=eyetracking

While 300+ fps (possible with properly driven OLEDs) is a pipe dream for current-generation graphics cards, inside-the-Rift interpolation of panning motion could be quite doable.

It also reminds me of geekmaster's suggested method, lost somewhere on MTBS. His idea was that the Rift's internal hardware maps the last HDMI-received frame onto a sphere and pans the sphere-mapped texture in real time according to the head tracker's data, giving a much faster response to head motion than the usual Rift tracker → USB → CPU/GPU → HDMI → LCD controller route.
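
That idea amounts to reprojecting the last received frame by the head rotation accumulated since it was rendered. A minimal flat-image sketch, assuming a constant pixels-per-degree scale and a plain horizontal shift instead of a true sphere mapping (both my simplifications):

```python
def pan_reproject(frame, yaw_delta_deg, px_per_deg):
    """Crude horizontal reprojection of the last received frame.

    frame: 2D list of pixel rows. yaw_delta_deg: head yaw change since the
    frame was rendered. px_per_deg: assumed-constant display pixels per
    degree. Pixels shifted in from the edge are filled with 0 (black);
    a real implementation would pan a sphere-mapped texture instead.
    """
    shift = round(yaw_delta_deg * px_per_deg)
    out = []
    for row in frame:
        if shift >= 0:
            out.append([0] * min(shift, len(row)) + row[:max(len(row) - shift, 0)])
        else:
            out.append(row[-shift:] + [0] * min(-shift, len(row)))
    return out
```

Because only a shift (or texture pan) is computed per frame, this can run at panel refresh rate in a small piece of display-side hardware, independent of the game's frame rate.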

For me, motion blur is still killing an otherwise amazing Rift experience. Innovations like these give hope; I'm crossing the fingers of both hands for this to somehow make it into the consumer Rift.
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France
Contact:

Re: motion blur explained and solved for OLEDs

Post by Fredz »

cgp44 wrote:motion blur explained and solved for OLEDs
The article doesn't explain how to get rid of motion blur with OLED. More specifically, it says that OLED displays, even though they have very fast response times, still suffer from motion blur because they're not used in low-persistence configurations.

Most LCDs have the same persistence problem because of the sample-and-hold approach, but LightBoost monitors are specifically designed to avoid this and hence produce minimal motion blur.

They explain that there is an inherent brightness limitation in OLED technology which prevents this (very high brightness is needed to compensate for strobing), although research in this direction is ongoing. For now, TV manufacturers rely mostly on motion interpolation to get similar results, but that adds input lag.
VRus wrote:it reminds me of geekmaster's suggested method, lost somewhere on MTBS
Here : http://www.mtbs3d.com/phpbb/viewtopic.php?f=138&t=16543

The "time warping" technique published earlier by John Carmack is a detailed explanation of basically the same thing, but done on the GPU rather than in an external device: http://www.altdevblogaday.com/2013/02/2 ... trategies/

I'm not sure the hassle and cost of an external device are worth it. I'm not even sure it would be faster, since the same processing needs to be done anyway. Done on the GPU, it gives only about 2 ms of latency on the software side instead of 16 ms with the current approach, according to Oculus VR.

VRus wrote:for me, motion blur is still killing an otherwise amazing Rift experience; innovations like these give hope, I'm crossing the fingers of both hands for this to somehow make it into the consumer Rift.
I have the same problem, and since John Carmack is now working at Oculus VR and they've announced 15 ms end-to-end total latency, I'm quite confident it'll happen. :)
cgp44
Binocular Vision CONFIRMED!
Posts: 281
Joined: Tue Oct 29, 2013 10:21 pm
Location: christchurch NZ

Re: motion blur explained and solved for OLEDs

Post by cgp44 »

Actually, motion blur does not kill VR for me. I don't wave my head around, nor play FPSes where I would. What bothers me is the resolution and the limited 'scuba face mask' FOV at my lens-to-eye distance, which is nowhere near what was advertised, plus the sweatiness. I think the heat from an LCD and a biggish controller board warms your face.

A 0.6 W OLED display and a small FPGA would make for easy, comfortable VR.

I have put together a set of VR goggles made from plastic safety goggles and OVR B lenses, papier-mâchéd to a cardboard enclosure holding a single AMS499QP01 and a small Spartan-6-based FPGA board.

Note: I have just heard from the maker of the Pipistrello board, and his design does not connect bank 0 to the WING output header. Bank 0 has the differential-pair outputs which will be used for the MIPI lanes and clock interface. He suggested the Saturn board, which will be out soon with an LX45-sized Spartan-6.
Fredz
Petrif-Eyed
Posts: 2255
Joined: Sat Jan 09, 2010 2:06 pm
Location: Perpignan, France
Contact:

Re: motion blur explained and solved for OLEDs

Post by Fredz »

cgp44 wrote:Actually, motion blur does not kill VR for me.
It doesn't for me either (that would be latency), but it would still be a great thing to have. It would also have a direct effect on resolution whenever the eyes move relative to the display, which is almost all the time.
cgp44 wrote:It's resolution and limited 'scuba face mask' lens eye distance FOV which is nowhere near
advertised
These conform to what was advertised, and the FOV even exceeds it: it was announced as 90°×110° and is 114.5°×125.5° in reality. Are you using the C cups? Are you sure the Rift is correctly positioned on your head?
zalo
Certif-Eyed!
Posts: 661
Joined: Sun Mar 25, 2012 12:33 pm

Re: motion blur explained and solved for OLEDs

Post by zalo »

Pixel persistence smearing is the difference between looking through the DK1 and looking at the real world through a bug screen.

It really feels like that big of a difference.
cgp44
Binocular Vision CONFIRMED!
Posts: 281
Joined: Tue Oct 29, 2013 10:21 pm
Location: christchurch NZ

Re: motion blur explained and solved for OLEDs

Post by cgp44 »

For me the FOV is as I said. Cup your hands to the sides of your head, thumbs on your nose, and you see the scuba-mask FOV.

I think my forehead-to-eyeball distance is longer than that of people who can position the lens nearly touching their eyeball. As the lens is brought closer to the eyeball, the edges move out quite a bit.
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: motion blur explained and solved for OLEDs

Post by geekmaster »

cgp44 wrote:For me the FOV is as I said. Cup your hands to the side of your head thumbs on
nose and you see the scuba mask FOV. ...
The Rift DK FoV is much larger than the angle between your hands, because of the lenses that provide angular magnification. If you draw a diagram with light rays from your eyes, through the lenses that bend the rays inward (bending inward even more near the edges due to radial distortion), to pixels on the screen near the edges of your FoV, those angles will be MUCH larger than what your eyes can see when looking through a "scuba mask" or restricted by your hands as in your example (depending on exactly how you hold your hands, how big your hands are, and other factors). I showed this effect clearly (with diagrams), showing the bent light rays expanding the perceived FoV, in my "Fresnel lens stack" thread.

The FoV is restricted by the edges of the lenses, not by the screen or by the face mask of the Rift. Larger lenses give a larger FoV (as I showed in my Rift lens experiments thread). It appears that the new Crystal Cove prototype uses larger lenses (perhaps 2-inch diameter), so it should not suffer from the restricted FoV we get with our Rift DK 1.5-inch diameter lenses.

Because your FoV is very strongly determined by the angle between your eyes and the lens edges that limit your FoV, it is obvious that getting your eyes as close as possible to the lenses (with eyelashes brushing them as mentioned in many threads) is the most effective way to maximize your FoV. Also, the farther a lens is from the screen the more pixels it can see, so using the long A-cup lenses is necessary for maximum FoV. In my experience, the shortest C-cup lenses discard about half the pixels (but give me a better focus). I sacrifice some focus for more FoV, using the A-Cup lenses, with coin-slot adjustments putting the screen as close to my face as possible, and my eyelashes brushing the lenses.

Of course, if you wear your Rift with the lenses pulled away from your face (coin-slot adjustments), or with short lenses (B or C cups), and especially if you wear eyeglasses in your Rift, you may see an FoV much narrower than a scuba face mask or your hands cupped to the sides of your head as you described. With the diameter of the lenses shipped with the Rift DK, the lenses really need to be very close to your eyes to reach the manufacturer-specified FoV.
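
The geometric argument above can be put into one formula: if the lens rim is what clips the view, the monocular FoV is roughly twice the half-angle the lens edge subtends at the eye. The numbers here are illustrative assumptions (19 mm radius for a 1.5-inch lens, 10–20 mm eye relief), not measurements from the thread:

```python
import math

def lens_limited_fov_deg(lens_radius_mm, eye_to_lens_mm):
    """FoV cone (degrees) subtended by a circular lens edge at the eye.

    Simplification of the point above: the lens rim, not the screen,
    clips the view, so FoV grows as the eye gets closer to the lens.
    """
    return math.degrees(2.0 * math.atan(lens_radius_mm / eye_to_lens_mm))

# A 1.5-inch (38 mm) lens gives ~87 deg at 20 mm eye relief,
# but ~124 deg with eyelashes brushing the lens at ~10 mm.
```

This ignores the lenses' angular magnification and radial distortion, which bend rays inward and expand the perceived FoV further, so treat it as a lower bound on the rim-limited cone.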

Regarding motion blur, the blurbusters website has some nice demos (UFO test, black frame insertion, etc.) that show you how perceived motion blur is caused by images registering on your retina in a location other than where your brain expects it to be, such as from pixel switching latency while turning your head. A link to one of these demos is provided in the first post in this thread. The Crystal Cove prototype not only has larger lenses so they can be farther from your eyes, but also displays each image only while it is where your brain expects that image to be, and shows a black screen at all other times, significantly reducing perceived motion blur.
brantlew
Petrif-Eyed
Posts: 2221
Joined: Sat Sep 17, 2011 9:23 pm
Location: Menlo Park, CA

Re: motion blur explained and solved for OLEDs

Post by brantlew »

Simplest explanation I've yet seen of low-persistence.

http://www.youtube.com/watch?v=HoLHHUdi_LE&t=6m38s
TheHolyChicken
Diamond Eyed Freakazoid!
Posts: 733
Joined: Thu Oct 18, 2012 3:34 am
Location: Brighton, UK
Contact:

Re: motion blur explained and solved for OLEDs

Post by TheHolyChicken »

brantlew wrote:Simplest explanation I've yet seen of low-persistence.

http://www.youtube.com/watch?v=HoLHHUdi_LE&t=6m38s
It's a great video; no wonder it was so upvoted on Reddit.

I'm personally very aware first-hand of the benefit low persistence can provide, as I have a LightBoost monitor; there's a marked improvement in clarity when sweeping something across the screen or panning around in a game. The blur really is reduced. I'm very excited that Oculus is now being taken more seriously by screen manufacturers and will have access to this kind of tech. It's evident from the CES coverage that it's already a huge improvement over previous versions.
Sometimes I sits and thinks, and sometimes I just sits.
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: motion blur explained and solved for OLEDs

Post by geekmaster »

brantlew wrote:Simplest explanation I've yet seen of low-persistence.

http://www.youtube.com/watch?v=HoLHHUdi_LE&t=6m38s
Yes, that video demonstrates what Michael Abrash and others have said, much better than the earlier diagrams did.

Now, how can we pulse our Rift DK1 backlight LEDs at exactly the right time (after all pixels have been updated, just before a new image starts being displayed), and at a high brightness level? The LEDs should be able to be driven much brighter at a lower on-time (i.e. strobed). But with our eyes' dark adaptation, they might not need to be much brighter (if at all).

We need a DIY Rift mod for those willing to void their Rift warranty in exchange for the improved user experience (or wait until their warranty period has expired).

Controlling a backlight pulse does not seem all that complicated. Perhaps just a one-millisecond-wide pulse with a delay of about 15 ms after VSYNC. A tiny embedded processor should be able to do that.
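
One consequence of that 1 ms pulse is easy to quantify: average luminance scales with duty cycle, so the strobe drive must increase by the inverse of the duty cycle to keep the same apparent brightness. A small sketch of the arithmetic (the 60 Hz refresh figure is the DK1's; the rest is just the ratio):

```python
def strobe_brightness_factor(refresh_hz, pulse_ms):
    """How much brighter the backlight must be driven during a short
    strobe pulse to match the average luminance of full persistence.

    refresh_hz: display refresh rate; pulse_ms: strobe pulse width in ms.
    """
    frame_ms = 1000.0 / refresh_hz
    return frame_ms / pulse_ms

# A 1 ms pulse at 60 Hz needs roughly 16.7x LED drive for equal average
# brightness, though as noted above, dark-adapted eyes may tolerate less.
```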
android78
Certif-Eyable!
Posts: 990
Joined: Sat Dec 22, 2007 3:38 am

Re: motion blur explained and solved for OLEDs

Post by android78 »

I love the video; it does a good job of demonstrating something I've had a hard time explaining to people in the past.
@geekmaster - do we know the specs of the display driver? There should at least be a vsync-type signal that you could use for timing the pulses.
android78
Certif-Eyable!
Posts: 990
Joined: Sat Dec 22, 2007 3:38 am

Re: motion blur explained and solved for OLEDs

Post by android78 »

Gotta love iFixit:
http://www.ifixit.com/Teardown/Oculus+R ... down/13682
I would think the timing delay between input and output for these would have to be fairly consistent, so you could try just using the input vsync with a fixed delay.
cgp44
Binocular Vision CONFIRMED!
Posts: 281
Joined: Tue Oct 29, 2013 10:21 pm
Location: christchurch NZ

Re: motion blur explained and solved for OLEDs

Post by cgp44 »

I am wondering if the term 'motion blur' is incorrect. After travelling for a while without my goggles, I've reconnected and looked specifically for the blur. FOR ME it was not very noticeable.

Should the correct term be 'world jitter' or 'world wobble'? Wiggling your head does not produce smudges. Some are sensitive to this wobble, and their stomachs signal it.
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: motion blur explained and solved for OLEDs

Post by geekmaster »

cgp44 wrote:I am wondering if the term 'motion blur' is incorrect. After travelling for a while without my goggles, I've reconnected and looked specifically for the blur. FOR ME it was not very noticeable.

Should the correct term be 'world jitter' or 'world wobble'? Wiggling your head does not produce smudges. Some are sensitive to this wobble, and their stomachs signal it.
The blur comes from the focused image moving across your retina during eye counter-rotation, when your brain expects that image to stay anchored in one place on the retina, as it would during normal eye counter-rotation while turning your head (or, to a lesser extent, while rotating your eyes to track an object moving onscreen).

The image moving unexpectedly across the retina causes motion blur, so the term is correct. However, you could accurately say it is CAUSED by "world jitter" (i.e. the periodic hold time between frame updates). You can trade this motion blur for stroboscopic motion ghosting, which is less annoying at higher frame rates, as is done by the Crystal Cove prototype.
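
The retinal-smear mechanism above can be put in numbers: smear width is angular velocity times hold time, converted to display pixels. The example values (a 120 deg/s head turn, full 60 Hz persistence, ~11 px/deg for a DK1-class panel) are my rough assumptions, not figures from the thread:

```python
def smear_pixels(head_dps, persistence_ms, px_per_deg):
    """Approximate motion-blur smear width on a sample-and-hold display.

    During eye counter-rotation the held image slides across the retina
    for the whole persistence time, so the smear is velocity x hold time,
    converted to display pixels.
    """
    return head_dps * (persistence_ms / 1000.0) * px_per_deg

# e.g. a 120 deg/s head turn on a full-persistence 60 Hz panel (~16.7 ms)
# at ~11 px/deg smears edges across roughly 22 pixels per frame.
```

Cutting persistence to a 1 ms strobe shrinks that smear by the same factor of ~16, which is why low persistence matters so much more than pixel response time.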