NVIDIA's Light-field Glasses Prototype. Oculus in trouble?

remosito
Binocular Vision CONFIRMED!
Posts: 251
Joined: Wed Apr 10, 2013 2:06 am

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by remosito »

Did I understand this article correctly????

http://lightfield-forum.com/2013/07/ref ... prototype/

and the loss in spatial resolution is a factor of roughly 9 per dimension? ([1280x720] / [146x78])

So with a (probably over-)estimate of 36% invisible pixels on the Rift (0.8 per axis), an NVIDIA light-field screen would have to be [960x1080] x 0.8 x 9 = [6912x7776] per eye.
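Writing that arithmetic out as a quick sanity check (just my estimates from above, nothing official - the 0.8 visible-pixel factor and the ~9x loss are assumptions):

Code: Select all

# Quick sanity check of the numbers above (all inputs are my estimates).
panel = (1280, 720)            # NVIDIA prototype panel
spatial = (146, 78)            # reported spatial resolution of the result
loss = (panel[0] / spatial[0], panel[1] / spatial[1])
print(loss)                    # ~(8.8, 9.2), so roughly 9x per axis

rift_eye = (960, 1080)         # Rift HD prototype, per eye
visible = 0.8                  # assume 36% of pixels invisible by area -> 0.8 per axis
target = (round(rift_eye[0] * visible * 9), round(rift_eye[1] * visible * 9))
print(target)                  # (6912, 7776)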


or am I misunderstanding what they mean by spatial resolution?

They also say the FoV of the result is 29x16 degrees, so you lose massively on that front as well!

Basically this stuff is something like a decade away from 100+ degrees FoV at HD-equivalent resolution!
Starcitizen - Elite:Dangerous - Xing - Gallery: Six Elements - Among the sleep - Theme Park Studio - The Stomping Land - Son of Nor - Obduction - NOWHERE - Kingdom Come: Deliverance - Home Sick - prioVR
colocolo
Diamond Eyed Freakazoid!
Posts: 790
Joined: Mon Jun 04, 2012 1:25 am

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by colocolo »

It is, but who knows what microdisplays HMD manufacturers will consider producing if Oculus gets into the market. Silicon Micro Display is working on 4k-8k panels. If they quadruple the panel size, pooouuu... 2 x 124MP wouldn't be a dream anymore. But then they would probably also have to integrate eye tracking so programmers can render only the fovea area at full detail - which for a semiconductor manufacturer shouldn't be that hard. The German Fraunhofer institute just put 160x120 photodiodes in the pixel gaps of a display to detect eye movement. I think this sort of eye tracking will sooner or later be the standard. Is there any way, apart from contact lenses or an electro-oculographic method, to detect eye movement that fits nicely into the small space between optics and display?
I mean, it's almost the same as camera-based eye tracking; the only difference is that the sensor is now in the display itself.
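To illustrate what "render only the fovea area" would actually involve, here's a minimal sketch - the fovea size, panel resolution and FoV are illustrative assumptions, not specs of any real HMD or of the Fraunhofer sensor:

Code: Select all

# Given a gaze point from some eye tracker, size the region that actually
# needs full-resolution rendering; everything outside it could be rendered
# at a fraction of the resolution.
FOVEA_DEG = 5.0           # assumed high-acuity region of the eye, in degrees
PANEL_FOV_DEG = 90.0      # assumed horizontal FoV mapped onto the panel
PANEL_W, PANEL_H = 1920, 1080

px_per_deg = PANEL_W / PANEL_FOV_DEG            # crude linear mapping
fovea_radius_px = 0.5 * FOVEA_DEG * px_per_deg  # ~53 px with these numbers

def foveal_rect(gaze_x, gaze_y):
    r = fovea_radius_px
    return (max(0, gaze_x - r), max(0, gaze_y - r),
            min(PANEL_W, gaze_x + r), min(PANEL_H, gaze_y + r))

print(foveal_rect(960, 540))   # full-res region is a tiny fraction of the panel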
remosito
Binocular Vision CONFIRMED!
Posts: 251
Joined: Wed Apr 10, 2013 2:06 am

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by remosito »

colocolo wrote:It is, but who knows what microdisplays HMD manufacturers will consider producing if Oculus gets into the market. Silicon Micro Display is working on 4k-8k panels. If they quadruple the panel size, pooouuu... 2 x 124MP wouldn't be a dream anymore.

Didn't Palmer say the problem with microdisplays is that scaling up the size doesn't work?

But from the sound of that 4k-8k panel in development at Silicon Micro, the pixel density part is actually A LOT closer than I thought possible.

But how would one increase the resulting FoV? Just different microlenses in the array?
Starcitizen - Elite:Dangerous - Xing - Gallery: Six Elements - Among the sleep - Theme Park Studio - The Stomping Land - Son of Nor - Obduction - NOWHERE - Kingdom Come: Deliverance - Home Sick - prioVR
colocolo
Diamond Eyed Freakazoid!
Posts: 790
Joined: Mon Jun 04, 2012 1:25 am

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by colocolo »

Ultimately, Innovega's ultra-near-focus contact lenses could be a solution, though that wouldn't be a light-field display anymore. Innovega foresees a large customer market, since there are already 100 million contact lens wearers and the lenses can be produced very cheaply.
I don't know - does the smallest feature size you can see also shrink if you can focus on nearer objects?
The eye has a resolution of about 40 µm, but pixels in microdisplays are already smaller. I assume the smallest perceivable feature size also shrinks; it's not that your retina suddenly perceives fewer signals.
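To put rough numbers on that, assuming the textbook figure of about 1 arcminute of angular resolution for the eye (an assumption, not a measurement of anyone's eye):

Code: Select all

import math

# The smallest resolvable feature scales with viewing distance at a fixed
# ~1 arcminute of angular resolution.
arcmin = math.radians(1 / 60)          # ~2.9e-4 rad

for d_cm in (100, 25, 10, 3):          # 25 cm is about the normal near point
    feature_um = (d_cm / 100) * arcmin * 1e6
    print(f"at {d_cm:>3} cm you resolve ~{feature_um:.0f} um")

# -> ~291 um at 1 m, ~73 um at 25 cm, ~29 um at 10 cm, ~9 um at 3 cm.
# So yes: if the optics let you focus closer (or emulate it), smaller
# pixels do become worth having.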

EDIT: When it comes to acceptance of near-focus contact lenses, a tiny metallic plate on the lens could help people put them in and take them out.
Rubber tweezers with a small, flat magnetic tip could grab the plate, making insertion and removal easy. :idea:
Since the lenses would only be for gaming, that could be a real help in spreading the technology.
Bishop51
Binocular Vision CONFIRMED!
Posts: 243
Joined: Thu Sep 22, 2011 11:05 am
Location: Vancouver Island

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by Bishop51 »

I had the opportunity to demo this at SIGGRAPH while I was there to talk about VR development challenges. I can safely say that Oculus is in no immediate danger :) The tech is cool and has great potential, but it's a long way from having the bugs worked out.
android78
Certif-Eyable!
Posts: 990
Joined: Sat Dec 22, 2007 3:38 am

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by android78 »

Bishop51 wrote:I had the opportunity to demo this at SIGGRAPH while I was there to talk about VR development challenges. I can safely say that Oculus is in no immediate danger :) The tech is cool and has great potential, but it's a long way from having the bugs worked out.
I'd say you're right at this point; however, I imagine this is the direction things will end up going once the computing power and more efficient rendering techniques are developed. Part of the problem is that even if we get displays with high enough resolution, current rendering techniques are still optimized for rendering to a single flat 2D plane. As an interim step you could likely get reasonable re-projection using the depth buffer, but there should be a much more efficient way.
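To show what I mean by depth-buffer re-projection, here's a bare-bones sketch (pinhole camera, forward warping, no hole filling - not any particular engine's API):

Code: Select all

import numpy as np

def reproject(color, depth, fx, fy, cx, cy, baseline_x):
    """Warp one rendered view (color + depth) into a camera shifted
    sideways by baseline_x, instead of re-rendering the scene."""
    h, w = depth.shape
    out = np.zeros_like(color)
    ys, xs = np.mgrid[0:h, 0:w]
    z = depth
    x = (xs - cx) * z / fx                                     # back-project to camera space
    y = (ys - cy) * z / fy
    u = np.round((x - baseline_x) * fx / z + cx).astype(int)   # re-project into the new view
    v = np.round(y * fy / z + cy).astype(int)
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h) & (z > 0)
    out[v[ok], u[ok]] = color[ys[ok], xs[ok]]
    return out   # the holes (disocclusions) are where it falls apart

# toy usage
color = np.random.rand(120, 160, 3)
depth = np.full((120, 160), 2.0)
shifted = reproject(color, depth, fx=200, fy=200, cx=80, cy=60, baseline_x=0.06)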
colocolo
Diamond Eyed Freakazoid!
Posts: 790
Joined: Mon Jun 04, 2012 1:25 am

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by colocolo »

2560x1600 is getting closer and closer to 7-inch panels. :D
http://www.engadget.com/2013/08/16/benc ... h-2-560-x/
xef6
One Eyed Hopeful
Posts: 41
Joined: Sat Nov 03, 2012 11:41 pm

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by xef6 »

http://math.ucr.edu/home/baez/physics/Quantum/see_a_photon.html wrote:They found that about 90 photons had to enter the eye for a 60% success rate in responding.
http://www.oculist.net/downaton502/prof/ebook/duanes/pages/v1/v1c033.html#vis wrote: The detection threshold was found to correspond to approximately 50 to 150 photons striking the cornea. Of these, about 50% are absorbed and reflected by the ocular media, leaving 25 to 75 photons to strike the retina. Only about 20% of these photons are absorbed by the rhodopsin, thus only 5 to 15 photons are left to excite vision.
geekmaster wrote:I agree. Studies show that a dark-adapted human eye can detect a single photon, no matter how tiny or how distant the source of light. This has nothing to do with object size or eye focus.
This wording seems a little misleading to me :?
android78
Certif-Eyable!
Posts: 990
Joined: Sat Dec 22, 2007 3:38 am

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by android78 »

@xef6 - the point is that the eye's detection has little to do with size; it comes down to the number of photons received. Angular resolution is of little consequence.
xef6
One Eyed Hopeful
Posts: 41
Joined: Sat Nov 03, 2012 11:41 pm

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by xef6 »

@android78 - I agree that it's about the number of photons received; I was thinking more about the photon counts than the angular stuff. The point I was trying to make with the quotes is that the cornea and the rest of the ocular media reflect/absorb a significant percentage of the light, so there generally need to be far more photons coming from the source for even one to reach the retina (assuming the source isn't inside the eye). Not that it matters much, since there are usually bajillions of photons flying around anyway.
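Running the mid-range figures from the Duane's excerpt quoted above (their numbers, not mine):

Code: Select all

photons_at_cornea = 100                        # mid-range of the quoted 50-150
past_ocular_media = photons_at_cornea * 0.5    # ~50% absorbed/reflected on the way in
exciting_vision = past_ocular_media * 0.2      # ~20% of those absorbed by rhodopsin
print(exciting_vision)                         # ~10 photons actually doing the "seeing"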

I probably could have been more clear in my post. Sorry to pick nits.
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by geekmaster »

xef6 wrote:
geekmaster wrote:I agree. Studies show that a dark-adapted human eye can detect a single photon, no matter how tiny or how distant the source of light. This has nothing to do with object size or eye focus.
This wording seems a little misleading to me :?
Not misleading - clarifying and simple. A single photon can be perceived by the eye itself (if it strikes an active site on the retina). However, not every photon is perceived: roughly 90 percent of the photons that reach the eye are never absorbed by the photoreceptors. On top of that, the visual centers of the brain try to filter out neural misfires by requiring a sufficient photon rate before bringing anything to your conscious perception. Such perceptual filters can be adjusted through attentiveness and are subjective, being more sensitive when you are actively looking for a flash than when a stray photon has to grab your attention while it is diverted elsewhere. Entire books have been written on visual perception, but the point of my post was partly to refute the claim that sensitivity to photons depends on how far they travelled before reaching the eye. Such simplification on my part is not misleading at all, IMHO, considering that posting too much information can lead to "TL;DR" syndrome...

If you really do want the long answer, you can start with the Wikipedia article on visual phototransduction:
http://en.wikipedia.org/wiki/Phototransduction
Notice especially that different photoreceptor behavior is used depending on whether the eyes are "in the light" or "in the dark" (i.e. dark adapted, as I said in the rest of my previous post). The key is that you CAN see a single photon, but depending on what mode your eye is operating in, you may need a continuous stream of photons to distinguish light and dark areas (contrast) from each other.
Last edited by geekmaster on Sun Aug 18, 2013 1:50 pm, edited 1 time in total.
TomFahey
One Eyed Hopeful
Posts: 12
Joined: Wed Mar 20, 2013 12:29 pm

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by TomFahey »

Am I the only one who's more impressed with this technology than with the Oculus Rift? As commendable as the Rift is, I think its basic design approach is somewhat inferior to Nvidia's glasses. Yes, there is the problem of limited resolution, but that isn't a fundamental one, as display technology is constantly evolving and accelerating. The Rift, on the other hand, while it does offer superior resolution, has the fundamental problem of the disparity between 3D depth cues and the display's fixed focus, which the Nvidia rep mentioned - something the Rift doesn't try to tackle. I think Nvidia's solution is very elegant, and I'd like to see more from them with this, especially as display technology progresses - although for now I must admit the Rift has the advantage of being a usable product, albeit with an inferior design.
geekmaster
Petrif-Eyed
Posts: 2708
Joined: Sat Sep 01, 2012 10:47 pm

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by geekmaster »

TomFahey wrote:Am I the only one who's more impressed with this technology than with the Oculus Rift? As commendable as the Rift is, I think its basic design approach is somewhat inferior to Nvidia's glasses. Yes, there is the problem of limited resolution, but that isn't a fundamental one, as display technology is constantly evolving and accelerating. The Rift, on the other hand, while it does offer superior resolution, has the fundamental problem of the disparity between 3D depth cues and the display's fixed focus, which the Nvidia rep mentioned - something the Rift doesn't try to tackle. I think Nvidia's solution is very elegant, and I'd like to see more from them with this, especially as display technology progresses - although for now I must admit the Rift has the advantage of being a usable product, albeit with an inferior design.
If you follow the forums at OculusVR, you will see that you are not alone in your interest in this technology. Especially impressive are the DIY aspects presented here:
https://developer.oculusvr.com/forums/v ... 049#p35589

I plan to replicate the DIY method presented there for my own light-field HMD (although that post used it for a camera lens).
hast
Cross Eyed!
Posts: 100
Joined: Mon Mar 15, 2010 8:16 am

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by hast »

TomFahey wrote:Am I the only one who's more impressed with this technology than with the Oculus Rift? As commendable as the Rift is, I think its basic design approach is somewhat inferior to Nvidia's glasses...
I think this tech is really cool, but I really don't think it's competing with the Oculus. If anything I think the two will make for an awesome combination sometime in the future when light-field displays are more viable.

In many ways it's a continuation of the idea behind the Rift's LEEP optics. Instead of trying to make a perfect lens, you pre-warp the generated image so that the lens warps it back into something that looks right. Microlens arrays take that idea to the next step and get even more out of it (i.e. LEEP gave us a large FOV, microlenses give us a proper light field).
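For anyone who hasn't looked at how the pre-warp works: it's basically a radial distortion of the texture coordinates that the lens then undoes. A sketch of the standard polynomial form - the coefficient values here are illustrative, not taken from any SDK:

Code: Select all

# Barrel pre-distortion: for each output pixel, sample the rendered image
# further from the lens center so the lens's pincushion distortion cancels out.
K = (1.0, 0.22, 0.24, 0.0)       # illustrative radial distortion coefficients

def prewarp(u, v, cx=0.5, cy=0.5):
    dx, dy = u - cx, v - cy
    r2 = dx * dx + dy * dy
    scale = K[0] + K[1] * r2 + K[2] * r2**2 + K[3] * r2**3
    return cx + dx * scale, cy + dy * scale   # where to sample the rendered image

print(prewarp(0.9, 0.5))   # samples slightly further out than it writes,
                           # squeezing the edges of the rendered image inward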

But microlens arrays require even higher-resolution screens than the normal LEEP optics in the Rift, and quite a bit more work from the GPU (apparently they used a top-end Titan to render even a very simple static 3D model). So the tech will likely have to mature a bit more before you see it on shelves.

Perhaps just in time for the consumer Rift 2 (or 3).
TomFahey
One Eyed Hopeful
Posts: 12
Joined: Wed Mar 20, 2013 12:29 pm

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by TomFahey »

Well, to be fair, we don't have a release date for the Rift yet, do we? If they take their time too much, they might well find themselves made obsolete by competitors like Nvidia that jump in - especially given LG's (very) recent release of a QHD 5.5in display, with a pixel density of 538ppi.

hast wrote:But microlens arrays require even higher-resolution screens than the normal LEEP optics in the Rift, and quite a bit more work from the GPU (apparently they used a top-end Titan to render even a very simple static 3D model). So the tech will likely have to mature a bit more before you see it on shelves.
See, I'm unsure about this - are you sure it wasn't just a plug for their flagship GPU? Unless they were trying to make the point that light-field rendering is an easy task on their own proprietary system, there'd be no reason for them to demo it on something as modest as a GTX 660 - so why not flash the big guns for the show? I have no idea about the requirements for rendering a light field though, so, please, someone correct me on this if I'm wrong.
geekmaster wrote:I plan to replicate the DIY method presented there for my own light-field HMD (although that post used it for a camera lens).
A possibility that interests me is variable-refraction materials that change their refractive index with an applied voltage - I can envisage something similar to an LCD layer, except that it bends light rather than emitting it as a particular colour. Also of interest - although this is a bit "out there" - is optoelectronics:

"http://www.gizmag.com/synthetic-magneti ... tal/25261/"

^ This is probably more an "interest" topic, but it sure would be helpful on a practical scale! :P
hast
Cross Eyed!
Posts: 100
Joined: Mon Mar 15, 2010 8:16 am

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by hast »

TomFahey wrote:Well, to be fair, we don't have a release date for the Rift yet, do we?
To be really fair, this is just a thesis project at Nvidia. They haven't even said whether they intend to turn it into a product.

And even if they did turn it into a product, it still wouldn't be a direct competitor to the Rift, since this is only the display technology - head tracking and all that stuff would still need to be added.

But it should be an excellent complementary technology to the Rift once the resolution of screens increases.
TomFahey wrote:especially given LG's (very) recent release of a QHD 5.5in display, with a pixel density of 538ppi.
I don't think a "QHD" panel (2560x1440) is anywhere near a high enough resolution for a light-field screen.

The Lytro camera uses the same idea (but as a camera). While they haven't said exactly what resolution the sensor is, it seems reasonable to assume that their "11 megaray" sensor is 11 megapixels. The resulting image is 1.2 megapixels. The reason is that a lot of the captured information goes unused for any one image, although the resulting image is more "accurate" than normal.

Assuming that 1080p is good enough for the Rift, it seems reasonable to assume that a light-field display with the same perceived resolution would need at least 4x the spatial resolution per axis - so at least an 8k display. So it's most likely a bit off.
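Rough arithmetic behind that guess (the Lytro figures and the 4x-per-axis assumption are the ones stated above):

Code: Select all

lytro_sensor_mp = 11.0
lytro_output_mp = 1.2
per_axis_loss = (lytro_sensor_mp / lytro_output_mp) ** 0.5
print(per_axis_loss)              # ~3x per axis for Lytro's lenslet array

target = (1920 * 4, 1080 * 4)     # 4x per axis on top of 1080p, to be safe
print(target)                     # (7680, 4320), i.e. an "8k" panel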

Once display resolution is high enough for light fields, it will most likely have passed the point of worthwhile returns for LEEP optics. So this is an excellent way to keep improving the experience.
TomFahey wrote:
But microlens arrays require even higher-resolution screens than the normal LEEP optics in the Rift, and quite a bit more work from the GPU (apparently they used a top-end Titan to render even a very simple static 3D model). So the tech will likely have to mature a bit more before you see it on shelves.
See, I'm unsure about this - are you sure it wasn't just a plug for their flagship GPU?... I have no idea about the requirements for rendering a light field though, so, please, someone correct me on this if I'm wrong.
Light field rendering of this type works by rendering a grid of images. Looking at some of the videos from their prototype it seems like they are rendering something like a 14x7 grid of views - that's almost 100 viewports. With stereo rendering you get approximately a 40% processing increase for one additional viewport, and it seems reasonable that they can use a lot of tricks to amortize that cost much better over additional viewports, but I do think it takes quite a bit more processing power than normal rendering. (And this is on top of the problem that you will need a much larger display; most cards today would have a hard time rendering to even a 4k screen at high FPS, not to mention 8k.)
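In rough Python, the rendering loop would look something like this - the 14x7 grid is my estimate from the videos, and the tile layout, eye-box width and render_view() stand-in are made up for illustration:

Code: Select all

GRID_X, GRID_Y = 14, 7            # ~98 viewports per eye
TILE_W, TILE_H = 146, 78          # per-view spatial resolution (prototype figures)
EYE_BOX = 0.010                   # assumed eye-box width in metres

def render_light_field(render_view):
    """Draw one slightly offset camera per view into its own tile of a
    single render target."""
    tiles = []
    for gy in range(GRID_Y):
        for gx in range(GRID_X):
            ox = (gx / (GRID_X - 1) - 0.5) * EYE_BOX    # offset camera across the eye box
            oy = (gy / (GRID_Y - 1) - 0.5) * EYE_BOX
            viewport = (gx * TILE_W, gy * TILE_H, TILE_W, TILE_H)
            tiles.append(render_view(camera_offset=(ox, oy), viewport=viewport))
    return tiles                   # ~98 draws per eye per frame -- hence the Titan

# toy usage with a dummy renderer
frame = render_light_field(lambda camera_offset, viewport: (camera_offset, viewport))
print(len(frame))                  # 98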
colocolo
Diamond Eyed Freakazoid!
Posts: 790
Joined: Mon Jun 04, 2012 1:25 am

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by colocolo »

hast wrote:Light field rendering of this type works by rendering a grid of images. Looking at some of the videos from their prototype it seems like they are rendering something like a 14x7 grid of views - that's almost 100 viewports. With stereo rendering you get approximately a 40% processing increase for one additional viewport, and it seems reasonable that they can use a lot of tricks to amortize that cost much better over additional viewports, but I do think it takes quite a bit more processing power than normal rendering. (And this is on top of the problem that you will need a much larger display; most cards today would have a hard time rendering to even a 4k screen at high FPS, not to mention 8k.)

Concerning processing power: I asked that question of Samuel Lapere, the author of raytracey.blogspot.co.nz. He said that light-field tile rendering should take less power than rendering one single image at the same resolution, because of the coherence of light rays in a scene.
Also, rendering at quadruple the resolution of a normal 2D image takes only 3-3.5x the rendering power.
According to him the Brigade engine is making great strides: they recently overhauled the engine almost completely and it is now as good as Octane Render. :woot (Have you seen Octane images?)
They have found a clever way to greatly reduce noise without losing image quality, and Brigade 3 is now almost noise-free compared to Brigade 2.
hast
Cross Eyed!
Posts: 100
Joined: Mon Mar 15, 2010 8:16 am

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by hast »

colocolo wrote:Concerning processing power: I asked that question of Samuel Lapere, the author of raytracey.blogspot.co.nz. He said that light-field tile rendering should take less power than rendering one single image at the same resolution, because of the coherence of light rays in a scene.
Also, rendering at quadruple the resolution of a normal 2D image takes only 3-3.5x the rendering power.
That's nice... But I think we'll see 4k and 8k screens before raytracing takes over. :-)

Today you need pretty much a top of the line card to get good FPS on a 4k screen. And that's without any additional processing demands from extra viewports.
Kamus
Two Eyed Hopeful
Posts: 85
Joined: Thu Mar 05, 2009 9:14 pm

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by Kamus »

hast wrote: That's nice... But I think we'll see 4k and 8k screens before raytracing takes over. :-)

Today you need pretty much a top of the line card to get good FPS on a 4k screen. And that's without any additional processing demands from extra viewports.
Agreed, 4K is here now, and it's not even that expensive if you go the HDTV route (although those TVs' lack of a 60 Hz input really sucks).

8K isn't that far off... I think in about 2-3 more generations we'll have GPUs capable of driving the displays. But I'm not so sure about display availability.
colocolo
Diamond Eyed Freakazoid!
Posts: 790
Joined: Mon Jun 04, 2012 1:25 am

Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl

Post by colocolo »

Kamus wrote:
hast wrote: That's nice... But I think we'll see 4k and 8k screens before raytracing takes over. :-)

Today you need pretty much a top of the line card to get good FPS on a 4k screen. And that's without any additional processing demands from extra viewports.
Agreed, 4K is here now, and it's not even that expensive if you go the HDTV route (although those TVs' lack of a 60 Hz input really sucks).

8K isn't that far off... I think in about 2-3 more generations we'll have GPUs capable of driving the displays. But I'm not so sure about display availability.
Perhaps CG-Silicon technology will bring new displays. According to a Sharp representative it allows much higher pixel densities than IGZO because it triples the carrier mobility relative to LTPS.
Remember, that was said after they revealed their 6.1-inch 2560x1600 IGZO display.
http://www.theverge.com/2012/10/2/34419 ... cg-silicon
http://www.sharpmicro.com/Page.aspx/ame ... f0701dbbb6
And LTPS technology allowed Japan Display to build a 2.3-inch 651 ppi panel (so a 6.9-inch version would be around 9 megapixels).