NVIDIA's Light-field Glasses Prototype. Oculus in trouble ?
-
- Binocular Vision CONFIRMED!
- Posts: 228
- Joined: Thu Nov 15, 2012 2:12 pm
NVIDIA's Light-field Glasses Prototype. Oculus in trouble ?
Can't believe no one's posted about this yet. Is Oculus in trouble? Discuss.
http://www.youtube.com/watch?v=deI1IzbveEQ
- blazespinnaker
- Certif-Eyed!
- Posts: 541
- Joined: Sat Sep 01, 2012 11:53 pm
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Maybe you could start off by explaining why you think Oculus is in trouble.
That being said, there is likely some interesting R&D to be done in the realm of the screen itself. This goes more to my theme that the intellectual property is in the underlying components rather than the mere assembling of them.
Last edited by blazespinnaker on Tue Jul 23, 2013 5:53 pm, edited 1 time in total.
Gear VR: Maybe OVR isn't so evil after all!
- android78
- Certif-Eyable!
- Posts: 990
- Joined: Sat Dec 22, 2007 3:38 am
- colocolo
- Diamond Eyed Freakazoid!
- Posts: 790
- Joined: Mon Jun 04, 2012 1:25 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
There is an easy way to overcome this problem.
Make Oculus Rift more comfortable.
Who cares about wearing a shoebox on the head? It only needs to be light (around 200 grams) and comfortable like normal glasses. All in reach for Oculus VR.
No need to worry.
- blazespinnaker
- Certif-Eyed!
- Posts: 541
- Joined: Sat Sep 01, 2012 11:53 pm
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
I've never heard anyone complain about the OR's ergonomics.
Gear VR: Maybe OVR isn't so evil after all!
-
- Certif-Eyed!
- Posts: 661
- Joined: Sun Mar 25, 2012 12:33 pm
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
They don't hate Oculus; the guys at the Nvidia booth seemed to love Oculus.
I tried it, and it could be a very synergistic technology if they choose. Lightweight lenses, the ability to adjust focal information and eye prescription at any point on the screen IN SOFTWARE, etc. make these screens very tantalizing. The main issue is that it utterly decimates resolution. You need such an overabundance of pixels in such a small space that it's not even feasible. To give you an idea: the Sony microdisplay OLEDs have well over 2,000 PPI and still look low resolution with this microlens array on top. No screen door, though.
- android78
- Certif-Eyable!
- Posts: 990
- Joined: Sat Dec 22, 2007 3:38 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
The advantage of the light field is that you don't have a fixed focal plane. This is something that isn't achievable by the OR without either a complex dynamic lens system or a microlens array similar to what Nvidia is showing here.

colocolo wrote:There is an easy way to overcome this problem.
Make Oculus Rift more comfortable.
Who cares about wearing a shoebox on the head. It only needs to be light(200gramm) and comfortable like normal glasses. All in reach for Oculus VR.
No need to worry.
-
- Cross Eyed!
- Posts: 141
- Joined: Thu Sep 02, 2010 3:08 pm
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
I think this could be a cool tech... at some point. I'm guessing a few years off.
But it did seem like the resolution is lower, and I think it's a bit more complicated to handle the images.
Currently it only seems to be better in weight. It still needs custom software, a head tracker, etc. But it would be great to see good competition.
I just hope the drivers could be used for all HMDs at some point, so games don't have to rewrite their code to handle multiple HMDs when there is competition.
-
- Binocular Vision CONFIRMED!
- Posts: 228
- Joined: Thu Nov 15, 2012 2:12 pm
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Sure. Simply because Nvidia is HUGE. Everyone already knows Nvidia. If Nvidia gets the ball rolling and develops an HMD that surpasses the Rift, what's going to stop people from getting that? Nvidia can easily ship out their new product and have it in every retail outlet across the world. Oculus can't. I realize what is shown in this video can't technically be compared to the Rift... yet... but I'm just saying: don't count your chickens, kind of thing.

blazespinnaker wrote:Maybe you could start off by explaining why you think Oculus is in trouble.
That being said, there is likely some interesting R&D to be done in the realm of the screen itself. This goes more to my theme that the intellectual property is in the underlying components rather than the mere assembling of them.
If you ask me, it would be in Oculus' best interest to just sell the whole damn thing to someone who's going to really focus on making the product user friendly (i.e. just put it on your head and play, NO configuration), and then it will really take off.
-
- Two Eyed Hopeful
- Posts: 68
- Joined: Tue Jan 22, 2013 4:01 pm
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
I think the OR is extremely uncomfortable. It's so tight that it blocks the blood circulation in my face, and my nose doesn't fit properly in the slot, so it scratches along the sharp edge of the plastic.

blazespinnaker wrote:I've never heard anyone complain about the OR's ergonomics.
-
- Cross Eyed!
- Posts: 117
- Joined: Fri May 17, 2013 6:49 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Anything that Nvidia could come up with would fall short: not good enough, sort of too late to the party, not really 360 degrees of view, more like 35-40°. Even Sony could beat Nvidia's glasses with a 3rd-gen HMD, but they have some sense since experiencing the Rift. I don't even like Nvidia's 3D glasses; even THAT they couldn't get right the first time. The screen is too dark; the 2nd gen is less dark but still DARK, and you need a newer display to support it. Outrageous!! Overexpensive hardware, like that controller-screen platform: gimmicky, if you ask me.
Oculus would be in trouble if someone pirated the tech down to the last detail and even improved upon it. But if that were the case, Oculus wouldn't have gained this much support in less than a year, massively changing the world in this short time. Big companies are big, but hugely incompetent: 15,000 dollars for a 40° OLED screen, need I say more?!
- colocolo
- Diamond Eyed Freakazoid!
- Posts: 790
- Joined: Mon Jun 04, 2012 1:25 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Yeah, they are extremely incompetent because of the tailcoat-wearing economy chimpanzees leading the company. Luckily, Oculus VR consists only of tech heads, and that will make the big difference in the end.

Ryuuken24 wrote:Anything that Nvidia could come up with would be short handed, not good enough, sorta too late for the party, not really 360 degrees of view, more like 35-40*. Even Sony could beat Nvidia glasses with a 3rd gen HMD, but they have some sense since experiencing the Rift. I don't even like Nvidia's 3D glasses, even THAT they couldn't get right the first time, the screen is too dark, the 2nd gen, less darker but still DARK, and you need a newer display to support it, outrageous!! So over expensive hardware, like that controller screen platform, gimmicky, if you ask me.
Oculus would be in trouble if someone pirates the tech down to the last detail, and even improves upon it, then it would be in trouble but, if that was the case, Oculus wouldn't have to be supported in less than 1 year, massively changing the world in this short time. Big companies are big, but hugely incompetent, 15,000 dollars for OLED 40* screen, do I need say more?!
- TheHolyChicken
- Diamond Eyed Freakazoid!
- Posts: 733
- Joined: Thu Oct 18, 2012 3:34 am
- Location: Brighton, UK
- Contact:
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Very cool, but Oculus is in no "trouble" here. It's just a tech demonstration, and a long, LONG way from being an actual product (if it ever becomes one). The tech does have a lot of benefits, but the downsides are considerable: despite the immense rendering horsepower required, the final image is still very low resolution (using 2x 720p panels). They say it is "wide FOV", but I couldn't see any actual numbers for comparison. And let's not forget head tracking, absent from this presentation, which is extremely important; the work required to create a reliable, fast tracker with good prediction to reduce perceived latency is considerable.
If anything, it's great for Oculus that there are others in the industry helping to push tech forwards.
Sometimes I sits and thinks, and sometimes I just sits.
-
- One Eyed Hopeful
- Posts: 46
- Joined: Sun Feb 03, 2013 4:12 pm
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Yeah no trouble here, just some very interesting research.
-
- Two Eyed Hopeful
- Posts: 85
- Joined: Thu Mar 05, 2009 9:14 pm
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
zalo wrote:To give you an idea: the Sony microdisplay OLEDs have well over 2,000 PPI and still look low resolution with this microlens array on top.
PPI is an irrelevant spec on anything that uses lenses to magnify the image, because the lenses effectively stretch the pixels over the apparent size of the virtual screen. Say you have a VR display like that one, which makes it seem like you are staring at a 60" screen: you would then have to measure the PPI as if it were a 60" screen at 720p (the resolution of the OLEDs they are using), which turns out to be... wait for it... roughly 24.5 PPI instead of the native 2,000 you get with the naked eye.
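The arithmetic behind that figure is just pixels along the diagonal divided by the diagonal length. A quick sketch (the 60" apparent size is this post's assumption, not a measured spec of the prototype):

```python
import math

# Effective PPI of a magnified display: the lenses stretch the panel's
# pixels over the apparent (virtual) screen size, so the PPI that matters
# is the panel's diagonal pixel count divided by the virtual diagonal.
def effective_ppi(h_px, v_px, virtual_diagonal_in):
    return math.hypot(h_px, v_px) / virtual_diagonal_in

# A 720p panel viewed as an apparent 60" screen:
print(round(effective_ppi(1280, 720, 60), 1))  # 24.5
```

Halving the apparent size doubles the effective PPI, which is why the same microdisplay looks razor sharp to the naked eye and soft behind magnifying optics.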
Also, you mentioned the lack of SDE. While I find it hard to believe that you wouldn't notice any of it, it seems to me you are confusing PPI with pixel fill factor (read: the gaps BETWEEN the pixels).
Historically, LCD technology has always had a poor pixel fill factor compared to other technologies. I don't know what the fill factor on those OLED microdisplays is, but I wouldn't be surprised at all if it were higher than your average LCD panel / microdisplay.
So just to be clear: it's a mistake to correlate resolution directly with SDE, since you could have pixels with no gaps at all, and therefore no SDE whatsoever, yet still have to deal with a soft image if the display were low resolution.
The reason the Oculus Rift has so much noticeable SDE is that the pixels are gigantic, since the panel covers your whole FOV.
I wouldn't be surprised if you could still notice SDE on a 4K panel in an Oculus Rift, since the pixels would still be HUGE blown up to fill your entire FOV. I'm thinking we would need something close to IMAX film resolution to truly have invisible pixels across our entire FOV.
That is one reason; the other, of course, is that the gaps between pixels on LCDs at the resolution the Rift uses are also gigantic, since LCD technology has more space between pixels than, say, LCoS. It would take a higher resolution for the SDE to become less noticeable on LCD compared to LCoS or TI's DMD. I do not know what the fill factor is on OLED technology, though.
- Dilip
- Certif-Eyed!
- Posts: 692
- Joined: Sat Dec 13, 2008 9:23 am
- Location: Ahmedabad//INDIA
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Kamus wrote:I'm thinking we would need something close to IMAX film resolution to truly have invisible pixels on our entire FOV.

Or maybe a slightly different approach: to fill the entire FOV, is it not possible to use a curved display with bare-minimum magnification power, kept nearest to the eye, making a slim-form-factor HMD?
What could be the benefits of this approach? Some are:
1) Keeping the magnification power low will reduce the size of the projected pixels, while a curved display can still cover the entire horizontal FoV
2) There may be no need to use warping at all, as there won't be much distortion
3) All TriDef/Nvidia middleware would work on all existing games out of the box, as they all do SBS already
4) I also have the idea of using a centre-heavy 3D approach with such a display, with 30 to 35% depth at the centre and 5 to 2% around both sides, so that the HUD can be accommodated, making a realistic picture (we see minimum to no depth in the corners of our eyes in reality anyway)
All we need is a bendable display, which is in "PRE-BETA", but why can't it be done, if one could arrange such a screen?
If anyone thinks otherwise, please explain to me why it won't work technically.
- colocolo
- Diamond Eyed Freakazoid!
- Posts: 790
- Joined: Mon Jun 04, 2012 1:25 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
The nearest approach to your eyes is already what the Rift does, I think. Try to hold something close to your eyes: you will notice that 12-15 cm is the closest you can get.

Dilip wrote:Or may be little diffrenet approch, to fill entire fov, is it not possible to use curved display with bare minimum magnification power and keeping them nearest to eye making slim factor hmd?
Kamus wrote:I'm thinking we would need something close to IMAX film resolution to truly have invisible pixels on our entire FOV.
What could be the benfits of this approch? some are..
1) keeping low magnification power will reduce size of projected pixel still curved dispaly can cover entire horizontal FoV
2) May be there will be no need at all to use wrapping as there wont be much distortion
3) All Tridef,Nvidia middleware work on all existing game out of box as they all doing SBS already
4) I also have idea of using Centre Heavy 3D Apporch with such display where
depth 30 to 35% at centre and 5 to 2% around both sides so that HUD can
be acomodated making realistic picture (as we do see minimum to no depth
in corner of our eyes in reality too anyway)
All we need is bendable display which is in "PRE-BETA" but why it can't be done, if one could arrange such screen?
If anyone think otherwise, please explain me why it won't work technically?
Innovega has invented cool contact lenses that let you focus beyond that limit.
http://www.youtube.com/watch?v=9bMd1kqKlN0
It would be so cool if Sony integrated their magnetic eye-tracking mechanism into Innovega's lenses.
Maybe in 5 years we'll already be wearing VR sunglasses.
- Ericshelpdesk
- Cross Eyed!
- Posts: 141
- Joined: Tue Mar 19, 2013 11:41 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
I like that the focus doesn't require much distance. Very cool piece of technology. Can't wait to see something like this in production.
If Nvidia and Oculus got together to make some 3D Vision HMD thing happen, I would be all over it if the price was right.
- colocolo
- Diamond Eyed Freakazoid!
- Posts: 790
- Joined: Mon Jun 04, 2012 1:25 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
WTF!
"The patent covers contact lens systems and intra-ocular lens technologies."
(http://innovega-inc.com/press_landmark.php)
intra-ocular lens means basically this:
http://www.youtube.com/watch?v=xTKLVDEQdfE
Common operation nowadays, an alternative to LASIK.
- android78
- Certif-Eyable!
- Posts: 990
- Joined: Sat Dec 22, 2007 3:38 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
colocolo wrote:the nearest approach to your eyes is already what the Rift does i think.

How do you figure that? The Rift still has you focusing on a fixed plane. Real life doesn't exist in a single plane; it has light being scattered or reflected off points at many different depths. Having a light field display simulates this, so it reproduces reality more effectively than a single flat panel does.
Remember that what the lens in your eye does is alter the angle of the light hitting each point on the surface of the lens, so that all the light from a point at a certain distance hits the same point on your retina. The light field display divides the panel into many segments, with each segment showing the rays at the correct angle for the position they would be coming from.
This is a (very rough) concept picture:
(attachment not available)
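To make the segment idea concrete, here is a tiny geometric sketch (all numbers and names are illustrative, not taken from the NVIDIA prototype): for a virtual point at some depth, each microlens lights the sub-pixel whose ray, traced through the lens centre, appears to come from that point.

```python
# Which sub-pixel behind each microlens should light up so that all the
# emitted rays appear to diverge from a single virtual point? With the
# panel one focal length f behind the lens plane, a lens at lateral
# position x needs a ray of slope (x - point_x) / depth, which maps to a
# sub-pixel offset of f * (x - point_x) / depth from the lens centre.
def subpixel_offsets_mm(lens_xs_mm, point_x_mm, point_depth_mm, f_mm):
    return [f_mm * (x - point_x_mm) / point_depth_mm for x in lens_xs_mm]

lenses = [-2.0, -1.0, 0.0, 1.0, 2.0]                 # lens centres, mm
far = subpixel_offsets_mm(lenses, 0.0, 500.0, 3.0)   # point 0.5 m away
near = subpixel_offsets_mm(lenses, 0.0, 100.0, 3.0)  # point 0.1 m away
# A nearer virtual point needs larger per-lens offsets, i.e. more
# strongly diverging rays; that varying divergence is what the eye
# focuses on, giving real accommodation cues.
assert all(abs(n) >= abs(fo) for n, fo in zip(near, far))
```

This is also why the tech trades resolution for depth: every lens needs a spread of sub-pixels under it just to encode the ray angles.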
-
- Cross Eyed!
- Posts: 154
- Joined: Mon Jan 21, 2013 5:26 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
I think it is great that we see even more research and development when it comes to HMDs; it can only be a good thing. It is even more exciting to see that Nvidia are not just researching, they are researching something that in the end could be a superior technology.
With the Oculus Rift, it will never look life-like using current technology, regardless of resolution, as there is no array of depth; it is just stereoscopic 3D. What Nvidia is working on, however, has the potential to actually give you some pretty life-like depth. It is very cool to see them research this, as it is the next step up from stereoscopic 3D and a step closer to the ultimate goal of holographic 3D.
- colocolo
- Diamond Eyed Freakazoid!
- Posts: 790
- Joined: Mon Jun 04, 2012 1:25 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
What? 3D perception isn't life-like? All reviewers of the Rift, including Oculus VR, stated that it looks like natural 3D...

Paladia wrote:I think it is great we see even more research and development when it comes to HMD, it can only be a good thing. It is even more exciting to see that Nvidia are not just researching, they are actually researching something that in the end could be a superior technology.
With Oculus Rift, it will never look life-like using current technology regardless of resolution as there is no array of depth, it is just stereoscopic 3D. What Nvidia is working on however has the potential to actually give you some pretty life-like depth. It is very cool to see them research this as it is the next step up from stereoscopic 3D and a step closer to the ultimate goal of holographic 3D.
-
- Cross Eyed!
- Posts: 154
- Joined: Mon Jan 21, 2013 5:26 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
It looks like 3D, just like a 3D movie. It won't look like real life, however, in the same way that you can tell whether you are looking at a 3D movie or at real life. There's the whole issue with focus that is mentioned in the video.

colocolo wrote:What? 3D perception isnt life-like? All reviewers of the Rift including Oculus VR stated that it looks like natural 3D...
Paladia wrote:I think it is great we see even more research and development when it comes to HMD, it can only be a good thing. It is even more exciting to see that Nvidia are not just researching, they are actually researching something that in the end could be a superior technology.
With Oculus Rift, it will never look life-like using current technology regardless of resolution as there is no array of depth, it is just stereoscopic 3D. What Nvidia is working on however has the potential to actually give you some pretty life-like depth. It is very cool to see them research this as it is the next step up from stereoscopic 3D and a step closer to the ultimate goal of holographic 3D.
It doesn't have vertical parallax (the display, anyway) or accommodation either. It also only has two points of reference, whereas the eye has a multitude. For example, close one eye and hold up a needle or a hair shaft in front of the other. You will be able to see "through" it, as the eye has numerous points of view and thus can see around it. This is also one of the reasons the world doesn't look 2D just because you close one eye, whereas with the Rift it becomes completely 2D if you close an eye.
There are numerous reasons why the Rift, or anything that uses ordinary stereoscopic 3D, won't look exactly like real life. You will always be able to tell that something is "off".
What Nvidia is working on isn't the ultimate solution (that would likely be holography), but it is a step up from what we have now, and it's a very exciting development.
- colocolo
- Diamond Eyed Freakazoid!
- Posts: 790
- Joined: Mon Jun 04, 2012 1:25 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Ah OK, now I get the point, after reading through various articles about advances in this tech.

Paladia wrote:It looks like 3D, just like a 3D movie. It won't look like real life however, in the same way that you are able to tell if you are looking at a 3D movie or if you are looking at real life. There's the whole issue with focus that is mentioned in the video.
colocolo wrote:What? 3D perception isnt life-like? All reviewers of the Rift including Oculus VR stated that it looks like natural 3D...
Paladia wrote:I think it is great we see even more research and development when it comes to HMD, it can only be a good thing. It is even more exciting to see that Nvidia are not just researching, they are actually researching something that in the end could be a superior technology.
With Oculus Rift, it will never look life-like using current technology regardless of resolution as there is no array of depth, it is just stereoscopic 3D. What Nvidia is working on however has the potential to actually give you some pretty life-like depth. It is very cool to see them research this as it is the next step up from stereoscopic 3D and a step closer to the ultimate goal of holographic 3D.
It doesn't have vertical parallax (the display anyway) or accommodation either. It also only has two points of reference whereas the eye has a multitude of references. For example, close one eye and hold up a needle or a hair shaft in front of the other. You will be able to see "through" it, as the eye has numerous points of views and thus can see around it. This is also one of the reasons the world doesn't look too 2D just because you close one eye, where as with the Rift it becomes completely 2D if you close an eye.
There are numerous reasons as to why the Rift or anything that uses ordinary stereoscopic 3D, won't look exactly like real life. You will always be able to tell that something is "off".
What Nvidia is working on isn't the ultimate solution, that would likely be holography, but it is a step up from what we have now and its a very exciting development.
Light field or hologram tech is the holy grail we are looking for. It makes much more sense.
Hopefully I won't be disappointed by the Rift after imagining all the stuff I want to experience.
-
- One Eyed Hopeful
- Posts: 7
- Joined: Sun Mar 17, 2013 7:12 pm
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
I haven't found the lack of depth of field per eye to make the experience any less "real". For example one of my favorite things to do in Half Life 2 was to walk up to a chain link fence and change my focus from the fence to the background buildings. Sounds stupid, but it's a far different experience than a simple stereoscopic movie. It actually feels like you could reach out and touch the fence. I don't think you'll be disappointed with the Rift. Just can't wait for better resolution.
-
- One Eyed Hopeful
- Posts: 16
- Joined: Wed Feb 27, 2013 7:35 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Well, just consider that when this technology comes to market in 10 years or so, we'll all be as impressed by it then as we are by the Rift now. Maybe.

colocolo wrote:ah ok, now i get the point after reading through various articles about advances in this tech.
Lightfield or hologramm tech is the holy grail we are looking for. Makes much more sense.
Hopefully i wont be dissapointed by the Rift after imagining all the stuff i want to experience.
- colocolo
- Diamond Eyed Freakazoid!
- Posts: 790
- Joined: Mon Jun 04, 2012 1:25 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
I wonder how many pixels you would need for an image with life-like resolution.
This prototype has around 100 sub-images, so the perceived resolution would be around a whopping 20,000 pixels. What would be the perfect sub-image count for the light field? Is it achievable at all?
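The back-of-the-envelope here is simply the panel's native pixel count shared among the sub-images; the exact figure depends on which panel resolution you assume. A sketch (a 1080p assumption lands near the 20,000 estimate; the 720p panels mentioned earlier in the thread give roughly half that):

```python
# Rough effective resolution of a lightfield display: the native pixels
# are divided among the N sub-images, so each "view" only gets 1/N of them.
def effective_pixels(native_px, sub_images):
    return native_px // sub_images

print(effective_pixels(1920 * 1080, 100))  # 20736
print(effective_pixels(1280 * 720, 100))   # 9216
```

Either way, the effective image is on the order of 10^4 pixels, which matches how low-resolution the demo reportedly looked.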
- android78
- Certif-Eyable!
- Posts: 990
- Joined: Sat Dec 22, 2007 3:38 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
I'm not sure how you figure that this would have a perceived resolution higher than the native resolution of the screen used. Because it uses multiple pixels to simulate a point (more like a blob) of light, the perceived resolution would be lower, but that would actually just mean it's always a blur, not that you would see pixels. I think the Lytro camera uses something like an 11-megapixel sensor, and by eye it would be hard to distinguish the point being focused on when it covers 10 deg of FOV. So for a 100 deg FOV maybe you'd need 1.1 billion pixels? Are we almost there yet?

colocolo wrote:i wonder how many pixel you would need for an image with life-like resolution.
this prototyope has around 100 sub images. So the perceived resolution would be around a whopping 20,000 pixels. What would be the perfect sublightfield-image amount? Is it achievable at all?
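The 1.1-billion figure is just area scaling: if ~11 MP covers a 10 deg patch acceptably, a 100 deg field needs (100/10)^2 times as many pixels. A sketch of that estimate (the 11 MP and 10 deg inputs are this post's assumptions):

```python
# Scale a pixel budget with the square of the FOV ratio: pixels cover
# area, so multiplying the linear FOV by k multiplies the count by k^2.
def scaled_pixel_budget(base_px, base_fov_deg, target_fov_deg):
    return base_px * (target_fov_deg / base_fov_deg) ** 2

print(int(scaled_pixel_budget(11_000_000, 10, 100)))  # 1100000000
```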
- colocolo
- Diamond Eyed Freakazoid!
- Posts: 790
- Joined: Mon Jun 04, 2012 1:25 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Did I say that the perceived resolution would be higher?

android78 wrote:I'm not sure how you figure that this would have a perceived resolution higher then the native resolution of the screen used. Because it uses multiple pixels to simulate a point (more like blob) of light, the perceived resolution would be lower, but that woul actually just mean its always a blur, not that you would see pixels. I think the Lytro camera uses something like an 11 megapixel sensor and from eye, it would be hard to distinguish the point being focused on when it is covering 10deg of fov. So for a 100 deg fov maybe you'd need 1.1 billion pixels? Are we almost there yet?
colocolo wrote:i wonder how many pixel you would need for an image with life-like resolution.
this prototyope has around 100 sub images. So the perceived resolution would be around a whopping 20,000 pixels. What would be the perfect sublightfield-image amount? Is it achievable at all?
No, sure, it will be lower.
Let's assume pixels will one day shrink to 1 micron (including the pixel gap) and production methods will become cheap enough that you can buy two big sunglass lenses (50 mm x 50 mm each, LCoS).
On one mm² you would have 1 MP, and in total 2500 x 1 MP per glass: 2.5 billion pixels.
I hope that would be enough to simulate reality perfectly.
Silicon Micro Electronics is already working on 8K4K microdisplays.
SSDs show us that at least large amounts of single-crystal silicon don't have to be that expensive.
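The pixel count in this scenario is straightforward to verify: a 1 µm pitch gives 1,000 pixels per mm, so 1 MP per mm², and a 50 mm glass has 2,500 mm². A quick check (the 1 µm pitch is this post's hypothetical, far beyond any shipping panel):

```python
# Pixels on one 50mm x 50mm glass at a given pixel pitch in microns.
def pixels_per_glass(glass_side_mm, pitch_um):
    px_per_side = glass_side_mm * 1000.0 / pitch_um  # pixels along one edge
    return px_per_side ** 2

print(int(pixels_per_glass(50, 1.0)))  # 2500000000
```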
-
- Cross Eyed!
- Posts: 154
- Joined: Mon Jan 21, 2013 5:26 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Even from a distance of a meter you can quite clearly see a human hair, which has a width of just 50 micrometers, and you can see an arc of ~2 meters at that distance. You'd need a resolution 40,000 pixels wide (40K) just for one eye. This is assuming that seeing a human hair at one meter is the extreme limit of human sight.

android78 wrote:I'm not sure how you figure that this would have a perceived resolution higher then the native resolution of the screen used. Because it uses multiple pixels to simulate a point (more like blob) of light, the perceived resolution would be lower, but that woul actually just mean its always a blur, not that you would see pixels. I think the Lytro camera uses something like an 11 megapixel sensor and from eye, it would be hard to distinguish the point being focused on when it is covering 10deg of fov. So for a 100 deg fov maybe you'd need 1.1 billion pixels? Are we almost there yet?
colocolo wrote:i wonder how many pixel you would need for an image with life-like resolution.
this prototyope has around 100 sub images. So the perceived resolution would be around a whopping 20,000 pixels. What would be the perfect sublightfield-image amount? Is it achievable at all?
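The 40K figure is straight division, sketched below; note the premise (that resolving a 50 µm hair at 1 m is a pure width-resolution question) is this post's own assumption, which the replies go on to dispute.

```python
# Horizontal samples needed so a 50-micrometer feature spans one pixel
# across a ~2 m visible arc at 1 m viewing distance.
def samples_across(arc_m, feature_m):
    return arc_m / feature_m

print(round(samples_across(2.0, 50e-6)))  # 40000
```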
-
- One Eyed Hopeful
- Posts: 6
- Joined: Wed Jun 26, 2013 9:41 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Methinks you see the hair because of its length and shape, not its width. If the "hair" were the same width and length, I am quite certain you would not see it at a meter.

Paladia wrote:Even from the distance of a meter, you can quite clearly see a human hair, which just has a width of 50 micrometers. Since you can see an arc of ~2 meters at that distance. You'd need resolution with a width of 40 000 just for one eye (40K). This is assuming that seeing a human hair at one meter is the extreme limit of human sight.
android78 wrote:I'm not sure how you figure that this would have a perceived resolution higher then the native resolution of the screen used. Because it uses multiple pixels to simulate a point (more like blob) of light, the perceived resolution would be lower, but that woul actually just mean its always a blur, not that you would see pixels. I think the Lytro camera uses something like an 11 megapixel sensor and from eye, it would be hard to distinguish the point being focused on when it is covering 10deg of fov. So for a 100 deg fov maybe you'd need 1.1 billion pixels? Are we almost there yet?
colocolo wrote:i wonder how many pixel you would need for an image with life-like resolution.
this prototyope has around 100 sub images. So the perceived resolution would be around a whopping 20,000 pixels. What would be the perfect sublightfield-image amount? Is it achievable at all?
-
- Cross Eyed!
- Posts: 154
- Joined: Mon Jan 21, 2013 5:26 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
The length in this case is not relevant, as one can clearly see the width of it. If the width were beyond what the human eye could resolve, it would not be visible.
Mazhurg wrote:Methinks you see the hair because of its length and shape, not its width. If the "hair" were the same length as its width, I am quite certain you would not see it at a meter.
As such, a perfect screen needs to be able to mimic a human hair at that distance, rendering it as thin as it actually is. If it cannot draw a line at least as thin as a human hair at that distance, it isn't perfect.
Heck, I can even spot white blood cells as they zip across the eye, and they are just a few micrometers in size. Though they are considerably closer than the hair in the one-meter example.
-
- One Eyed Hopeful
- Posts: 16
- Joined: Wed Feb 27, 2013 7:35 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Those are just bits of debris, not white blood cells.
Paladia wrote:Heck, I can even spot white blood cells as they zip across the eye, and they are just a few micrometers in size. Though they are considerably closer than the hair in the one-meter example.
Mazhurg wrote:Methinks you see the hair because of its length and shape, not its width. If the "hair" were the same length as its width, I am quite certain you would not see it at a meter.
-
- Cross Eyed!
- Posts: 154
- Joined: Mon Jan 21, 2013 5:26 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
No, with a blue background you can see white blood cells; it is called the blue field entoptic phenomenon. You see them zip along the same paths over and over again, and it works best when you are looking at a blue sky. They even look fairly large despite their tiny size.
comham wrote:Those are just bits of debris, not white blood cells.
Paladia wrote:Heck, I can even spot white blood cells as they zip across the eye, and they are just a few micrometers in size. Though they are considerably closer than the hair in the one-meter example.
Mazhurg wrote:Methinks you see the hair because of its length and shape, not its width. If the "hair" were the same length as its width, I am quite certain you would not see it at a meter.
- colocolo
- Diamond Eyed Freakazoid!
- Posts: 790
- Joined: Mon Jun 04, 2012 1:25 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Since the retina contains 150 million rod cells for black-and-white vision and only 6 million cone cells for color vision, would it be useful to use fewer RGB subpixels and more white OLED pixels for gigapixel panels?
EDIT: Ah, just wait for nanometer-scale QLEDs or embedded nanolasers that, in combination with waveguides and fancy holographic structures, create a life-like image.
I guess there is more to come.
- Likay
- Petrif-Eyed
- Posts: 2913
- Joined: Sat Apr 07, 2007 4:34 pm
- Location: Sweden
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
An addition to the discussion above: the eye's lens is a factor which limits the effective "pixel density" of the eye. An object, no matter how small it is, will affect several receptor cells because of this. So in practice there is absolutely no need for a display to match the eye's pixel density to give a similar experience.
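Likay's point about the lens can be put in numbers with the standard Rayleigh diffraction criterion. The wavelength and pupil diameter below are typical textbook values, not figures from this thread:

```python
import math

# Rayleigh criterion: smallest resolvable angle ~= 1.22 * wavelength / aperture.
# Both inputs are typical textbook values (assumed, not from the thread):
wavelength_m = 550e-9  # green light, near the eye's peak sensitivity
pupil_m = 3e-3         # ~3 mm pupil diameter in bright light

theta_rad = 1.22 * wavelength_m / pupil_m
print(f"Diffraction limit: ~{math.degrees(theta_rad) * 60:.2f} arcmin")  # ~0.77

# What that angle spans at 1 m -- anything thinner (like a 0.05 mm hair)
# is blurred across several photoreceptors rather than truly resolved:
print(f"Equivalent to ~{theta_rad * 1.0 * 1000:.2f} mm at 1 m")  # ~0.22 mm
```

So under these assumptions a 0.05 mm hair at 1 m sits several times below the diffraction limit, which supports the point that you see it through contrast rather than by resolving its width.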
- android78
- Certif-Eyable!
- Posts: 990
- Joined: Sat Dec 22, 2007 3:38 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
The whole issue of the minimum width the human eye can see gets confused because we are sensing photons, so even if you can't actually resolve something (as with a hair at 1 m), you will still perceive it because it blocks some of the photons (or reflects some, depending on the light source), showing a darker or lighter region. This doesn't mean you need that resolution to reach the maximum number of pixels the eye can possibly resolve. I believe the minimum size the eye can resolve is on the order of 1.2 mm at 1 m.
Take this to the extreme and work out the angular size of an average visible star in the night sky. For example, take a reasonably close star, Alpha Centauri: at 1.339 pc (which is 41.3751×10^12 km) and 1.227 solar radii (which is 8.533785×10^5 km), this gives a distance-to-size ratio of 4.8484×10^7:1. I'm not sure how small an angle that is (for comparison, it's equivalent to 1 mm at 48 km), yet you can still clearly see the star. Admittedly, the scattering of photons through Earth's atmosphere does dramatically increase the apparent size of the point of light reaching your eye, but it's still tiny nonetheless.
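For anyone who wants to redo android78's star arithmetic, here is a quick sketch using the figures quoted in the post (the "1 mm at 48 km" comparison is approximate):

```python
import math

# Figures as quoted in the post:
distance_km = 41.3751e12   # 1.339 parsecs
radius_km = 8.533785e5     # 1.227 solar radii

ratio = distance_km / radius_km
print(f"Distance-to-size ratio: {ratio:.4e} : 1")  # ~4.8484e+07 : 1

# Small-angle approximation for the angle the star's radius subtends:
angle_rad = radius_km / distance_km
print(f"Angular size: ~{math.degrees(angle_rad) * 3600:.4f} arcsec")  # ~0.0043

# Sanity check of the "1 mm at 48 km" comparison:
print(f"Same angle as 1 mm at ~{1e-3 / angle_rad / 1000:.1f} km")  # ~48.5 km
```

At a few thousandths of an arcsecond, the star is far below any resolution limit of the eye; it is visible purely because its photons arrive, which is exactly android78's point.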
-
- Petrif-Eyed
- Posts: 2708
- Joined: Sat Sep 01, 2012 10:47 pm
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
I agree. Studies show that the rods of a dark-adapted human eye can respond to a single photon, no matter how tiny or how distant the source of light. This has nothing to do with object size or eye focus.
android78 wrote:The whole issue of what is the minimum width the human eye can see gets confused because we are sensing photons, so even if you can't actually see something (as in a hair at 1m), you will perceive it because it will block some of the photons (or reflect some, depending on light source), showing a darker or lighter region. ...
http://math.ucr.edu/home/baez/physics/Quantum/see_a_photon.html wrote:The human eye is very sensitive but can we see a single photon? The answer is that the sensors in the retina can respond to a single photon. However, neural filters only allow a signal to pass to the brain to trigger a conscious response when at least about five to nine arrive within less than 100 ms. If we could consciously see single photons we would experience too much visual "noise" in very low light, so this filter is a necessary adaptation, not a weakness. ... It is possible to test our visual sensitivity by using a very low level light source in a dark room. The experiment was first done successfully by Hecht, Schlaer and Pirenne in 1942. They concluded that the rods can respond to a single photon during scotopic vision.
This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.
- android78
- Certif-Eyable!
- Posts: 990
- Joined: Sat Dec 22, 2007 3:38 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
Geekmaster agrees with me! *fistpump* Finally I'm validated!
- android78
- Certif-Eyable!
- Posts: 990
- Joined: Sat Dec 22, 2007 3:38 am
Re: NVIDIA's Light-field Glasses Prototype. Oculus in troubl
The other thing is that the receptors in the eye aren't just a single photon wide (what's the width of a photon anyway? Does an EM wave field have a physical size?) and are constantly receiving photons. So if you block 1/4 of the photons hitting a receptor, it will be perceived as a darkening of that region to 3/4 of the original brightness. This is essentially what happens when a hair 1 m away has something bright behind it.
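This blocking argument can be put in rough numbers. Assuming the eye's smallest resolvable patch at 1 m is about 0.3 mm wide (a typical one-arcminute acuity figure, not a number from the thread), a 50 µm hair crossing that patch blocks a noticeable fraction of its light:

```python
# Assumed figure (not from the thread): ~1 arcmin acuity, i.e. the smallest
# resolvable patch at 1 m is roughly 0.3 mm wide.
patch_width_mm = 0.3
hair_width_mm = 0.05  # the ~50 micrometer hair from earlier in the thread

# Fraction of the light in that patch the hair blocks:
blocked = hair_width_mm / patch_width_mm
print(f"Hair blocks ~{blocked:.0%} of the light in the patch")  # ~17%

# So the patch appears darkened to roughly this relative brightness:
print(f"Perceived relative brightness: ~{1 - blocked:.0%}")  # ~83%
```

A ~17% brightness dip along a line is an easily visible contrast, which would explain seeing the hair without the eye actually resolving its width.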