fingers and hand tracking technology
Posted: Thu Aug 29, 2013 12:08 pm
First you should read my "arm tracking technology" thread to see what basic idea I'm using here.
Basically, the idea is that one spot on the body has a sensor and receiver, and that sensor stays in one spot. This is like lighting up one spot on the ground in a dark room with one flashlight, where the flashlight never moves.
This sensor and receiver sit on the neck like a dog collar or oxen collar, held snug so it doesn't shift around when the person moves.
The sensor and receiver are right next to each other, so when the person bends down the flashlight doesn't move, to keep with the flashlight-in-a-dark-room analogy.
Now, the arm technology showed that from sensors you can build up a hierarchy that shows where the bones are. You have a sensor on the elbow and a sensor on the wrist; together these show where the forearm is, and the sensor on the elbow shows the humerus. The elbow and wrist sensors report to the receiver on the neck, the software sees these sensors and estimates the positions of the forearm and humerus bones, and then the software shows the forearm and humerus bones moving when the sensors move.
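To make that concrete, here's a minimal sketch of how the software might turn sensor readings into bone segments. All the coordinates and the shoulder offset are made-up numbers, and I'm assuming the readings come in already expressed relative to the fixed neck sensor:

```python
# Minimal sketch: estimate arm bone segments from sensor readings.
# Assumption: positions are (x, y, z) coordinates relative to the
# fixed neck sensor (the "still flashlight"), in centimeters.
import math

def bone(name, start, end):
    """Describe a bone segment between two tracked points."""
    return {"name": name, "start": start, "end": end,
            "length": math.dist(start, end)}

# Hypothetical readings relative to the neck sensor.
shoulder = (15.0, -5.0, 0.0)    # assumed fixed offset from the neck
elbow    = (20.0, -35.0, 2.0)   # elbow sensor reading
wrist    = (22.0, -60.0, 10.0)  # wrist sensor reading

humerus = bone("humerus", shoulder, elbow)  # shoulder -> elbow
forearm = bone("forearm", elbow, wrist)     # elbow -> wrist

print(humerus["length"], forearm["length"])
```

The hierarchy part is just that each bone's start point is the previous bone's end point, so once the neck anchor is known, every segment down the chain is positioned from it.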
That is arm tracking technology.
Hand and finger tracking technology is the same idea: you need a static position to reference the dynamic positions against. The static positions are the still flashlights in a dark room; the dynamic positions are the moving flashlights in a dark room.
The moving flashlights are the sensors on the arm, the ones on the elbow and wrist; the still flashlight is the one on the neck.
So the neck gives the still flashlight, and on each hand there is:
1 sensor on the wrist = 1 sensor
1 sensor on each knuckle = 5 sensors
1 sensor on each bending joint of each finger that is not the knuckle = 9 sensors (2 per finger, 1 for the thumb)
= 15 sensors
These 15 sensors report to the receiver on the neck, and then the software sees the sensors on the wrist and the hand's fingers and decides where the bones are.
Come to think of it, the fingertips could use sensors too, couldn't they? Then the software could find the length of the bone that has the fingernail on it. That's 5 more sensors, one for each fingertip, which brings the total to 20 sensors per hand.
40 sensors for 2 hands, and 41 sensors total if you include the sensor on the neck.
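Just to double-check the arithmetic, a quick tally:

```python
# Sensor tally per hand and for the whole rig.
wrist = 1
knuckles = 5
mid_finger_joints = 9   # 2 per finger, 1 for the thumb
finger_tips = 5

per_hand = wrist + knuckles + mid_finger_joints + finger_tips
both_hands = 2 * per_hand
with_neck = both_hands + 1

print(per_hand, both_hands, with_neck)  # 20 40 41
```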
Then the receiver on the neck picks up these sensors, each of which has a unique ID the software can use to find the locations of the bones by seeing how far the sensors are from each other.
E.g. the wrist sensor is this far away from the knuckle sensors, the knuckle sensors are this far away from the mid-finger sensors, etc.
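Here's a rough sketch of that ID-plus-distance idea. The sensor IDs, the positions, and the bone pairings are all hypothetical, just to show how the software could walk the chain:

```python
# Sketch: the receiver sees tagged sensor readings; the software pairs
# them up along the hand's chain and measures each bone's length.
import math

# Hypothetical readings: sensor ID -> (x, y, z) relative to the wrist (cm).
readings = {
    "R_wrist":         (0.0, 0.0, 0.0),
    "R_index_knuckle": (2.0, 9.0, 0.5),
    "R_index_mid":     (2.5, 12.0, 0.5),
    "R_index_tip":     (2.8, 14.0, 0.5),
}

# The bone chain: which sensor ID connects to which.
bones = [
    ("R_wrist", "R_index_knuckle"),
    ("R_index_knuckle", "R_index_mid"),
    ("R_index_mid", "R_index_tip"),
]

for a, b in bones:
    d = math.dist(readings[a], readings[b])
    print(f"{a} -> {b}: {d:.2f} cm")
```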
Then, if you want to get fancy, there can be a haptic skin that gives feedback to the hand's sense of touch: in VR the hand touches something, the sensor readings go to the receiver, the software decides the hand has touched something, and the hand feels that thing.
Unless the glove has some kind of robotics to hold the hand in position when it touches a VR object, the hand will pass through that object like a ghost.
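A rough sketch of what that touch check could look like in the software. The sphere object, the fingertip positions, and the haptic_pulse call are made-up placeholders, not a real driver API:

```python
# Sketch of the haptic loop: if a fingertip sensor is inside a VR
# object, fire that fingertip's pad on the haptic skin.
import math

def inside_sphere(point, center, radius):
    """True when the point is inside (or on) the sphere."""
    return math.dist(point, center) <= radius

def haptic_pulse(sensor_id):
    # Placeholder for actually driving the haptic skin on the glove.
    print(f"buzz {sensor_id}")

# A hypothetical VR object: a sphere at (10, 0, 0) with radius 3.
vr_sphere = {"center": (10.0, 0.0, 0.0), "radius": 3.0}

# Hypothetical fingertip sensor readings.
fingertips = {
    "R_index_tip": (11.0, 1.0, 0.0),  # inside the sphere -> feels it
    "R_thumb_tip": (20.0, 5.0, 0.0),  # outside -> passes through
}

for sensor_id, pos in fingertips.items():
    if inside_sphere(pos, vr_sphere["center"], vr_sphere["radius"]):
        haptic_pulse(sensor_id)
```

Without the robotics the hand still passes through the object, of course; this only covers the "feel a buzz on contact" part.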
So that's 41 sensors to get the hands up and running in VR, and 43 sensors if you want the arms included, since you would add one sensor to each elbow. Now the entire hand and arm are articulated in VR, and with haptic skin on the glove you can feel the VR world too.