For those of you who haven’t seen head-tracking displays, let me direct you to this video which describes the basics of what’s going on.
Basically, what’s shown on the screen updates in relation to the angle at which you view the screen. The end result turns the screen of your TV or iPhone into a window into a virtual world. It’s a virtual-virtual-reality. Awesome!
I would love to be playing an FPS on my iPhone, tilting and turning the phone to shoot from behind objects or peek around corners. Racing games could be equally cool, letting you glance left or right into your rear-view mirrors as you race through LA.
Technology-wise, the YouTube video linked above requires IR sensors to track your head position in relation to the TV – but that’s because the TV is stationary and your head is moving. With the iPhone, the effect can be the same by rotating the phone in your hands and keeping your head stationary. The iPhone’s accelerometer might be able to give all the information necessary to determine the phone’s orientation during the game.
If anyone knows of any stub code for head-trackers written in Objective-C, please leave a note in the comments. My gut tells me that if virtual-virtual-reality is ever to find its way onto the iPhone, it’s going to have to be written largely from scratch. Please, someone prove me wrong!