==Regarding the MOVE wand physical characteristics (Part 1 of X)
*) The wand is very light but does not feel flimsy
*) Feels exactly like my Tivo remote control except smaller. I have a quite average male hand size, so as you can see the MOVE is not very large
*) The glowing ball is soft, hollow rubber; you can deform it by pushing on it and it pops back out
*) The subcontroller has the same remote control feeling but is even lighter.
*) The subcontroller is hard to grip when you try to put your fingers on L1/L2, because you have to have "Monster Hunter Claws" to hold it tight while pushing in L1/L2
==Regarding the MOVE wand 1:1 positional motion evaluation (Part 2 of X)
(Part 3 is rotation)
VERDICT: 1:1 positional tracking works as advertised in all directions including depth. It's not a joke or an exaggeration. It's real and it's here. The software teams on this project are obviously new to this stuff; all the software is in an alpha state and is less impressive than the hardware.
*) I first went to the debug data screens for motion input to evaluate raw data.
*) I also evaluated all the games and compared against the debug data screen
*) It's pretty evident from the debug screens that there isn't much lag in receiving data from the wands.
There WAS lag, however, from the augmented reality VIDEO showing me what I did a split second later. That type of lag is normal when a computer is digitizing your video and showing you what it sees a moment later; if you have ever played with a video digitizer's preview mode, you know what I mean.
TO BE CLEAR: Augmented reality has video delay lag but the tracking never fell behind my movements.
I swung my arms around and the image of the paddle & bat in my hand stayed in sync.
*) You can see the debug data screens showing X,Y,Z positional information with latency around 22 milliseconds
*) I was able to cover the glowing ball completely, hiding it from the camera. The system continued to track my actions fine, but error correction was no longer happening because of this. The wand's on-screen position slowly drifted out of sync with my hand. After uncovering the ball it snapped back into synchronization.
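To illustrate what I think is going on there (this is my own guess at the mechanism, not anything Sony showed): the inertial sensors keep dead-reckoning the position while the ball is hidden, a small error accumulates every frame, and the camera fix is what pulls the estimate back once the ball reappears. A minimal sketch:
[code]
// My guess at the mechanism, not Sony's code: inertial dead reckoning keeps
// tracking while the ball is hidden, a small bias accumulates each frame,
// and the camera fix pulls the estimate back once the ball is visible again.
#include <cstdio>

int main() {
    float truePos  = 0.0f;   // where the hand really is (1D for simplicity)
    float estimate = 0.0f;   // position integrated from the inertial sensors

    for (int frame = 0; frame < 300; ++frame) {
        truePos  += 0.010f;            // the hand keeps moving
        estimate += 0.010f + 0.001f;   // integration picks it up, plus drift

        bool ballVisible = (frame < 100 || frame > 200);  // ball covered 100-200
        if (ballVisible) {
            const float gain = 0.2f;   // made-up correction gain
            estimate += gain * (truePos - estimate);   // optical snap-back
        }
        if (frame % 50 == 0)
            std::printf("frame %3d  error %+.3f\n", frame, estimate - truePos);
    }
    return 0;
}
[/code]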
*) GAMING: I screwed around with the gladiator game.
Considering I had just done the DEBUG demo screens, it was shocking to see that the on-screen avatar was not moving 1:1 with me. We already know the positional data is fed to the system quickly, so apparently the software dev teams aren't sure yet how to make a fighting game out of 1:1 motion data input.
First off, this game is mostly gesture based for attacking. The blocking does allow some free movement to place your shield anywhere, but it is definitely not keeping up with my jittery fighting motion. Basically the game was clearly trying to animate the on-screen fighters and transition them into canned fighting stances.
This game, I felt, was a poor way to evaluate motion controls because it's the same thing as the old arcade game "Gladiator" but in 3D.
Is the input laggy? I believe you can't tell jack from this game, because the entire time the characters try to move into fighting stances I certainly never struck, and the animation is slow compared to my jerky movement. I think the software devs are still trying to figure out what to do here. If they really tracked motion 1:1 the on-screen avatar would look like a dork holding the shield too low all the time.
*) GAMING: Ping Pong FTW?
Ok this was a real test as far as I was concerned. For some reason 99.9% of the people only played that stupid gladiator game and NOT the ping pong. WTF? Ping pong is the best way to check the motion out.
Was it going to work? Short answer is yes (inputs basically worked) and no (software tuning needed badly).
The positional data was tracking my paddle decently enough, but the floating paddle in space without a real 3D ping pong table took me a while to get used to. Normally when you play ping pong you can see the table in front of you, and that helps A LOT with seeing the ball coming.
There was a very "distant" feeling because the table is so far away from you considering you stand like 8 feet away from the TV set and the table is a flat image on the screen.
After playing several rounds I got used to it and was able to play properly. At that point it became apparent the software was ALPHA state. Hitting the ball with the paddle felt like I was using a greasy block of ice and it wasn't really ping pong yet. (I was able to do forehand and backhand smashes though)
That being said it was an impressive ALPHA. I have high confidence someone is going to actually model ping pong physics and then get us a tuned game of this.
After viewing the 3D TV gaming demonstrations I COMPLETELY BELIEVE they will get us 3D ping pong; it would be madness not to make us 3D ping pong.
==Regarding the MOVE wand 1:1 rotational POINTING evaluation (Part 3 of X)
(Part 4 is 3DTV gaming)
VERDICT: Rotation is tracked accurately; I never noticed a hitch. Pointing, I believe, will be OK in the end, but the software on display was obviously untuned. Calibration to the TV and software finesse to smooth things out are most certainly needed. I am convinced it will work fine, and the testing I performed had the following results, which led to this conclusion.
*) The rotation never screwed up from what I could see. It was always spot on to what I was doing, in all cases.
*) I evaluated the pointing on only two things because SOCOM was not up for display today
*) The menu system for the gladiator/ping pong game had a pointer
*) The Debug Demo with all the on screen data had a pointer test calibration
The menu system for the gladiator game had a small green light cursor. It was not calibrated to the center of the TV and it basically felt slushy, like I was pointing underwater.
The reason it felt this way is that they have a crappy, non-optimized algorithm reading the rotational data. I can explain what was going on because the debug data screens showed the reality of the data input.
Unfortunately, I don't have a screen shot of the debug screen calibration from the demo but if you look at the test screen below and see the large letter X? That's the size of the calibration target you point at with the MOVE controller.
The data coming in is VERY TELLING. The debug screen received the raw pointer data and drew a tiny white cursor on screen, and it was jittery as all hell within a tight circle. It was bouncing around left, right, up, and down all the time, in an area about the size of my thumbnail on screen.
IMPORTANT HISTORICAL NOTE: REAL Light Gun games used to do this.
I programmed a light gun game for the 3DO many years ago with about this much accuracy, so I was not surprised.
How you REALLY made light guns work in the original light gun games was that you compensated. You take the incoming data and average a lot of samples into one center point. The basic idea is that 20 jittery points average out into 1 steady center point for the cursor.
HOWEVER! If you are lazy you will simply absorb a full half second of "scattershot" rotational data and put the cursor in the center of that averaged data. This would work, in that it would always end up pointing at the right position... but if you do it this way the cursor ends up laggy as hell and moves like it's on roller skates in a parking lot filled with slushy snow & ice.
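Roughly like this (my own illustration of the lazy approach, not anything from the demo): average every sample from the last half second, and the cursor always lands in the right place eventually but trails your hand badly.
[code]
#include <cstddef>
#include <deque>

struct Point { float x, y; };

// "Lazy" smoothing: average everything from roughly the last half second.
class LazySmoother {
    std::deque<Point> window;     // recent raw samples
    std::size_t maxSamples;       // ~half a second worth at your sample rate
public:
    explicit LazySmoother(std::size_t samplesPerHalfSecond)
        : maxSamples(samplesPerHalfSecond) {}

    Point update(Point raw) {
        window.push_back(raw);
        if (window.size() > maxSamples)
            window.pop_front();                    // drop the oldest sample

        Point avg{0.0f, 0.0f};
        for (const Point& p : window) { avg.x += p.x; avg.y += p.y; }
        avg.x /= window.size();
        avg.y /= window.size();
        return avg;      // steady and centered... but laggy, like skating on slush
    }
};
[/code]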
A better algorithm would be to make a temporal ring buffer which constantly adds one new data point and removes one old data point. You make the buffer bigger or smaller depending on how fast the pointer is moving, dynamically calibrating its size as needed.
That type of tuning can turn a JITTERY INPUT into a smooth pointing mechanism.
Think of this algorithm like MOUSE CURSOR acceleration. The faster you move the mouse, the more acceleration is applied. The slower you move the mouse, the less acceleration is applied.
This same algorithm is needed for MOVE rotational pointing. Maybe SOCOM has it?
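Something along these lines is what I have in mind (my own sketch of the idea, not anything I know to be in the Move SDK or SOCOM): keep a buffer of recent samples, shrink the averaging window when the pointer moves fast so it stays responsive, and grow it when the pointer is nearly still so the jitter averages away.
[code]
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <deque>

struct Point { float x, y; };

// Adaptive smoothing: the averaging window shrinks as pointer speed goes up.
class AdaptiveSmoother {
    std::deque<Point> buffer;          // ring of recent raw samples
    std::size_t minWindow, maxWindow;  // e.g. 2 samples (fast) to 20 (still)
    Point last{0.0f, 0.0f};
public:
    AdaptiveSmoother(std::size_t minW, std::size_t maxW)
        : minWindow(minW), maxWindow(maxW) {}

    Point update(Point raw) {
        // crude speed estimate: distance moved since the previous raw sample
        float speed = std::hypot(raw.x - last.x, raw.y - last.y);
        last = raw;

        // map speed onto a window size; 0.05f is a made-up tuning constant
        float range = static_cast<float>(maxWindow - minWindow);
        std::size_t shrink = static_cast<std::size_t>(std::min(speed / 0.05f, range));
        std::size_t window = std::max<std::size_t>(maxWindow - shrink, minWindow);

        buffer.push_back(raw);
        while (buffer.size() > window)
            buffer.pop_front();

        Point avg{0.0f, 0.0f};
        for (const Point& p : buffer) { avg.x += p.x; avg.y += p.y; }
        avg.x /= buffer.size();
        avg.y /= buffer.size();
        return avg;
    }
};
[/code]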
From what I can see, the raw data going into the demo is about as accurate as real light guns were, except it's doing it at HD 1080p resolution now instead of 320x240.
Once the devs start making TUNED pointer algorithms it should work FINE.
It absolutely WILL WORK FINE.
==Regarding 3DTV (Part 4 of 4)
VERDICT: Unless you are being a crabby stubborn whiner and hate technology, there is simply no way to NOT like this technology. I played baseball in 3D; that was OK, since it made being a batter feel much more fun. BUT... Super Stardust HD in 3D was AMAZING. Just AMAZING.
The way it worked was so realistic depth wise that I am sold. I want one as soon as possible.
Maybe I will have to work a second job to afford a 3DTV (Thanks Kutaragi).
The effect was amazing in Super Stardust because the world is ROUND. It's a sphere going into the screen, not a 2D pop-up book. It really was fully and truly 3D.
It blew me away; it was better than IMAX 3D by light years.
After seeing this it immediately struck me that 3D ping pong will be done by someone, somewhere, without a doubt. Ideally with face tracking and popping the table OUT of the screen rather than into the screen, to make holographic 3D ping pong.
The future is here my friends. What I saw today with Move and 3DTV when combined brings us one step closer to Star Trek Holodecks.
I don't know if MOVE will end up a niche product, but when they combine 3DTV and MOVE it's going to blow most people away with how cool this stuff is.