.. that’s what my wife keeps calling me for obsessing over the Wiimote. One of the big things I wanted to do with the wiimote is to estimate its position and rotation when it can’t see the IR LEDs. It isn’t possible to do this very accurately, but I was able to put together a program that does it. You can even use it with the nunchuk. It can’t figure out yaw at all, since gravity doesn’t act along that axis and so never shows up on the accelerometers.

The most important part is calibrating the sensors. I made a six-direction calibration routine, since I’ve found that not only do I have to scale the measurements, but gravity also reads oddly differently depending on whether the wiimote is facing up or down. So, you point the remote in six different directions and it calculates the calibration. It is still a bit wiggly, and the position estimate deteriorates fairly quickly, but I feel like it’s enough to get an idea. With the nunchuk, I’ll have to make it gravitate towards the center or something. There are also some more things I can do to make it more stable, such as determining and then integrating the angular velocity.

The fundamental problem is a chicken-and-egg one: in order to know the linear acceleration, you need to know the orientation of the controller so you can remove the force of gravity from the readings. But in order to know the orientation, you need to know the linear acceleration so you can remove it before calculating the direction of gravity. You can’t know one without knowing the other.
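To give an idea of what a six-direction calibration buys you: pointing each axis straight up and then straight down gives two readings per axis, from which you can recover both the zero-g bias and the scale. This is just a sketch of the idea, not my actual code, and the function names are made up:

```python
# Hypothetical sketch of per-axis accelerometer calibration from a
# six-direction routine. For each axis, take one reading with the axis
# pointing straight up and one pointing straight down. The average of
# the two is the zero-g bias; half the difference is the scale
# (raw counts per 1 g). Measuring up and down separately is what lets
# you handle gravity reading differently in the two directions.

def calibrate_axis(reading_up, reading_down):
    """Return (bias, scale) for one accelerometer axis."""
    bias = (reading_up + reading_down) / 2.0
    scale = (reading_up - reading_down) / 2.0  # counts per 1 g
    return bias, scale

def to_g(raw, bias, scale):
    """Convert a raw sensor count to units of g."""
    return (raw - bias) / scale
```

So if an axis reads 610 pointing up and 410 pointing down, the bias is 510 counts and the scale is 100 counts per g, and a raw reading of 560 comes out as 0.5g.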
If you make some assumptions, though, you can get semi-accurate data, and that’s all I’m currently doing. I assume that when the magnitude of the accelerometer reading is close to 1g, the controller isn’t accelerating. The software then takes that reading as the orientation of the wiimote and uses it to interpret future linear acceleration readings. I suspect I can squeeze out some more accuracy with other tricks along those lines, such as using the jerk (the rate of change of acceleration) to predict the next linear acceleration and assuming that major deviations from the prediction are orientation changes.
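The near-1g heuristic boils down to a few lines. Here’s a rough sketch of the idea (not my actual code; the tolerance value is an arbitrary assumption):

```python
import math

G_TOLERANCE = 0.05  # how close |a| must be to 1 g to count as "at rest" (assumed)

def update(accel, gravity_estimate):
    """accel: calibrated (x, y, z) reading in units of g.
    If the magnitude is close to 1 g, assume the controller isn't
    accelerating and take the reading as the new gravity direction
    (i.e. the orientation reference). Otherwise, subtract the last
    known gravity vector to recover the linear acceleration."""
    mag = math.sqrt(sum(a * a for a in accel))
    if abs(mag - 1.0) < G_TOLERANCE:
        gravity_estimate = accel           # update orientation reference
        linear = (0.0, 0.0, 0.0)
    else:
        linear = tuple(a - g for a, g in zip(accel, gravity_estimate))
    return linear, gravity_estimate
```

The obvious failure mode is constant acceleration that happens to sum to 1g, which the heuristic will mistake for a new orientation, hence the wiggle.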
Anyway, my test program uses pygame, which I’d never really used before. I started using it for the webcam IR tracking since the original version used it. It seems pretty useful for visualizing 2d things, but for 3d, it was Python-Ogre time.
I started a test app for actually using the wiimote in a 3d environment. Basically, there are a bunch of blocks lying around, and you have one that you control with the wiimote. You can use it to push the other blocks around or even pick them up and throw them. However, I find stacking them to be a fun challenge. There is force feedback involved, so when your virtual hand touches something, the wiimote vibrates. All in all, it’s a pretty good interface. I may want to add some more smoothing to the wiimote position, because it can be a little tricky to hold it still. By the way, the camera in the wiimote seems to be 1024×768, and it tracks the dots at about 30fps, I think. That makes things very responsive. I do wish they had put a wide-angle lens on it, though. As it is, the FOV is fairly narrow, which lessens the coolness of the good resolution.
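For the smoothing, the simplest thing that could work is an exponential low-pass filter on the tracked position. A sketch (the class and parameter are hypothetical, not from my app):

```python
class Smoother:
    """Exponential low-pass filter for a tracked (x, y, z) position.
    alpha near 1 tracks quickly but passes jitter through; alpha near 0
    is very smooth but adds lag."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.value = None

    def update(self, pos):
        if self.value is None:
            self.value = pos  # first sample: nothing to blend with yet
        else:
            self.value = tuple(self.alpha * p + (1 - self.alpha) * v
                               for p, v in zip(pos, self.value))
        return self.value
```

The trade-off is the usual one: enough smoothing to hide hand tremor makes the virtual hand feel slightly laggy, so alpha would need tuning by feel.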
The next test was to add head tracking into the mix (previously, the camera was stationary). I decided to use FreeTrack for that, since there is an odd bug in the software I wrote that causes roll to be calculated incorrectly. FreeTrack pretends to be TrackIR, which GlovePIE captures and then sends via OSC to my test app.
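On the receiving end, an OSC message is simple enough to parse by hand: a null-padded address string, a null-padded type-tag string, then big-endian arguments. A minimal sketch of the idea (assuming float-only messages, no bundles; the address here is made up, not what GlovePIE actually sends):

```python
import struct

def parse_osc_message(data):
    """Parse a single OSC message containing only float args.
    A sketch: no bundles, no type tags other than 'f'."""

    def read_string(buf, offset):
        # OSC strings are NUL-terminated and padded to a multiple of 4 bytes.
        end = buf.index(b'\x00', offset)
        s = buf[offset:end].decode('ascii')
        return s, (end + 4) & ~3

    address, offset = read_string(data, 0)
    tags, offset = read_string(data, offset)  # e.g. ",fff"
    args = []
    for tag in tags[1:]:                      # skip the leading ','
        if tag == 'f':
            (value,) = struct.unpack_from('>f', data, offset)
            args.append(value)
            offset += 4
    return address, args
```

In practice you’d just read UDP datagrams off a socket and feed each one through this.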
My previous head tracking app assumed you had VR goggles. I don’t actually have those, so it’s fairly useless for it to work that way. One cool thing about Johnny Lee’s implementation is that it makes it look like your monitor is a window. The Ogre guys have been trying to figure out exactly how to do this, and I tried their code in my old head tracking app. It didn’t work very well, but by messing with it a bit, I was able to get it working fairly well for me. It’s still not quite as convincing as Johnny Lee’s version, but pretty close.
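The core of the window trick is an off-axis (asymmetric) projection frustum: the screen rectangle stays fixed in space while the head moves, so the frustum edges have to be recomputed from the head position each frame. A sketch of that computation, with the screen centered at the origin and everything in the same units (hypothetical function, not the Ogre guys’ code):

```python
def window_frustum(head, screen_w, screen_h, near):
    """Compute asymmetric frustum extents for the monitor-as-window effect.

    head: (x, y, z) head position in screen-centered coordinates, with
    z > 0 the distance from the monitor plane. Returns (left, right,
    bottom, top) at the near plane, glFrustum-style."""
    hx, hy, hz = head
    scale = near / hz  # project the screen edges onto the near plane
    left   = (-screen_w / 2.0 - hx) * scale
    right  = ( screen_w / 2.0 - hx) * scale
    bottom = (-screen_h / 2.0 - hy) * scale
    top    = ( screen_h / 2.0 - hy) * scale
    return left, right, bottom, top
```

If I remember right, Ogre’s camera can take extents like these directly via something like setFrustumExtents, which is presumably what the Ogre guys’ code was doing under the hood.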
So now I have a little app where you can use the wiimote like a virtual hand, move around via the joystick on the nunchuk, and look around via head tracking on a stationary monitor. Pretty cool.
My main reason for doing all this is to come up with a good interface for MV3D’s in-game editor that uses these techniques. Granted, there will have to be a fallback for pretty much everyone else, since they won’t have a similar setup. I’m still trying to figure out what I can use the movement/orientation of the nunchuk for.
I’m vaguely thinking that two more wiimotes would be good: one to take the place of the IR camera, since the wiimote’s IR camera is so much better, and the other for a second hand. The only problem is that it’s nice to have the analog joystick on the nunchuk.