Augmented Reality (AR) applications have been making headway into our lives. Probably the most recognized use of AR is the yellow first-down marker overlaid on TV broadcasts of football games. AR apps have also been creeping into our phones; my current favorites are SkyMap and Wikitude. Being an engineer and scientist, I've always been interested in how things work, and the inner workings of AR could not escape my interest, especially with access to such tools as our new smartphones. What makes AR possible in today's new phones are the MEMS and GPS sensors: the MEMS sensors sense the attitude (orientation) of the phone, and the GPS senses its location. Using just the MEMS sensors you can build some cool applications that use the phone's attitude to keep objects on the screen level, such as the "floating compass" look and feel of our Magnetometer application. Add in your GPS location and you can adjust the size and position of objects on the screen, giving the appearance of anchoring those objects in the real world.
To demonstrate these principles, I've modified the example from my previous post, Loading Textured SketchUp Models Into AR Windows Phone Applications, which showed how to put a fully textured earth and moon from SketchUp into an XNA Windows Phone 7 application. Now we're going to anchor the earth and the moon to real-world locations, scaling the size of the earth, the moon, and the distance between them. One of the reasons I chose this example is to demonstrate something I've always been fascinated with: the true scale of emptiness in our universe. When I was young, I was the kid who would calculate the scale of the sun, planets, and moons and place a baseball, basketball, or whatever else would work at distances that showed the scale of the sizes and distances of those solar system objects. Now that I'm all grown up, I can do this with software and my phone!
In this example, our earth will be located at a nearby park and will be just under 20 ft (7918 miles actual) in diameter. That will make our moon just over 5 ft (2159 miles actual). So how far do we need to put them away from each other? Yep, about 590 ft (238,855 miles actual). It's one thing to look up at the moon and think that it sure is a long way away; it's another thing to actually lay out a scale model of the earth-moon system and visualize just how small and far away they are from each other. Visualizing the system like that as a kid really made me realize what Armstrong, Aldrin, and Collins did in a tiny spaceship back in 1969. I'm sad that we lost Armstrong. We need more heroes like these guys who can look at a mission like they did and go for it.
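If you want to double-check those numbers, the scaling is simple ratio arithmetic. Here's a quick sketch in plain C# (the variable names are mine); running it suggests the "about 590 ft" figure is the gap between the two surfaces, with the center-to-center distance coming out closer to 600 ft:

```csharp
using System;

// Shrink the real earth-moon system so the model earth is 20 ft across.
// The "actual" figures are the same ones quoted above, in miles.
const double earthDiameterMi = 7918.0;
const double moonDiameterMi = 2159.0;
const double earthMoonDistanceMi = 238855.0; // mean center-to-center distance

const double modelEarthFt = 20.0;            // our park-sized earth
double scale = modelEarthFt / earthDiameterMi;

double modelMoonFt = moonDiameterMi * scale;           // about 5.45 ft
double centerToCenterFt = earthMoonDistanceMi * scale; // about 603 ft
// Subtract both scaled radii to get the gap between the surfaces:
double surfaceToSurfaceFt = centerToCenterFt - (modelEarthFt + modelMoonFt) / 2.0; // about 590 ft

Console.WriteLine($"moon {modelMoonFt:F2} ft, surface gap {surfaceToSurfaceFt:F0} ft");
```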
There are two parts to anchoring your AR objects in the real world. The first part is to change the orientation of the objects based upon the orientation of your phone. This gives the illusion that your AR objects maintain their orientation no matter how you orient your phone. I've already covered this with just a few lines of code in my post Silverlight/XNA and Correcting WP7 Attitude Alignment. In this example, we are dealing with a TextureMeshObject (from a previous post) that has View and World properties that get applied to the ModelEffectCollection of the internal Model. To apply the orientation correction in this example, we set the resulting matrix to the View property of the TextureMeshObject.
// pitch, roll, and yaw are the phone's attitude angles in radians
// (from the motion/attitude sensors). Applying the negated rotations
// keeps the scene's orientation fixed while the phone moves.
Matrix matrix = Matrix.CreateRotationX(-pitch);
matrix = Matrix.CreateRotationY(-roll) * matrix;
matrix = Matrix.CreateRotationZ(-yaw) * matrix;
earth.View = matrix;
The next part is to position your objects in the space around your phone. The easiest way I found for this is to make your phone the center of your little universe. So if your GPS location is (35.127055,-101.913976) and your scaled 20 ft diameter AR earth is anchored to the GPS location (35.127383,-101.914133), how do you calculate the matrix for the earth.World property? It ends up being just a few lines of code.
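Before building that matrix, it helps to know how far apart those two coordinates actually are on the ground. A back-of-the-envelope check in plain C# (no XNA needed; a small-angle approximation using the example coordinates above):

```csharp
using System;

// Ground offset from the phone (35.127055, -101.913976) to the
// anchored earth (35.127383, -101.914133), small-angle approximation.
const double EarthRadiusM = 6371000.0; // meters

double latDiffDeg = 35.127383 - 35.127055;       // positive: anchor is to the north
double lonDiffDeg = -101.914133 - (-101.913976); // negative: anchor is to the west

double northM = EarthRadiusM * latDiffDeg * Math.PI / 180.0;
// East-west ground distance shrinks with latitude, hence the cosine:
double westM = -EarthRadiusM * (lonDiffDeg * Math.PI / 180.0) * Math.Cos(35.127 * Math.PI / 180.0);
double groundM = Math.Sqrt(northM * northM + westM * westM);
// Roughly 36.5 m north and 14.3 m west: the anchor sits about 39 m
// (~130 ft) away, several model-earth diameters from the phone.
```

Since the model earth is 20 ft (about 6 m) across, that 39 m offset keeps it comfortably clear of typical GPS jitter.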
First we set a constant for the real earth's radius, and we start off with an identity matrix. In our matrix, the M43 component represents the altitude of our position. In order to get the correct rotation, we subtract our current altitude plus the real earth's radius. This puts the center of rotation at the center of the real earth.
const float EARTHRADIUS = 6371000.0f; //We're using meters
Matrix mworld = Matrix.Identity;
mworld.M43 -= (float)CurrentLocation.Altitude + EARTHRADIUS;
We then calculate the difference in latitude between our position and the AR earth position. This difference tells us how many degrees we need to rotate about the X axis.
// EarthLocation (a name assumed here) is the GeoCoordinate the AR earth is anchored to
mworld = mworld * Matrix.CreateRotationX(MathHelper.ToRadians(
    (float)(EarthLocation.Latitude - CurrentLocation.Latitude)));
Now we figure out the difference in longitude. This tells us how many degrees we need to rotate about the polar axis, which in our case is the Y axis.
// again using the assumed EarthLocation anchor coordinate
mworld = mworld * Matrix.CreateRotationY(-MathHelper.ToRadians(
    (float)(EarthLocation.Longitude - CurrentLocation.Longitude)));
Now we need to add back in our altitude and real earth radius to the M43 component and apply the finished matrix to the earth.World property.
mworld.M43 += (float)CurrentLocation.Altitude + EARTHRADIUS;
earth.World = mworld;
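If you want to convince yourself that this translate-rotate-translate pipeline really lands the earth at the anchor, you can push the model's origin (which starts at the phone) through the same steps with plain doubles. This is only a sanity-check sketch: the formulas below mirror what Matrix.CreateRotationX/CreateRotationY do under XNA's row-vector convention, with altitude taken as zero and the example coordinates from earlier.

```csharp
using System;

const double R = 6371000.0; // EARTHRADIUS in meters, altitude taken as 0
double angX = (35.127383 - 35.127055) * Math.PI / 180.0;        // latitude difference
double angY = -(-101.914133 - (-101.913976)) * Math.PI / 180.0; // note the minus sign, as in the code above

// Step 1: M43 -= R, pushing the model origin down to the earth's center.
double x = 0.0, y = 0.0, z = -R;

// Step 2: rotate about X by the latitude difference (CreateRotationX).
double y1 = y * Math.Cos(angX) - z * Math.Sin(angX);
double z1 = y * Math.Sin(angX) + z * Math.Cos(angX);

// Step 3: rotate about Y by the negated longitude difference (CreateRotationY).
double x2 = x * Math.Cos(angY) + z1 * Math.Sin(angY);
double z2 = -x * Math.Sin(angY) + z1 * Math.Cos(angY);

// Step 4: M43 += R, bringing the frame back up to the phone.
double z3 = z2 + R;
// Result: roughly 36.5 m along +Y and 17.5 m along -X, with essentially
// zero vertical offset - north and west of the phone if +X is east,
// which is where the anchor coordinate sits.
```

One hedged observation: the east-west displacement comes out as R times the longitude difference (about 17.5 m) rather than the true ground distance, which includes a cos(latitude) factor (about 14.3 m here). At separations this small the difference is well inside typical GPS error, so the shortcut costs you nothing in practice.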
That's it! You now have complete control over anchoring any object in your AR world. The biggest limitation is the accuracy of the GPS in your phone: if the size of your anchored object is near or below the GPS accuracy, the object will appear to hop around as you approach it. It's best to keep your anchored objects larger than your GPS accuracy.
You can see in this video that we can walk up to and around the earth (20 ft diameter). It hops around a bit, but not too badly. However, when we hike over to the moon (5 ft diameter), you can see it hopping around quite a bit more. The bigger your AR objects and the more accurate your GPS, the better the result.