Welcome to WiiR3D's final blog post. It's week 12 at UOW, which means that our work is due for submission and our product is ready for the world. The trade show is on Thursday 23 October at the University of Wollongong Unihall from 1pm to 4pm. During the trade show we'll be displaying our product along with the other project teams from CSCI321 this year. Come along and you will get the privilege of meeting the team, and we'll even give you our business card, which niftily includes three infrared LEDs for you to begin making your own three-point head tracking headset.
We'll be showing the WiiR3D 20 Questions, Room with a View, Roaming Ralph, and WiiR3D VRDesktop demonstrative applications at the show, which you'll also be able to try out wearing one of our stylish construction hat headsets, which Adam detailed in his Headgear Construction blog entry.
Thank you to everyone who has kept an eye on the WiiR3D project during its development. Head to our downloads page to get hold of the WiiR3D product, our demonstrative applications, and the user and technical manuals.
The WiiR3D business card
From the WiiR3D team
At times it felt like it would be never-ending, but here we are. Of late I've busied myself with designing our poster for the trade show (click on the image to the right to see a larger version), sticking LEDs to business cards (above), finalising our documentation (you can download the user and technical manuals from our downloads page), updating this website, and taking care of various administrative loose ends.
Public speaking has never been my strong suit, so I'm looking forward to the trade show with both trepidation and excitement. Mostly, I am just relieved that we've reached the end without too many hiccups along the way. I hope you will find the WiiR3D product useful and I look forward to seeing any applications you develop using it. Make sure to email us a link to your work!
I’ve had a great time working with this team to produce the WiiR3D head tracking server. It really has been a fantastic learning experience. I believe that we have delivered a solid product, and one that can be further developed and extended in the future. Some things I would like to see extended are:
Integrating the IPointModel as a plugin via the GUI
Developing gesture recognition and smoothing modules into plug-in components
Developing a web camera plugin
Developing other calculators
Developing a four-point model and a calculator that supports it
I would really like to thank all of those people who have helped and supported us throughout this project, and all those who have been watching the project develop and grow.
Well, the project is almost finished, with just the assessment and trade show remaining. We’ve settled on five demonstrations for the trade show: Looking-and-Gesturing, demonstrating gesture recognition and pose calculation; 20 Questions, a “real-world” application of gesture recognition as an interaction tool; Room-with-a-View, demonstrating the use of pose and a second input device for first-person gaming; VRDesktop, Johnny Lee’s head tracking ‘re-envisaged’ using our own translation data; and Roaming-Ralph, an application of head tracking for avatar control through a 3D environment.
Though we probably won’t be able to maintain WiiR3D, the project is available under the GPL at http://code.google.com/p/wiir3d/, so feel free to download, modify and distribute the project as you wish. Some possible additions might include:
More complex gesture recognition
Selection of TCP/UDP/Shared Memory for the transport mechanism
Wider variety of plugins to support web cams and other input devices
Serialization of plugins to save plugin state (currently only saves plugin paths)
Head tracking is an amazing new method of user interaction. We hope that the system we’ve provided will help others use this technology as it continues to evolve. Thanks for choosing WiiR3D, the superior head tracking solution.
See you later,
Finally, we've come to the end of the project. The team has been great, and I have to say "well done" to the guys: Jess, Adam and James. Now we head into the presentation this Thursday and the trade show on 23 October. Thank you to everyone who has paid attention to our project throughout the year, and thank you to all of you who have helped us out. Special thanks to Jess, Adam and James; they are awesome teammates.
My name is Adam Parkes and I'm a member of the WiiR3D team. I've been working on many tasks for the project such as coding and designing, but today I have a new task. I have to develop and build a headgear prototype.
Now, there are a few requirements you need to be aware of before I start talking about the design of the headgear:
1. The headgear will be used by a large number of people on the presentation day.
2. The headgear must not change the size and dimensions of the triangle formed by the IR LEDs from person to person.
3. The headgear must be robust, i.e. not easily broken.
4. The headgear must be able to be repaired quickly if broken.
5. The headgear must be able to fit a vast number of people (not all at once).
6. Changing batteries should be easy.
With that all said the design was rather simple. Sturdy, unbreakable and resizable... only one option... a construction hat! Luckily enough I know someone that can get their hands on them very easily (construction hats can only be used for about 3 years, then they must be disposed of as they are deemed unsafe; at least that seems to be the case in NSW Australia).
So I got a construction hat. We can now tick off points 2, 3 and 5. Since the LEDs will be attached to the construction hat, the triangle formed by the IR LEDs will be rigid and will not change from person to person.
You might not believe that the headgear will not break. “Sure, the helmet won’t smash or crack,” you say, “but what about the LEDs or the circuit or the battery?” This is a very good point and one that I took seriously.
I first thought about the LEDs. We needed a way to change them without a soldering iron, so I developed a plug-in arrangement: very simple and very easy to fix if an LED breaks. However, the plug-in design created a new issue: an LED inserted the wrong way around would blow. To address this, the circuit was extended to ensure that if the LEDs were plugged in backwards they would not break, but simply not work.
Construction hat with holes drilled for the IR LEDs
Second, the circuits and the battery. They can both be strapped, taped or Velcroed to the inside of the construction hat. They sit above the head netting, out of the way of the user. The circuits have been constructed on a small piece of Veroboard, which is wrapped in tape and stuck in place, and the battery has been Velcroed to the roof of the construction hat as well. To resolve point 6, ease of battery change, simply remove the old battery and Velcro the new one in its place; a small amount of Velcro will need to be added to the new battery.
The only thing that could break on the construction hat is the putty we used to cement the LEDs into position. I don't want to jinx myself by saying this... but it appears to be pretty strong. I haven't done drop tests or anything like that. I would expect that we will have super glue, epoxy and even Blu Tack on the day to try to mitigate this risk.
Anyway, here is what the finished product looks like:
The WiiR3D team is well into its construction phase, with a few tech demos up and running. They include: a basic gesture recognition demo which displays ‘yes’ or ‘no’ when it recognizes a ‘nod’ or ‘shake’ respectively; a mouse mover which allows the user to direct the mouse cursor with the position of their head; a demo which allows the user to explore a 3D room by moving their head; and a demo where a character mirrors the user’s head pose – you can take a look at this final demo in the video below.
In other news, as you have already noticed, WiiR3D is now situated in a shiny new home. Our vision has been through a metamorphosis as we've worked and homed in on the true purpose of our project: our mission is to produce a component which will assist open source developers to utilise head tracking technology. In addition, we will develop a variety of technical demonstrations to exemplify some of the possible uses of the WiiR3D product.
The server has been converted to a plugin architecture, which allows users to write and plug in their own input device drivers and alternative head tracking calculators; we’ve got our own calculator based on Alter’s equations up and running; and the GUIs for the server have also been improved. All round, a lot of healthy work going on for the WiiR3D construction stage.
From the WiiR3D team
A long time between updates, I know, but I think it's all worth the wait, don't you? Our first tech demo is going, as you will have seen from the video above, and I personally am very impressed with the way it's come out. The project is really taking shape.
Recently I have been exercising my graphic design abilities, revamping the look of our documentation, logo, and website. I wanted to add some colour to the project and give us a real corporate identity while still maintaining the clean and simple character that we originally imagined. I hope you approve. For anyone who might be interested, I found these tutorials particularly helpful in learning to use Adobe Dreamweaver.
Since our last post, along with my general administrative and project managerial duties, I worked with 3DS some more to develop objects that can be used in our tech demos. It was difficult with no real training, but the objects are workable enough, and after my experience using it for the project I am even considering taking a course in 3D modelling in the future.
In the near future I am looking forward to seeing the development of other tech demos using the WiiR3D product. I will be spending a lot of time working on our final documentation, a big job that will require lots of cooperation from the group, but which I am sure will result in a helpful resource for those who utilise WiiR3D.
To round up, I’m feeling good about our progress thus far. We’ve made it over the midyear hump and we’re still learning a lot, but with the end in sight and the trade show on its way, we’re on the up.
Over the last couple of months I’ve been working on the server. My main task was to change the server to a plugin architecture. We decided that the Driver, Calculator and PointModel should be pluggable. The PointModel interface describes the configuration of the headgear; this currently includes relative distances and coordinates. We chose a plugin architecture because we didn’t want to restrict the project to the Wiimote and Alter’s calculator as the only input device and interpolator components. This means that you can now develop your own calculators, input drivers and PointModels in any .NET language.
I’ve also been working on the GUIs, trying to make them usable and concise. The picture below is the Server GUI, one of the many GUIs I have created to provide a usable interface to the server.
I’ve also developed the pitch and yaw demo. This demo is based on Panda3D’s Looking and Gripping demo. It takes the pitch and yaw data from the server and feeds it into the head positioning of the girl in the demo.
The second demo I created was the Mouse Mover demo. It’s a simple application that allows the user to move the mouse with the translation and rotation of their head. Again, this is a simple demo, but one that demonstrates the power of head tracking.
I did a lot of testing over the last few months and I managed to get the server to a stable state. Using the Alter calculator and the WiiMoteDriver it can consistently send valid approximations of translation and rotation. As you can imagine this was a long process, and one that illuminated many small problems with the implementation. One of the problems I found was that the Wiimote itself manages the numbering of the IR signals. It reads the IR signals left to right, numbers them based on the first one it saw, then remembers them. If two or more IR signals go out of range or are covered up, then depending on the order they are reintroduced, the Wiimote will re-label them according to which one it saw first, second, third and so on. The server couldn’t produce consistent data because the driver couldn’t produce consistent points. This problem has now been solved by sorting the nodes. Sorting makes the server more consistent, but also reduces the usable Z rotation of the headgear. In the future I would like to address this properly, but at the moment it hasn’t been flagged as a high concern.
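The sorting fix can be sketched in a few lines of Python (the tuple representation here is illustrative; the real driver works with the WiimoteLib point structures in C#):

```python
def label_points(points):
    """Assign stable indices to IR points by sorting them left-to-right.

    `points` is a list of (x, y) camera coordinates. Sorting by x (with
    y as a tie-break) means the same physical LED keeps the same index
    no matter what order the Wiimote rediscovered the points in after
    an occlusion.
    """
    return sorted(points, key=lambda p: (p[0], p[1]))
```

The trade-off with Z rotation follows directly from this: once the head rolls far enough that two LEDs swap horizontal order, their labels swap with them.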
[Progress report: 17/5 – 17/8]
We’ve been a bit quiet over the past few months as we’re nearing the end of the ‘construction’ phase of the project. Since last time, many changes were made to the server. We’ve now adopted a plugin architecture for the server, which will allow developers to extend the application to use different types of input devices and head tracking algorithms. True to the project name, we provide plugins for the Wii remote for input and Alter’s equations for 3-point head tracking. Extension of the application is as simple as writing an assembly which extends a .NET interface provided with the application.
Additionally, I’ve managed to get Alter’s equations sending valid translation and rotation data over UDP, which is quite exciting. All of the equations for the 3D model points and rigid transformation are described in Alter’s paper (see Appendix A for the rigid transformation). Below is a prototype constructed as a proof-of-concept demonstrating Alter’s equations ‘in action’ on a dummy image plane.
One of the constraints inherent in Alter’s equations is that the 3-points on the model need to be positioned such that the triangle formed by the points is well distinguished away from a line. Tests showed that a Z-depth of greater than 6cm is enough to achieve stable results.
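That constraint is easy to check numerically. The sketch below (the point names and units are my own, not part of the product) computes the perpendicular distance of the apex LED from the baseline joining the two outer LEDs, using 2D coordinates in the plane the three points share:

```python
import math

def apex_depth(p_left, p_apex, p_right):
    """Perpendicular distance of the apex LED from the line through the
    two outer LEDs. A value comfortably above ~6 cm keeps the triangle
    well away from degeneracy; near zero means the points are close to
    collinear and the pose calculation becomes ill-conditioned.
    """
    (x1, y1), (x2, y2), (x3, y3) = p_left, p_apex, p_right
    # Twice the triangle's area (cross product magnitude) divided by
    # the baseline length gives the height of the apex.
    area2 = abs((x3 - x1) * (y2 - y1) - (y3 - y1) * (x2 - x1))
    return area2 / math.hypot(x3 - x1, y3 - y1)
```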
Additionally, I’ve built simple gesture recognition into the server, which is currently capable of recognizing gestures composed of ordered sequences of Up/Down/Left/Right movements. One of the hurdles with gesture recognition is that it relies heavily upon the accuracy of the data provided by the calculator, which itself is subject to error. Furthermore, head nuances such as minute shakes and shivers need to be properly handled. The solution settled upon is to ignore movements which are only “slightly” different from the last, and to process the most dominant direction and magnitude. A set of sample log files and the prototype script for demonstrating gesture recognition are available here. The ‘Looking-and-Gesturing’ Panda3D app, which demonstrates gesture recognition for nods and shakes, is available here.
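The “dominant direction” filtering can be sketched like so (the threshold value and function name are illustrative, not the server’s actual API):

```python
def dominant_move(dx, dy, threshold=2.0):
    """Collapse a raw head displacement into 'Up'/'Down'/'Left'/'Right',
    or None for jitter.

    Displacements smaller than `threshold` (an assumed tuning constant)
    are treated as the minute shakes and shivers mentioned above and
    ignored; otherwise only the dominant axis survives, so a movement
    that is mostly upward with a little sideways drift still reads as
    'Up'.
    """
    if max(abs(dx), abs(dy)) < threshold:
        return None
    if abs(dx) >= abs(dy):
        return 'Right' if dx > 0 else 'Left'
    return 'Up' if dy > 0 else 'Down'
```

A nod then becomes a recognisable ordered sequence such as ['Down', 'Up', 'Down'], matched against the stored gesture definitions.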
At the moment, I’m working on a python app which demonstrates gesture recognition in the popular 20 questions game. I’ll be using the mechanize and HTMLParser python modules to scrape the questions and Yes/No links from the HTML and into our application. The user will be able to nod or shake their response to the current question. At the moment, I’ve got it successfully pulling out the response links and question, which is a promising start indeed:
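The link extraction can be sketched with the standard-library parser alone (shown here with Python 3's html.parser; the tag layout below is hypothetical, not the real site's markup):

```python
from html.parser import HTMLParser

class AnswerLinkParser(HTMLParser):
    """Collect the hrefs of anchors whose text is exactly 'Yes' or 'No'."""

    def __init__(self):
        super().__init__()
        self.links = {}       # answer text -> href
        self._href = None     # href of the anchor we are inside, if any
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            self._href = dict(attrs).get('href')
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == 'a' and self._href is not None:
            label = ''.join(self._text).strip()
            if label in ('Yes', 'No'):
                self.links[label] = self._href
            self._href = None
```

Feeding the page source through `feed()` leaves the Yes/No targets in `links`, ready to be followed when the user nods or shakes.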
More gesture recognition testing, tweaking and fine-tuning
README / Help file describing server operation
XML representation of simple gestures, for client config
Filling in the blanks in the tech manual (protocol description, Alter and gesture recognition).
Recently, I was working on the calculation to get head tracking points; however, it totally failed. Thanks to James, who got the Alter equations working. Now we can actually get going on the head tracking work. The plugin developed from Alter's equations works pretty well, as you can see from the clip. Ah, we also changed our goal for the project. Before, we had decided to build four games using head tracking technology, but now we will build a number of mini applications that demonstrate the technology. Each application will demonstrate an aspect of the head tracking technology that we have developed.
Next, we will build a lot of demos. I hope they will interest you all.
It's all action in WiiR3D's elaboration phase. We have been working towards creating an architectural prototype and extending Johnny Lee's target room to build a prototype of our Menu room.
As you can see, we've had a go at building some basic 3D models and the perspective control still needs some work, but we're definitely making progress.
We have also received, from the very kind people at OpenKMQ, a new headset and Wiimote holder. It is very lightweight (PET plastic), designed with head tracking applications in mind, and definitely an excellent option for anyone building their own IR LED headset. We are particularly fond of their Wiimote holder, which mounts easily on a flatscreen monitor. This position for the Wiimote is ideal for when users are playing a game while sitting at the computer.
We are still looking into options for mounting the Wiimote when the user is playing in a standing position - a kind of retort stand has been suggested, as well as something based on a camera tripod.
In other news, our official website is now under construction, as you may have noticed. You can visit via the navigation bar at the top of our blog, or access it directly. We will be working to have the site fully functional within the next few weeks.
From the WiiR3D team
Hello all! My work lately has been quite engaging - I've enjoyed trying my hand at 3D modelling as well as webpage design.
No-one in the team has worked in 3D modelling before, so giving that a try was quite a learning experience. I'm still in my early stages, but I did build the objects that you saw in the Menu room, and have been experimenting with some more organic-looking objects.
It's also been my pleasure to put together our official website. Google's been such a good host for our project thus far, it seemed natural to utilise GooglePages. I haven't designed a web page before, but Google's simple WYSIWYG editor helped me build (what I'd like to believe is) a professional site to promote the WiiR3D product.
Things that I’ve been doing over the last month include:
3D model development
To build the initial 3D models I used SketchUp. SketchUp is a program that makes developing 3D models extremely easy. There are some problems with SketchUp: it's buggy, and it can’t directly save to the FBX format, which is what we required. There was a simple way around this, which involved exporting the model to another format, renaming the file as a zip, and running a converter package over a particular file in the zip. Aside from the above annoyances, I really liked SketchUp and I would recommend it to anyone that knows nothing about modelling.
Working with XNA to develop a menu room
My goal was to place the models I created in the environment; this involved learning about the 3D pipeline. I followed tutorials such as http://msdn.microsoft.com/en-us/library/bb197293.aspx. The process was not too tricky, although it was time-consuming. I made a framework for the models, called an entity, that reduces most of the repetitive actions that have to be performed on models to get them to display.
Developing the structure of the server
I helped develop the initial structure of the server. This involved deciding on communication protocols, system structure and the division of tasks between the server’s components. I was also involved in the initial testing of the server, generating data flow and displaying pseudo points.
Developing the IHeadTrackingDriver interface
This interface was one that I developed and will continue to expand. This interface will assist any developer that wishes to use head tracking hardware other than the Wiimote. To use other hardware developers will have to implement this interface on their driver class. The server has a constructor that takes an instance of a class that implements the IHeadTrackingDriver.
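Conceptually, the contract looks something like the Python analogue below (the real IHeadTrackingDriver is a .NET interface implemented in C#, and the member names here are illustrative only):

```python
from abc import ABC, abstractmethod

class HeadTrackingDriver(ABC):
    """Python sketch of the IHeadTrackingDriver contract. A driver's
    only job is to hand the server raw IR point coordinates from
    whatever hardware it wraps.
    """

    @abstractmethod
    def connect(self):
        """Establish a connection to the underlying device."""

    @abstractmethod
    def read_points(self):
        """Return the current list of (x, y) IR points."""

class SimulatorDriver(HeadTrackingDriver):
    """Analogue of the HeadTrackingSimulatorDriver described below:
    emits fixed pseudo points so the server/client link can be tested
    without hardware.
    """

    def connect(self):
        return True

    def read_points(self):
        return [(100, 200), (300, 50), (512, 300)]
```

Because the server only ever sees the interface, swapping the Wiimote for a simulator (or, later, a web camera) is a constructor argument rather than a code change.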
Developing the WiiHeadTrackingDriver class
I developed the WiiHeadTrackingDriver. This class is responsible for connecting to the Wiimote and receiving data for the IR coordinates.
Developed a HeadTrackingSimulatorDriver
I developed this class as a method to test the connection of the server and client. I wanted to be able to test if we could pass coordinates through to the client and at what speeds. This class will be used over and over when testing the server. In fact I will extend this class in the future to create an ErroringHeadTrackingSimulatorDriver. The Erroring simulator will be very helpful when testing the robustness of the server.
Wiimote as a gaming input device.
Currently I’m working on integrating WiimoteLib into our program so that we can use the Wiimote as a secondary input device for the games. The biggest problem that I’m currently having is that the nunchuk will not connect to the Wiimote automatically.
In the future I will:
Continue working with the Wiimote to get it to act as an input device.
Continue developing the IHeadTrackingDriver interface.
Develop the ErroringHeadTrackingSimulatorDriver.
Start working on Panda3D.
Integrate the Client into the demonstration room.
Investigate Wiimote libraries for Python.
Progress report [5/4 – 17/5]:
Tasks I was responsible for over the past month-and-a-half include:
The goal for this task was to play around with C# and the XNA framework, in order to determine its feasibility for the games we’ll be building, as well as to establish a baseline architecture if it did prove feasible. A basic "cube grid room" was constructed, which utilizes the same perspective projection manipulation performed in Johnny Lee’s DesktopVR application. Issues regarding the correctness of the perspective projection arose during development (see above). Additionally, it was decided that whilst XNA does provide a suitable abstraction over the graphics device and 3D rendering, it may be more appropriate to use an engine which would allow us to quickly develop, test and modify some sort of 3D environment. To this end, it was decided that the Panda3D game engine should be used. Panda3D is written in C++, though it can be fully utilized from within the Python scripting language.
Head-tracking gear construction
Following the advice of Christian Muise, the 9V batteries from the prototype head-gear were replaced with 3V (2025) watch batteries, in order to achieve a smaller form factor. Currently, the head-tracking gear consists of three small infrared LED circuits attached to a pair of safety glasses. Additionally, a shipment of TSAL 6200, 6400 and 7400 infrared LEDs has just arrived (thanks Jess!), which I intend to experiment with. The TSAL line of infrared LEDs is recommended by Johnny Lee, and so it would be interesting to see what differences (viewing angle and range) they make when compared with the stock Dick Smith Electronics infrared LEDs.
Also worth mentioning is the equipment received from Pixel Partner, in particular, the reflector-based glasses and the Wiimote monitor stand. We found that the reflector-based glasses were too flimsy to wear comfortably, due to the type of plastic (PET-G) used in their construction. However, we found that the Wiimote monitor stand proved to be an ideal solution for the placement of the Wiimote.
Head-pose tracking algorithms
Potential head-pose tracking algorithms aimed for three-point tracking were investigated. Two worth mentioning are “Simple, Robust and Accurate Head-Pose Tracking Using a Single Camera” and POSIT. The former paper describes a method of pose-tracking via root approximation of parametric equations representing the points in 3D space. The latter is used in the FreeTrack project. Fellow WiiR3D member Buu is working on the head-tracking computation, see below for more information.
The WiiR3D system comprises a server-client architecture. The server, written in C#, does the majority of the grunt work, and is responsible for performing the gesture recognition, head-pose and distance calculations and broadcasting the results over a datagram (UDP) socket. This allows developers to easily utilise our information from any language (compiled or otherwise), as long as it has support for sockets. Currently we have a functioning iterative server written in C# which is able to communicate with a listening client over UDP. For simplicity, a fixed-length protocol has been designed for all communications between server and client.
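A fixed-length protocol keeps clients trivial: every datagram is the same size, so one struct definition is enough to decode it. The field layout below is only illustrative (six little-endian 32-bit floats for translation and rotation); the actual WiiR3D packet format is the one described in our documentation.

```python
import struct

# Hypothetical fixed-length pose packet: translation (x, y, z) followed
# by rotation (pitch, yaw, roll), as little-endian 32-bit floats.
POSE_PACKET = struct.Struct('<6f')

def pack_pose(tx, ty, tz, pitch, yaw, roll):
    """Serialize one pose update into a fixed-size datagram body."""
    return POSE_PACKET.pack(tx, ty, tz, pitch, yaw, roll)

def unpack_pose(datagram):
    """Decode a datagram body back into (translation, rotation) tuples."""
    tx, ty, tz, pitch, yaw, roll = POSE_PACKET.unpack(datagram)
    return (tx, ty, tz), (pitch, yaw, roll)
```

A client in any socket-capable language simply receives POSE_PACKET.size bytes per datagram and unpacks the same layout.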
Standard documentation (protocol description and basic operation for server, construction of the head-gear, head-tracking computation, configuration and usage)
Tasks to do in the immediate future:
Spike the Panda3D engine.
Investigate issues with perspective manipulation.
Look into head gesture recognition
Implement link-status in the server (rudimentary syn/ack over UDP)
Improve head-tracking gear (parallel circuit, use TSAL LEDs)
Yeah, I don't have much to talk about since the other guys have covered all the stuff we've done. During the elaboration phase I was working on the mathematics for the head tracking calculation, and am still working on that. I have also enjoyed modelling for the project.
My goal in the immediate future is to finish all the maths work and translate it into code so that we can use it for a real head tracking application.
We are a third year project team from the University of Wollongong, Australia. Throughout 2008 we will be developing WiiR3D, a twofold product dealing with the implications of head tracking on the 3D gaming industry. We will be building on the research of Johnny Lee from Carnegie Mellon University in order to develop an API for developers to incorporate head tracking as a new input for computer games.
What is 'head tracking'?
Head tracking is the process whereby a pair of IR sources, typically mounted on the head or in a similar position, are accurately located by an IR camera, such as the Nintendo Wiimote, so as to provide view-dependent images on screen. This means that the computer can know where the user is situated relative to the computer, and adjust its display accordingly.
What does head tracking imply to 3D gaming?
That is the question that WiiR3D aims to explore. As you can see in the video, head tracking allows the computer to generate a much more authentic three-dimensional environment than is currently standard. This step closer to 'virtual reality' for the common gamer is a significant one, and even on its own, this new technology could revolutionize the 3D gaming industry.
However, enhanced realism in the gaming environment is not the only way in which games developers can utilise head tracking technology. Head tracking can also be used as an active control, and this is another function of head tracking that WiiR3D intends to explore. To do this, WiiR3D plans to develop a set of three minigames in which a variety of possible uses for head tracking will be demonstrated.
From the WiiR3D team
Jessica Lloyd- Project leader, resident scheduling phreak and (technical) writer
G'day, I'm Jess and I'm still suffering slightly from shock that I have finally made it to this stage at UOW - mixing computer science and creative writing degrees is rare, and it feels like I took an extra long time to get here. Fortunately, I am lucky enough to be in what I believe is the best team I could be in, working on the most exciting project in which I've had the opportunity to be involved. I'm confident that WiiR3D is going to be a great success, and a fantastic learning experience for all of us - and I think that we'll be developing a product on which other developers will be able to build in the future.
I hope that you will find this blog informative and interesting. :)
Adam Parkes- Testing manager and coding enthusiast
My name is Adam Parkes; I'm a third year computer science student. I love coding and gaming. I want this project to be successful and be something that I look back on in fondness. I also hope that in some small way we will help change and influence those of you who produce games and are reading this. I also hope that you follow us through this journey and learn as much as we learn as we try to develop an API for head tracking and a series of three demonstrative games.
James Leskovar- Development manager and programmer
Hi there. I'm James Leskovar, and am a member of the WiiR3D team as part of the third year project at the University of Wollongong. I'm primarily interested in programming, AI, and computer architecture, so the project is definitely an exciting challenge for me. We hope the WiiR3D project will give hobbyist and professional developers alike an opportunity to enjoy the exciting new technology of head tracking.
VinhBuu To- Database guru and modeller extraordinaire
My name is VinhBuu To. I am taking part in the WiiR3D project as part of my final year at the University of Wollongong. Personally, I am interested in programming and software modelling as well as databases. WiiR3D is really a big challenge for me, but I think we have the best team here. So, I hope what we develop in WiiR3D will be helpful and will interest people.
During Inception the WiiR3D team worked on understanding the scope of the project. James looked into the technologies that we will be using throughout the rest of the year, Adam got the requirements down-pat, Buu modelled plans for the overall application architecture and use cases for the game component, and Jess dealt with administrivia and meetings. James was the first to get Johnny Lee's sample head tracking application up and going, and built the following headset prototype:
As you can see, while the headset does what we need it to, it is slightly lacking in the area of style and class. Fortunately, having a stylish headset is not the most important part of WiiR3D, it's what we allow the user to do while wearing it.
WiiR3D will be a software system which explores and demonstrates the implications of head tracking to the 3D gaming industry. Aimed at 3D game developers, our three rudimentary minigames will each demonstrate a different way in which head tracking can enhance the 3D gaming experience.
WiiR3D will use a headset mounted with at least two clusters of infrared LEDs to indicate the user’s head position, and utilise a stationary Wiimote to read the infrared information and transmit it via Bluetooth to the computer system. WiiR3D will then interpret this information in order to adjust the display on screen relative to the user. The system will also be able to deal with a secondary input device, such as a keyboard or a second Wiimote wielded by the user, in addition to head tracking.
Game component features
Main menu
The WiiR3D menu will demonstrate how the user can make simple commands using head tracking. The user will be able to employ familiar movements to command the application. These movements are nodding, shaking the head and nudging left and right, which roughly conform to the functions of the enter key, escape key, and left and right arrow keys respectively. The system will identify nodding, shaking and nudging movements using the infrared headset and the Wiimote’s infrared camera, and control the view and actions within the menu’s 3D environment according to the user’s movements.
We also have a quick mockup of what we hope the main menu will resemble:
Hide and seek
Hide and seek will demonstrate how head tracking can be used to allow a user to explore a three-dimensional environment. The environment that the user explores will have objects obscuring the ‘treasure’. Once the user has found the treasure, the game will give some trivial feedback about the time it took to find it. Head tracking will allow the user to view the environment in a manner that feels natural: the user will be able to move around in front of their monitor and see different perspectives of the virtual environment.
Pong
Pong is a traditional video game based on table tennis. The game environment is a 3D room in which the ball bounces off the walls, and the user has to catch the ball with their ‘paddle’ by moving their head. In this game, the user moves their head to move the ‘near plane’ in order to hit the ball. When the user comes nearer to the Wiimote, he or she gets a bigger ‘viewing frustum’ of the 3D room, and vice versa.
Space Invaders
WiiR3D Space Invaders is yet another reincarnation of the traditional game. It aims to retain the retro feel of the original whilst adding many new features to enhance the game. One such enhancement is the use of full 3D graphics for game play. Additionally, head tracking will be used as a form of input, enabling the user to rotate their aiming direction.
Game play follows the original arcade version of Space Invaders. The user is presented with a first person view of the game field. Enemy critters surround the user from in front and above, and gravitate toward the ground at an increasing rate. The user’s goal is to prevent any enemy critters from touching the ground. The player is armed with but a single-shot laser cannon, and has a number of lives they’re able to lose to the critters before the game is over.
The user is able to rotate their view with 3 degrees of freedom using head tracking. This enables the user to look around their environment and precisely aim at individual critters, with the laser blast following directly down the user’s gaze. As with the other games in the WiiR3D package, head tracking will also be used as a form of menu navigation.
Lifecycle Objective Milestone Review complete - Welcome to Elaboration!
During this phase of project development we will be building an architectural prototype. Elaboration will stretch from now until 21 May.