Los Alamos Mid School (74)/Interim Report


Dangerous scenarios that have to be dealt with are exactly that: dangerous. When it would be too risky to send a human to assess or manage a situation, robots are called to action. These robots need to be controlled remotely to keep people away from the danger, which immediately raises the question of how to accurately translate the data recorded by the robot into a form humans can interpret. One new technology that may become very useful in this field is haptic feedback, or feedback that stimulates the sense of touch. Our supercomputing challenge project will attempt to address this very dilemma by creating a model of a dangerous situation and a controller. We plan to make the controller as precise and intuitive as possible by using haptics technology, that is, the use of the sense of touch. This could greatly improve current controls by making them easier to use and more accurate. We intend to create a haptic environment that simulates a human interacting with a haptics-based controller. By doing this we hope to have made a better way to control robots. This virtual simulation will be written in the Python language.

So far we have created an environment in which we will test the robot with the use of haptic technology. The environment is a 100 by 100 (10,000) tile grid stored as an array. At the moment each tile has five characteristics: x coordinate, y coordinate, z coordinate, temperature, and texture. We have written algorithms that create natural-looking terrain; however, we have not yet finished configuring the temperature of each tile. The algorithm that configures the texture of each tile works by first creating "seed" tiles every four spaces. This spacing allows for some variability between texture types in our tests. These "seed" tiles then branch out to create small "biome"-like areas with similar characteristics. The "biome" areas then expand more and more until they cover the entire grid. The "biomes" are generated in a seed-based pseudo-random fashion: a pseudo-random number generator is initialized from a seed integer, so each seed produces a distinct environment that is always reproduced exactly for that seed. This lets us run tests on different types of environments and create new environments as we please.

The next step is to work on the robot and controller side of our project. We hope to accurately simulate a haptic controller and a robot model within our environment. We expect to find that haptic technology makes it easier to use robots in dangerous situations.
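The seed-tile-and-biome approach described above can be sketched roughly as follows. This is only an illustration of the technique, not the team's actual code: the grid size is shrunk, and the texture names and function names are hypothetical.

```python
import random

SIZE = 12          # small grid for illustration; the report's grid is 100 x 100
SPACING = 4        # "seed" tiles are planted every four spaces
TEXTURES = ["rock", "sand", "grass"]   # hypothetical texture types

def generate_textures(seed):
    """Flood-fill textures outward from evenly spaced seed tiles."""
    rng = random.Random(seed)          # same seed integer -> same environment
    grid = [[None] * SIZE for _ in range(SIZE)]
    frontier = []
    # plant "seed" tiles every SPACING tiles, each with a random texture
    for y in range(0, SIZE, SPACING):
        for x in range(0, SIZE, SPACING):
            grid[y][x] = rng.choice(TEXTURES)
            frontier.append((x, y))
    # each "biome" expands one neighbor at a time until the grid is full
    while frontier:
        x, y = frontier.pop(0)
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < SIZE and 0 <= ny < SIZE and grid[ny][nx] is None:
                grid[ny][nx] = grid[y][x]   # inherit the seed tile's texture
                frontier.append((nx, ny))
    return grid

grid = generate_textures(seed=42)
```

Because the only source of randomness is `random.Random(seed)`, calling the function twice with the same seed yields an identical environment, which is what makes the tests repeatable.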

Introduction
Hi,

My name is Jon Brown, and the challenge has asked me to look over your interim report. I am a grad student at New Mexico Tech; if you like, there is more information about my background and work on my User Page.

Progress
It looks like you have made good progress on the terrain-generation part of your project. Your focus now, as I think you recognized, should be on getting the simulation of the robot moving around and the force-feedback system working. Ultimately, the natural conclusion of your project would be to get your force-feedback system running on an actual robot, but that is something that could be done in year two of this project if you want to keep it going next year.

Mentors
If you do not yet have a mentor, please contact the consultants or see here for help in finding a mentor for this ambitious project.

Have you been in contact with Bandit Gangwere? He has a lot of experience in this area, and sounded interested in mentoring you in an email thread in response to your proposal.--Drew 14:43, 7 February 2012 (PST)

Model
As I've said above, I think it's a good thing that you're moving on to getting your robot simulation going. Making an agent that moves around on your terrain in real time based on user (keyboard) input should be an achievable goal. Since your main goal is dealing with haptic control systems, implementing that in your simulation should be your next major focus after that. There are plenty of video game controllers/joysticks with that capability; to interface them with your program you might look into GlovePIE (it appears to map joystick functions to key presses and can deal with vibration/force-feedback, but there is no guarantee it will do exactly what you need — it is just a place to start looking).
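To make that concrete, here is a minimal sketch of what keyboard-driven movement on your tile grid might look like. Everything here is my own assumption (the WASD key mapping, the `Agent` class, the grid size), not a prescription — a real version would read key events from whatever input library you end up using.

```python
GRID_SIZE = 100    # matches the 100 x 100 grid from your report

# hypothetical WASD mapping; y grows "south" as in screen coordinates
MOVES = {"w": (0, -1), "s": (0, 1), "a": (-1, 0), "d": (1, 0)}

class Agent:
    """A robot stand-in that moves one tile per key press."""

    def __init__(self, x=0, y=0):
        self.x, self.y = x, y

    def step(self, key):
        """Move in response to a key press, clamped to the grid edges."""
        dx, dy = MOVES.get(key, (0, 0))   # ignore unmapped keys
        self.x = min(max(self.x + dx, 0), GRID_SIZE - 1)
        self.y = min(max(self.y + dy, 0), GRID_SIZE - 1)

robot = Agent(50, 50)
robot.step("d")    # one tile east
```

Once this loop works, swapping the keyboard for joystick input (e.g. via GlovePIE's key mapping) only changes where `step` gets its keys from.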

In terms of the terrain generator itself, the only other thing I would implement (as I've said, I think you should start to move past this phase of the project) would be obstacles/walls, so that when the robot hits one it stops moving and the haptic system can signal that the robot has hit something.
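That obstacle check could look something like the sketch below. The `on_collision` callback is a hypothetical stand-in for whatever actually triggers the controller vibration (GlovePIE or otherwise); the wall coordinates are arbitrary examples.

```python
WALLS = {(3, 2), (3, 3)}   # example obstacle tiles on the grid

def try_move(pos, delta, on_collision):
    """Return the new position, or stay put and fire the haptic signal."""
    new = (pos[0] + delta[0], pos[1] + delta[1])
    if new in WALLS:
        on_collision()     # e.g. trigger controller vibration/rumble
        return pos         # the robot stops at the wall
    return new

events = []
pos = try_move((2, 2), (1, 0), on_collision=lambda: events.append("rumble"))
```

Keeping the haptic response behind a callback like this means you can test the collision logic now and plug in the real force-feedback call later.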

Face to Face Evaluation
Your next milestone is a face to face evaluation in February.

Rubrics
The judges will use these rubrics to evaluate your projects. Use them as checklists for what you need to communicate to the judges.


 * Expo Judges Rubric
 * Finalist Judges Rubric