2009-09-06

New release of leJOS firmware

I've now updated the robot and the computers to leJOS 0.8.5.
According to http://lejos.sourceforge.net/, the changes are:
  • better support and documentation for Mac OS X, including the Fantom USB driver
  • a Netbeans plugin
  • improved JVM speed and many more amazing improvements by Andy
  • support for the new LEGO color sensor in the NXT 2.0 set
  • now supports the instanceof keyword
  • detection of rechargeable batteries and improved battery indicator
  • nanosecond timers and improved timer support with the Delay class.
  • % operation on floats and doubles
  • Class, including the isAssignableFrom(Class cls) method
  • display of LCD screen in ConsoleViewer
  • major speed and accuracy improvements to the Math class from Sven
  • platform independent lejos.robotics packages
  • new navigation proposal (work in progress) that is platform independent, supports more vehicles, has better localization support, and new concepts of pose controllers and path finders
  • preliminary support for probabilistic robotics, including general purpose KalmanFilter class using matrix algebra
  • reworking of the Monte Carlo Localization classes
  • limited java.awt and java.awt.geom classes
Sounds good :)

/Peter F

2009-09-05

Old memories

Before NXT there was RIS with RCX and I have two of those.
A couple of years ago a friend and I built two robots that were able to find each other, dock and then exchange a ball.

One of them has a lamp and the other one has a light sensor. The "child" scans 360° to find the brightest point and then heads in that direction. On its way it makes small adjustments to keep heading for the brightest point, in other words the "mother".

Here is a video showing the docking process:


It's a little bit sad that I've lost the latest version of the program, the one that contains the "ball exchanging" part (and I haven't had the time to rewrite it), so the video doesn't show that.

This isn't actually part of PenemuNXT, just fun to show the world :)

/Peter Forss

AI and communication

We still don't have anything new to show you right now, but that doesn't mean we're slacking. We've started working on two different things that together should enable us to create a working prototype for our mapping robot.

I (Josef) am working on an AI for the robot so it can navigate through the room while scanning. I tried to create my own navigation class that would calculate the position of the robot based only on the rotation of the wheels, but for some reason it's not working. The algorithm that calculates the angle the robot is facing seems to work, but only for certain values. It looks like a rounding error somewhere, but I don't know where.


public double getRobotangle() {
    // Difference between the previous heading and the current compass heading (radians).
    robotnewangle = robotangle - (Math.toRadians(CS.getDegreesCartesian()));
    // Store the current compass heading for the next call.
    robotangle = (Math.toRadians(CS.getDegreesCartesian()));

    // Total wheel rotation (tacho count in degrees, converted to radians) minus the stored old value.
    leftwheelangle = ((Math.toRadians(Motor.A.getTachoCount())) - leftwheeloldangle);
    rightwheelangle = ((Math.toRadians(Motor.B.getTachoCount())) - rightwheeloldangle);

    // Distance travelled by each wheel: circumference times the fraction of a full turn.
    leftdist = ((wheeldiameter * Math.PI) * leftwheelangle / (2 * Math.PI));
    rightdist = ((wheeldiameter * Math.PI) * rightwheelangle / (2 * Math.PI));

    return robotangle;
}
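
Since calculating the heading from the wheels alone is what I really want to get working, here is a minimal sketch of the textbook differential-drive heading update, just for reference. This is not our actual code; the field names and values are placeholders.

import lejos.nxt.Motor;

// Sketch of a wheel-odometry heading update for a differential drive.
// wheelDiameter and trackWidth (distance between the wheels) are placeholder values.
public class HeadingSketch {
    private final double wheelDiameter = 5.6; // cm, example value
    private final double trackWidth = 11.2;   // cm, example value

    private double heading = 0;               // radians
    private int lastLeftTacho = 0;
    private int lastRightTacho = 0;

    public double updateHeading() {
        // Tacho counts are in degrees of wheel rotation.
        int leftTacho = Motor.A.getTachoCount();
        int rightTacho = Motor.B.getTachoCount();

        // Distance each wheel has travelled since the last call.
        double leftDist = Math.toRadians(leftTacho - lastLeftTacho) * (wheelDiameter / 2);
        double rightDist = Math.toRadians(rightTacho - lastRightTacho) * (wheelDiameter / 2);

        // Remember the counts so the next call only sees the new movement.
        lastLeftTacho = leftTacho;
        lastRightTacho = rightTacho;

        // The difference between the wheel distances, divided by the track width,
        // is the change in heading in radians.
        heading += (rightDist - leftDist) / trackWidth;
        return heading;
    }
}

Keeping everything in radians as doubles and only converting to degrees for display should also help against the rounding problems I suspect.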


The algorithm that calculates the position doesn't seem to work at all, even if I use values from the compass sensor. For some reason it never returns any data.


public Point getRobotpos() {
    float x, y, hypotenuse;

    // Chord of the driven arc (law of cosines), with
    // getRobotAverageDist()/robotnewangle as the turning radius.
    hypotenuse = (float) (Math.sqrt((2 * Math.pow((getRobotAverageDist() / robotnewangle), 2))
            - ((2 * Math.pow((getRobotAverageDist() / robotnewangle), 2)) * Math.cos(robotnewangle))));

    // New position: old position plus cos/sin of (average heading * chord length).
    x = (float) (robotpos.x + (Math.cos(getRobotAverageangle() * hypotenuse)));
    y = (float) (robotpos.y + (Math.sin(getRobotAverageangle() * hypotenuse)));

    robotpos.x = x;
    robotpos.y = y;

    return robotpos;
}
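
For comparison, the position update I keep seeing in odometry examples avoids the chord calculation entirely: step forward by the average wheel distance along the heading half-way through the turn. Again just a sketch with placeholder names, not our actual code:

// Sketch of a straight-line-step position update for a differential drive.
// trackWidth is a placeholder value; distances are in the same unit as x and y.
public class PoseSketch {
    private final double trackWidth = 11.2; // cm, example value

    private double x = 0;
    private double y = 0;
    private double heading = 0; // radians

    // leftDist/rightDist: distance each wheel has travelled since the last call.
    public void updatePose(double leftDist, double rightDist) {
        double distance = (leftDist + rightDist) / 2;                // centre of the robot
        double headingChange = (rightDist - leftDist) / trackWidth;  // radians

        // Step along the heading half-way through the turn. Note that the
        // distance multiplies cos/sin outside of the call.
        x += distance * Math.cos(heading + headingChange / 2);
        y += distance * Math.sin(heading + headingChange / 2);
        heading += headingChange;
    }
}

Written this way there is no division by the turn angle, so driving straight (angle change zero) can't blow up the calculation.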


Feel free to check them both out on our Google Code site and come with tips if you have any. Right now the code is a bit unstructured, however. The algorithm that calculates the angle based on the rotation of the wheels is commented out at the moment in favor of a similar method that uses the compass sensor instead.

When my own class didn't work I was forced (for the time being, anyway) to use the navigation class in leJOS, which unfortunately limits me to the methods that class provides for navigating the robot. I'd rather be able to manipulate the motors at will, for example to easily follow a wall.

I'm using behavior-based programming for my work, a really smart way of creating AIs, since it makes it so easy to implement, edit or remove different parts of the behavior. I recommend reading the leJOS tutorial about it if you're interested.
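
To give an idea of what it looks like, here is a minimal sketch of a single behavior. I'm assuming the Behavior and Arbitrator classes from the lejos.subsumption package described in the tutorial; check the package name in your leJOS version.

import lejos.nxt.Motor;
import lejos.subsumption.Arbitrator;
import lejos.subsumption.Behavior;

// Sketch of one behavior: drive forward until something suppresses it.
public class DriveForward implements Behavior {
    private boolean suppressed = false;

    public boolean takeControl() {
        return true; // lowest-priority behavior: always willing to run
    }

    public void action() {
        suppressed = false;
        Motor.A.forward();
        Motor.B.forward();
        while (!suppressed) {
            Thread.yield(); // give higher-priority behaviors a chance to take over
        }
        Motor.A.stop();
        Motor.B.stop();
    }

    public void suppress() {
        suppressed = true; // makes action() return as soon as it notices
    }

    public static void main(String[] args) {
        // The arbitrator runs the highest-priority behavior that wants control;
        // behaviors later in the array have higher priority.
        Behavior[] behaviors = { new DriveForward() /* , new FollowWall(), ... */ };
        new Arbitrator(behaviors).start();
    }
}

Adding or removing a part of the AI is then just a matter of adding or removing an entry in that array, which is why I like this way of working.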

Peter is working on improving our communication, which, as you could see in our movie, already works but could be made a lot smoother and more structured. The idea of the new communication class is that you add the data you want to send to a queue. The class then processes one item at a time and sends it over either USB or Bluetooth (depending on what you want to use). The receiver adds each received item to a list, and you can then process the items whenever you want. There is one client (NXT) part and one server (PC) part.
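
Roughly, the sender side could look something like the sketch below. This is just to illustrate the idea, not Peter's actual class: the DataItem type and all names are made up, and the USB/Bluetooth connection setup is left out. On the PC (server) side the full Java library is available; on the NXT the same pattern applies with the stream and collection classes leJOS provides.

import java.io.DataOutputStream;
import java.io.IOException;
import java.util.LinkedList;
import java.util.Queue;

// Sketch of the queue idea for the sender: callers add items, a worker thread
// writes them one at a time to whatever stream the connection provides.
public class DataSenderSketch implements Runnable {
    // Made-up item type: an id telling the receiver how to interpret the value.
    public static class DataItem {
        final int id;
        final float value;
        public DataItem(int id, float value) { this.id = id; this.value = value; }
    }

    private final Queue<DataItem> queue = new LinkedList<DataItem>();
    private final DataOutputStream out; // stream from the USB or Bluetooth connection

    public DataSenderSketch(DataOutputStream out) {
        this.out = out;
    }

    // Called from anywhere in the program; returns immediately.
    public synchronized void send(DataItem item) {
        queue.add(item);
        notifyAll();
    }

    // Blocks until there is something to send.
    private synchronized DataItem takeNext() throws InterruptedException {
        while (queue.isEmpty()) {
            wait();
        }
        return queue.poll();
    }

    // Worker thread: process one item at a time, in order.
    public void run() {
        try {
            while (true) {
                DataItem item = takeNext();
                out.writeInt(item.id);
                out.writeFloat(item.value);
                out.flush();
            }
        } catch (IOException e) {
            // connection lost; a real implementation would report this
        } catch (InterruptedException e) {
            // stop the worker
        }
    }
}

The receiver does the mirror image: it reads one item at a time from the input stream and appends it to a list that the rest of the program can go through whenever it wants.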

Hopefully, when we're done with these things, or at least have something that works, we can combine it with the scanning algorithm we already have and, with small tweaks to the graphical interface, get a crude prototype that can move around, scan a room, and paint it up on a computer screen in real time.

/Josef

2009-09-01

Description document

Today we published two PDF documents describing our project PenemuNXT in more detail: one in Swedish and one in English.

You can download them from here:

PenemuNXT - Swedish
PenemuNXT - English