@Quarg: Seems likely we'll need some machine work sooner or later.

Definitely your experience building crap already will be valuable, even if all you do is kibitz.
@Jonathan: I wrote that long post at the same time as you, so it's not a direct response to yours, obviously. Here's a direct response:
The prize requires the craft to be privately funded. Where we sit in there, I don't know. We could sell the rover to a private group and use public monies, I think. But there's grey area, and I'm still waiting for a response from google on another question of mine. They're still working on the rules, so hopefully they'll be quicker to respond after they've finalized them. In the meantime, I think we should restrict any funding we receive to private monies until we can get clarification. That probably only impacts students who might otherwise get some money from their schools. To make it work, I suspect we'll need a donation system (we'll want to solicit donations anyway), and we can each just use that to put our own money in.

I'd like to lay out ownership on the basis of contributions to the project, but the capitalist system measures contributions solely on the basis of money laid out. **** capitalism. I say anybody who codes, designs, and/or builds and tests gets an equal share, and so does anybody else who participates.

We can worry about this some more, but let's make sure it stays in the discussion so we all know where we stand. I'm not in a hurry to make any profit off this thing, I just want to put my name on the moon somehow.
So, to the rest of the rules. The XPrize Foundation has a page about it:
http://www.googlelunarxprize.org/lunar/ ... guidelines
That page causes a recursive loop. It says to see google's xprize homepage for details about the rules, but google links back to that same site as the homepage for the prize. I guess I'll put that on the list of questions for google, too.
Here's a summary of the rover's requirements:
* Self-portrait
* 2pi panorama
* Travel 500m
* Broadcast high definition video of its travels
* Transmit a package that will be put into it before launch, i.e. first email from the moon.
There are some bonuses we should consider as well.
* Long distance roving ( > 5km)
* Imaging man-made artifacts
* Discovering water
* Surviving lunar night
So we need stuff that accomplishes all of that.
The camera is obvious. We either need a still photo shooter, or we need some code to extract the still photos from a video stream. The latter option has the advantage of reducing hardware needed on the rover, but probably introduces more code complexity. It's still too early to tell what's best, and what we go with will probably not be the first thing we try.
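To give a feel for the frame-extraction option, here's a minimal sketch (my own illustration, nothing decided): given a decoded frame feed at some source framerate, pick evenly spaced frames to serve as stills at 2-3 fps. The function name and framerates are placeholders.

```python
def select_stills(frames, src_fps=30.0, still_fps=2.0):
    """Pick evenly spaced frames from a higher-rate feed to use as stills.

    frames can be any sequence of decoded frames; here we only care about
    which indices to keep, not how the frames were decoded.
    """
    step = max(1, round(src_fps / still_fps))
    return [f for i, f in enumerate(frames) if i % step == 0]
```

For example, a 30 fps feed downsampled to 2 fps keeps every 15th frame. The real version would run against whatever decoder we end up with, on the rover or on the ground.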

So we obviously need a camera, and to get the self-portrait we need to be able to extend and rotate it. On the list of questions for google: can the carrier rocket take the photo of the rover, or does the self-portrait have to be taken by a camera on the rover itself? Whatever camera does the self-portrait should be the same camera we take the panorama with. Either way, still shooter or video camera, it needs to be able to extend and rotate.
By specifying near real-time videos of its travels, google is basically requiring a video camera. I guess I should ask them if there's a certain minimum framerate associated with this, because if we use a still shooter I'd like to be able to just use that, but its framerate will be pretty low, maybe 2-3 fps or less.
Imaging objects on the moon will be a very useful feature to have. I'd like something where we can tell the robot to image an object it has seen, and it literally images the entire object, driving around it as necessary and if possible assembling a 3d model of the object that we can load into a good cad program or blender or something. With this function, the robot should have no problem giving a good image of any man-made artifacts, we'll just have to command it to do so.
Moving is probably going to be the biggest portion of the code we have to write. The rover needs some way to sense the environment around it. Anybody know what radar options are available for this? There are projects that do it with regular video feeds, usually from two cameras; is there code we can just plug two cameras into to get this information? We can do dead reckoning based on the size of the wheels/treads; that requires a little rotation sensor that's already available for automotive applications (used to detect the position of the crankshaft in the engine). We also need path-finding that works with whatever knowledge of the environment the robot gains from its own sensor array, so what are our options for sensors? One obvious sensor comes from the communication system: it should be reasonably possible to have the rover triangulate on its home, the carrier it came in on, based on signal strengths from two antennas on the base unit. It may also be possible to have it receive a signal from the orbiter for additional location clues, but that requires the orbiter to know its own location.
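Here's roughly what the dead-reckoning piece looks like, as a sketch. The wheel radius, encoder resolution, and wheel base below are made-up numbers for illustration; the update itself is just standard differential-drive odometry, nothing rover-specific.

```python
import math

WHEEL_RADIUS_M = 0.10  # assumed wheel radius, for illustration only
TICKS_PER_REV = 512    # assumed encoder ticks per wheel revolution
WHEEL_BASE_M = 0.40    # assumed distance between left and right wheels

def dead_reckon(pose, left_ticks, right_ticks):
    """Update an (x, y, heading) pose from wheel-encoder tick counts."""
    x, y, theta = pose
    per_tick = 2 * math.pi * WHEEL_RADIUS_M / TICKS_PER_REV
    d_left = left_ticks * per_tick
    d_right = right_ticks * per_tick
    d_center = (d_left + d_right) / 2.0          # distance moved by the body
    d_theta = (d_right - d_left) / WHEEL_BASE_M  # change in heading
    # Small-step approximation: translate along the old heading, then turn.
    x += d_center * math.cos(theta)
    y += d_center * math.sin(theta)
    theta += d_theta
    return (x, y, theta)
```

Equal tick counts drive straight; opposite counts turn in place. Dead reckoning drifts, of course, which is why we'd want the triangulation and other sensors to correct it.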
Google doesn't specify on their page whether the initial 500m movement is a radial distance or an actual path distance, nor whether it has to happen on one charge. I'm already thinking we should copy the mars rover system as much as we can: a rechargeable power supply plus solar panels to recharge it. Ultracapacitors are interesting.
Communications are something I talked about in that other post, so I'll just refer to that. We do need to take inventory of what software already exists that'll handle the relaying, and what hardware exists that'll work with whatever kernel we use (probably linux, but let's not rule out any of the other good open source kernels).
Speaking of kernels, it probably goes without saying that we should stick to open source software.
So what this boils down to for programming is this:
* Video streaming capabilities, even if we're just encoding and streaming still pictures taken at 2-3 fps. (if it requires too much power to do this encoding on the rover, we can move it to the base, the orbiter, or the earth station)
* Best possible movement/range. For programming this is largely a navigation problem: making sure we don't drive down a steep slope and end up unable to recover.
* Sending other data back to earth somehow. I'm interested in having it carry up a package of armagetron that we can then download, making it the first software download from the moon.
* Package management and software updater. This has to be able to update/upgrade onboard software without requiring on-site human intervention to finish the job. Personally, I was very impressed with the feisty->gutsy upgrade and am inclined towards apt-get and ubuntu's software updater for that reason. I think we'll be able to simplify our packages to the point where upgrading our rover is several orders of magnitude easier than upgrading a desktop distribution.
* Simulator. The simulator is to allow us to code the robot control stuff without developing actual hardware. We still have to build the hardware sooner or later, of course. I found an open source mechanics simulator a while back that was built by the army; I'll try to dig it up. Otherwise, I think we might have to build our own. Maybe there's an existing game engine we can use? There is a tank program game out there whose name escapes me, but I suspect we'll need a more advanced simulation than that. What's most important in the simulator, imo, is a good terrain and gravity simulation.
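To make the navigation item above concrete, here's a sketch of the kind of path-finding I mean, assuming we represent terrain as a simple height grid (that representation is my assumption, not a decision). It's plain A* with one extra rule: refuse any move across a height difference above a threshold, so the planner never routes us down a slope we can't climb back out of.

```python
import heapq

def astar(grid, start, goal, max_step=1):
    """A* over a height grid of rows x cols cells.

    Moves between 4-neighbors only, and only when the height difference
    between the two cells is at most max_step (the "steep slope" guard).
    Returns the path as a list of (row, col) cells, or None if no safe
    route exists.
    """
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan-distance heuristic, admissible on a 4-grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    frontier = [(h(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                if abs(grid[nr][nc] - grid[r][c]) <= max_step:
                    heapq.heappush(frontier,
                                   (cost + 1 + h((nr, nc)), cost + 1,
                                    (nr, nc), path + [(nr, nc)]))
    return None  # no safe route found
```

The real planner would weight moves by slope and power cost rather than treating every step as 1, but the shape of the problem is the same, and this is exactly the kind of thing we can exercise in the simulator long before we have hardware.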
For the hardware itself, I don't know what all to specify. Obviously we have to worry about surviving the launch, surviving the landing, and then surviving falls and stuff. We'll have to account for as many failure modes as possible and build in general mechanisms for recovering.
So, first and immediate goals:
* Email questions to google (we should talk more because we'll come up with more questions, no doubt)
* Find/start the simulator. This needs to come with the api I mentioned previously.
* Research sensor options. Find out what NASA used on the rovers, what other off-the-shelf components are available, power requirements, PC interfaces, etc. The simulator should only provide sensory input from sensors we could actually put on the rover, but within that, we should make it provide whatever sounds reasonable, and write our code so it doesn't depend on any particular combination of sensors, as much as possible.
* Research communication hardware. Find out what NASA used on the mars rovers. What we really need to know here are power requirements, range (probably as a function of available power), and open source kernel support. Otherwise, as long as there's IP at the core, we can use any open source software we want for video streaming and other data transmissions. So wifi is an option, at least, and there may be others. IP by carrier pigeon is definitely not an option unless you want to build a spacesuit for a pigeon.
* Languages. I'd just as soon specify python as the language, but let's find out what else is available. Size is going to be at a bit of a premium, but I'd hate to use C++ only to find out later we could've used Ada. So get a list of available, mature languages, the size of their interpreters, etc. What's actually most important here isn't interpreter size; I just want to feel like we looked over the possibilities pretty well.
Next set of goals:
* Specify sensors and code simulator to provide the expected sensory input.
* Load whatever lunar surface images we can get into the simulator. It wouldn't surprise me if google has GIS data for the moon available for us on request. We need to test in the simulator for many landing points on the moon.
* Turn simulator testing into an automated test setup, similar to unit tests themselves run from a buildbot. When we move into actual hardware testing, we'll want the simulator to act as regression tester.
* Make sure we have unit tests for the code, written from the beginning. I've never done unit testing, believe it or not, so I'm at something of a loss here.
* Establish a hardware development path that gets us a working test platform we can take out into the street and test with, and ends with a complete rover.
* Establish target communication hardware, if at all possible. This might wind up taking quite a while, but I'm going to set an early, ambitious target for this. The fact is, as long as what we end up with has an IP layer, we're fine, even if we have to write a kernel module for it.
* Research writing kernel modules. There's absolutely no way we can avoid this. I'd like a comparison, if possible, of writing linux kernel modules, openbsd/netbsd/freebsd, and any of the other open source kernels that are mature enough for this kind of application.
* Research computer hardware used by NASA and other projects that involve high acceleration. We need boards that'll survive the launch, after all.
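On the unit-testing item above: for anyone who hasn't seen it, this is what python's built-in unittest style looks like. clamp_speed is a made-up toy function purely to show the shape; each test checks one small behavior.

```python
import unittest

def clamp_speed(requested, maximum=0.05):
    """Limit a commanded speed (m/s) to what the motors can safely do.

    Hypothetical rover function, here only to give the tests something
    to exercise.
    """
    return max(-maximum, min(maximum, requested))

class ClampSpeedTest(unittest.TestCase):
    def test_within_limit_passes_through(self):
        self.assertEqual(clamp_speed(0.03), 0.03)

    def test_over_limit_is_clamped(self):
        self.assertEqual(clamp_speed(1.0), 0.05)

    def test_reverse_is_clamped_too(self):
        self.assertEqual(clamp_speed(-1.0), -0.05)
```

You'd run the whole suite with `python -m unittest`. Hooking exactly this kind of suite up to a buildbot is what the regression-testing item is about.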
That's it for now, I'm pretty sleepy, hope I didn't miss anything. Did I?

Anybody want to start claiming tasks? Are there approaches that I've missed? I've never tried to build a robot before, so maybe there are obvious approaches that I've totally missed.