I was testing out the nightly MakeHuman package, which I downloaded from svn. It compiles on Linux and runs fairly well. It is a work in progress and seems to change every day, and it is actually better than it used to be. Looking at the source, it has a core of C that uses OpenGL and SDL to do the work for compatibility's sake; the rest is done in Python and seems to be fairly well structured. It exports to MHX format and will load in Blender (I had to fiddle with version numbers to make it go, but that changes all the time and version issues are everywhere; it will settle down eventually. I am also using nightly Blender, so that adds to the confusion).

I opened one of the models in Blender, went into Edit Mode with Tab, selected all vertices, and opened a UV window. It was odd that the "naughty bits" were not mapped to anything. The model did have rigging, and it has some sort of speech-movement connection which I don't understand yet. In any case it is very well rigged, and I decided to add back the most naughty bit of all, which is the brain. It seems to me that I can craft a Python script that uses decision trees and AI to control the model to walk and interact with objects. I suppose I need a little physics there and a way to make Blender create situations that the model can learn from, like creating blocks with gravity or a way for the model to interact with other virtual objects.

It also seems that the complexity of MakeHuman is well along toward a model that could be used to map a genotype to a phenotype from DNA clues. It certainly seems to get ethnic variation correct in application, and I will have to look deeper into the parametrics to see if something jumps out at me.
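To make the tree-controlled-model idea concrete, here is a minimal sketch in plain Python (no Blender API involved) of how a hand-built decision tree could pick the character's next action from a simple world state. All the names here (WorldState, choose_action, the action strings) are my own hypothetical illustrations, not anything in MakeHuman or Blender:

```python
# Sketch: a tiny hand-built decision tree that chooses an action for a
# character based on its surroundings. Hypothetical names throughout;
# a real version would drive Blender's animation system from these actions.
from dataclasses import dataclass

@dataclass
class WorldState:
    distance_to_block: float  # distance to the nearest block, in metres
    carrying_block: bool      # whether the character already holds a block

def choose_action(state: WorldState) -> str:
    """Walk toward a block, pick it up, then carry it off: one small tree."""
    if state.carrying_block:
        return "walk_to_goal"
    if state.distance_to_block > 0.5:
        return "walk_to_block"
    return "pick_up_block"

print(choose_action(WorldState(distance_to_block=2.0, carrying_block=False)))
# -> walk_to_block
```

The appeal of the tree form is that a learning step could grow or reshuffle these branches from the simulated situations (blocks with gravity, other objects) rather than having them hand-coded.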
The image is a UV map applied to a sphere, with some GIMP effects such as fractals and lava added. UV texturing is actually simple once you have a good mental model of its operation: you mark seams on the model and do a series of unwraps. An interesting new thing in Blender 2.6 is the ability to see stretch (stress) on the UV map; it is toggled in the panel that pops up when you press "N" in the UV sub-window.
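For anyone wondering what "a UV map to a sphere" means numerically, here is a small sketch of the standard spherical (equirectangular) projection, which maps each point on the unit sphere to (u, v) texture coordinates in [0, 1]. This is the conceptual idea behind a sphere-projection unwrap; the function name is my own:

```python
# Sketch of spherical UV projection: longitude drives u, latitude drives v.
# This is the general equirectangular mapping, not Blender's exact code.
import math

def sphere_uv(x: float, y: float, z: float):
    """Map a point on the unit sphere to (u, v) texture coordinates in [0, 1]."""
    u = 0.5 + math.atan2(y, x) / (2.0 * math.pi)  # longitude -> u
    v = 0.5 + math.asin(z) / math.pi              # latitude  -> v
    return u, v

print(sphere_uv(1.0, 0.0, 0.0))  # equator, prime meridian -> (0.5, 0.5)
```

The seams you mark on the model decide where this wrap gets cut open so the texture can lie flat, which is why seam placement is the part worth practicing.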