Threading the snakes to make a universe

The point of this is to test PyCurl with an innocuous list of files. The unique addition to what is otherwise a common PyCurl example is the first quoted line: I extract the file name so I can save the download under the same name on my computer. The web address is just an example that occurred to me while I was working through the GTK Python material. The process is to write the addresses to fetch into a file, then run the PyCurl code on that file with the number of concurrent threads I want to use for the bandwidth.
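The PyCurl side isn't shown in this post, so here is a minimal sketch of the same workflow (read a URL list, download with N concurrent threads, name each file by its basename) using only the standard library in place of PyCurl's multi interface. The function names `fetch` and `fetch_all` and the `fetch_fn` hook are my own illustration, not anything from the original code:

```python
import os
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen  # stand-in for PyCurl in this sketch


def fetch(url, fetch_fn=None):
    """Download one URL into a local file named by its basename."""
    bn = os.path.basename(url)  # same trick as the quoted line
    data = (fetch_fn or (lambda u: urlopen(u).read()))(url)
    with open(bn, 'wb') as f:
        f.write(data)
    return bn


def fetch_all(urls_file, threads=4, fetch_fn=None):
    """Read one URL per line and download them with `threads` workers."""
    with open(urls_file) as f:
        urls = [line.strip() for line in f if line.strip()]
    with ThreadPoolExecutor(max_workers=threads) as pool:
        return list(pool.map(lambda u: fetch(u, fetch_fn), urls))
```

The `fetch_fn` parameter just makes the downloader pluggable for testing; with PyCurl you would instead hand the URL list to a `CurlMulti` loop.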

The ultimate goal is to automate every aspect of creating a four-dimensional causal map of the universe from the available databases. It becomes a lobe of Alice Infinity's brain. It takes an identity such as a star position, galaxy, black hole, GRB, imploding neutron star, collapsing dwarf, or other object or effect, plots it against the four-dimensional map of the near universe and its intervening space, determines the causal relationships of the parts, and then examines those relationships for anomalies. An example might be a massive sunspot which irradiates a planet or nebula, and it is possible to get a spectral match for plutonium. That would be an anomaly, as it is not a naturally occurring element in free space. I doubt that I would be that lucky ( and they that unlucky ) to see a planet with nukes that got zapped by a GRB or a massive flare, but that is just one possible anomaly that serves as an example.

In addition, it correlates the 4-space sequence from a hypothesized Big Bang to the present and maps the parameters as 1. expanding, 2. static, 3. decaying, and so forth, to identify the most likely fit for the data. This then comes out in the end as something like "Hey look at 30° up 15° right, there is %s there!" % ('Plutonium') and "The data provides an 80% likely undefined Universe scale."
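Those report strings use Python's %-formatting, and there is one small gotcha worth noting: a literal percent sign in a formatted string has to be doubled as %%. A quick sketch, with the angles and the 80% figure as made-up placeholder values:

```python
# Producing report lines like the ones above with %-formatting.
# The values here are placeholders, not real survey output.
spot = "Hey look at %d° up %d° right, there is %s there!" % (30, 15, 'Plutonium')

# A literal '%' must be written as '%%' inside a %-formatted string.
fit = "The data provides an %d%% likely undefined Universe scale." % 80
```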

    bn= os.path.basename(url)
    #!/usr/bin/env python
    # -*- coding: utf-8 -*-
    """Extract list of URLs in a web page"""
    from sgmllib import SGMLParser

    class URLLister(SGMLParser):
        def reset(self):
            SGMLParser.reset(self)
            self.urls = []

        def start_a(self, attrs):
            # Collect every href attribute from <a> tags.
            href = [v for k, v in attrs if k == 'href']
            if href:
                self.urls.extend(href)

    if __name__ == "__main__":
        import urllib
        usock = urllib.urlopen("")
        parser = URLLister()
        parser.feed(usock.read())
        parser.close()
        usock.close()
        fo = open('urls.txt', 'w')
        for url in parser.urls:
            # Keep only the image links.
            if '.png' in url or '.jpg' in url or '.tif' in url or '.xpm' in url:
                print url
                fo.write(url)
                fo.write('\n')
        fo.close()
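For what it's worth, `sgmllib` was removed in Python 3, so the same URL lister would need the stdlib `html.parser` module there. A rough sketch of the equivalent (the sample HTML string is my own, just to show the filtering):

```python
from html.parser import HTMLParser


class URLLister(HTMLParser):
    """Python 3 rewrite of the sgmllib URLLister: collect hrefs of <a> tags."""

    def reset(self):
        HTMLParser.reset(self)
        self.urls = []

    def handle_starttag(self, tag, attrs):
        if tag == 'a':
            self.urls.extend(v for k, v in attrs if k == 'href')


parser = URLLister()
parser.feed('<a href="x.png">x</a> <a href="notes.txt">y</a>')
parser.close()

# Same image-extension filter as the original script.
picture_urls = [u for u in parser.urls
                if any(ext in u for ext in ('.png', '.jpg', '.tif', '.xpm'))]
```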


Automated Intelligence

Mission of the infinite LOL cats