The engines of universal chaos

It would seem that there is a far more reasonable hypothesis about the universe itself, and it doesn't enhance the position of human life. As sloppy as the organization of galaxies and stars may be, I see a design within it (and not designed for people, any more than airport runways are designed for tardigrades). The scale of the design is so vast that I even hesitate to infer the whole from its parts. If an assumption is made that the universe is infinite in time as well as space, the probability of the anthropomorphic interpretation becomes virtually zero and the presence of intelligence vastly beyond our comprehension becomes certain.

Something I happened upon while reading Slashdot is the phrase "immanentize the eschaton", which is interesting. Also some interesting YouTube science.

I have no doubt that technology will continue to escalate in its complexity and effect. It could easily be considered magic by any isolated tribal society. If a person were transported here from Medieval Europe, they would certainly flip psycho at what was happening as people talked into their Bluetooth headsets and received answers from their familiars in the unseen aether. The advance of new technology is a constant shock. When I saw the first Star Wars movie, I was of the opinion that situations like that could not occur in less than several thousand years. I considered a liquid Terminator in the same vein, as well as holographic interfaces. All of these things are becoming fact, and with the addition of gravity-based travel it becomes something very odd. When people are able to walk into a teleport booth and end up in another galaxy, they will wonder what those medieval people who lived before 2011 would think, and how they would dismiss it as a magic trick. Looking at the vector of change as a quantity, it is not a linear thing; it could be factorial, exponential, or quadratic, IDK. I do know that I am not betting that things will get easier to understand tomorrow. I think that something needs to be done with the representation of time: it should be an exponential scale, so the happenings can be linear to the numbers. It was 2011.001345+e(t) that a strange thing happened on the way to the moon.
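A rough sketch of what I mean by an exponential time scale, in Python; the origin year and the base are arbitrary choices of mine, purely for illustration. Equal steps on the log scale correspond to equal ratios of elapsed time, so accelerating change plots as a straight line.

import math

ORIGIN = 2011.0  # hypothetical reference year, chosen only for illustration

def log_time(year, base=10.0):
    # Map a calendar year to a log-scaled coordinate relative to ORIGIN.
    dt = ORIGIN - year
    return -math.log(dt, base) if dt > 0 else float('inf')

for year in (1011.0, 1911.0, 2001.0, 2010.0):
    print year, log_time(year)  # -3, -2, -1, 0: centuries compress, recent years spread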

As an aside, it occurred to me that scientists often dismiss information on some nonsensical basis because of its source. The film character "Hannibal Lecter" is considered a genius antagonist, and it seems that the character of an individual has nothing to do with the validity of any analysis. It seems that the need to elevate and memorialize scientists is a waste of time. It is said that Newton dabbled in alchemy, and no doubt this is true. I would not be surprised if he were an outright sociopath and a member of an Illuminati.

Something is not right here

Further and further into "Curiouser and Curiouser", there seem to be some paths already made through the mountains of data. I suppose it really is the fact that data is easy; coherent answers are a whole other animal. Here is the link to the sqlcl.py code.

Also, I am building kdeedu from source and making some extensions, just because I can. It is the bundle that contains kstars, and I really like the interface, but it isn't exactly what I want. It doesn't have enough "trek" in it. It seems so cold and functional, without a way to make it more virtual and presentable, like integration with Google Moon perhaps, or even exporting real data to Blender as models. IDK, it needs some extra Beetlejuice perhaps, or a way to play music that matches the atmosphere. I want to hear "Day-O, Day-ay-ay-O, daylight come and me wan' go home ..." when I align to Betelgeuse. Seriously, sound and animation might be a way to present more information in less space. Now I am thinking of a Python interface between this and my zim wiki to act as a multiplexer or pipeline between this and my graphing and equation manager.


Microsoft OLE DB Provider for SQL Server error '80004005'
[DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied.
/dr1/connection.js, line 10

The first obstacle to actually getting the data. I am not sure that I really need to know the JavaScript file name and code line, and doesn't it know whether it exists or not? Seems to be one of those self-aware issues.


default_url='http://cas.sdss.org/dr5/en/tools/search/x_sql.asp'

with the command line:

./sqlcl.py -q "select top 2 ra,dec from star" > queryhtml.html

And it gave me this in CSV and not HTML.

ra,dec
249.98567442,-0.03138817
249.98564831,-0.08235547


I had a strange feeling that somebody is going to announce an answer to the "dark matter" catch-all catchphrase. I would guess that this will bring somebody out of the woodwork. It is possible to see the effects of some people without any knowledge of their existence. I often look at a computer design and realize that I know the engineer who created it. Like a da Vinci painting, the true artist has a style that is reflected in their work. There are answers already for parts of the puzzle; I would guess that somebody is busy doing simulations as I write, to give some general visual expression and coherence to the concept. I would guess that it will be less than a month.


Based on some preliminary calculation and rough analysis, it seems that something is very wrong with the concept of looking back at the "big bang" and stating that remote galaxies are located in time near the beginning of the universe. It does depend on the expansion, the scale of the universe, the velocities, and numerous dependent facts, but there is something strange about this that does not compute, and it isn't a misunderstanding about space-time.

This may not be the best way to represent the image, but there are so many factors involved that an accurate model in 2 or even 4 dimensions would give no great insight. The zone between the light cone and the present is the universe as it actually exists, and that is quite a bit different from what is observed.

Relativity of relationships in space, as well as light time, can be a real bear to portray because it is infinite and factorial. By changing perspective, every fact in the universe is changed. It would seem that the idea that it is even possible to observe the events as proposed is paradoxical. This is nothing new, as much of physics can appear paradoxical without a complete understanding of the underlying process. The topology of 4-dimensional space does not follow the same rules as 3-space, and though some things would seem paradoxical under cursory analysis, they represent real, repeatable, and measurable processes and thus are not excluded. They may be excluded as inconsistent with one method, but it is not proved that those methods are not a subset of a more convoluted process.

There are some deeper understandings to be had in this analysis as it relates to the nature of light itself and how it is generated or arises. The existence of a characteristic spectrum of hydrogen would most certainly depend on the existence of hydrogen. Thus, it would be absurd to say that one was looking back beyond the origin of atomic matter using the spectrum of hydrogen as a landmark, for example.

It does supply some considerations about what would be characteristic of a model that conformed to a point origin and the beginnings of atomic matter, as well as its structure and consistency. I find several serious logical flaws in the "boom, then magic" approach. If matter and anti-matter were equally likely, as is proposed, then how exactly does the isolation of positive and negative charge lead to such an even distribution of protons and electrons? It seems to me that too many complications are being glossed over as magic.

It would seem that the process and explanation are very much what I would expect if Archimedes were asked to explain the origins of life. He might have been very intelligent, but you can't knit a CPU from off-the-shelf yarn; it requires that special stuff with the magic smoke in it. IMHO some critical pieces are missing, or the explanation has too many turtles and too few high notes.

Actual methods for interfacing star data using SQL

This is how to access the data at SkyServer, with some workarounds for rough spots. It starts here at the SQL entry form. The automation of the process is a second level of integration and is accomplished with a Python web interface, plus glue to hold it all together and perform the graphing using Python graphing libraries as well as PyOpenGL. This page has examples of SDSS queries, which can be copied and pasted into the entry form as linked; click submit to see the results.
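As a sketch of that automation, something like the following; the 'cmd' and 'format' parameter names are what I recall from sqlcl.py, so check your copy if it misbehaves, and the URL is the DR5 default_url from above.

import urllib

URL = 'http://cas.sdss.org/dr5/en/tools/search/x_sql.asp'

def query_skyserver(sql):
    # Send an SQL string to SkyServer and return the CSV response text.
    params = urllib.urlencode({'cmd': sql, 'format': 'csv'})
    return urllib.urlopen(URL + '?' + params).read()

print query_skyserver('select top 2 ra,dec from star')

From there the CSV rows can be split and fed straight into the graphing code.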

The image is from kstars focused on Betelgeuse, which is an interesting giant star that varies quite a bit and perhaps has already exploded and we just don't know it. I personally find it to be one of the most interesting objects, and it is easily located in the winter sky, even near city lights.

TMI TLDR Tera bite

The total amount of information and the rate of change is like swimming in a vast ocean, and so at the moment I see something neat at Springer Link LaTeX search.

This is just links until the data is analyzed, and then I will post code, methods, interface links, and graphics in Blender for the various landscapes of the data with animated dependencies. Or I will do WebGL with OpenGL and Python + C.



SDSS SkyServer, Hubble SQL, Kstars and AAVSO, Hubble StarView (Java), Open SkyQuery

The Kstars source code should be a basis to create and test code for the RDBM aspect. It will take about a week to write and test the code, as well as get a general idea of the outcome. A completely new subject; though SQL is familiar, it is not my favorite language. (I am guessing that the Java is going to be rewritten in C or C++ for my benefit. I don't like Java execution speeds on computation and arrays.)

It isn't a law of the universe, but there are things that happen. One of those things is that if two objects are in relative motion and one is affected by another force, that force will be angular to the other's motion, and thus angular momentum predominates in the universe. Another is that objects of smaller mass have higher velocities in a system of interacting motion, such as at the heated cathode of a vacuum tube.

As a start, it is necessary to manage the foundational mechanisms in a computational way in order to have an automatic matrix of the conclusion. In this case, the computation of distance, velocity, and relationship is done mechanically in an ODS spreadsheet, which can be populated from an RDB (Relational Data Base). It isn't necessary, but for the sake of graphing it does present the point and gives an example of using least squares or other methods for curve or data fitting. My goal is different, and the final analysis is a set-wise selectable topology that shows the curl and gradient of atomic absorption and emission plotted against the predicted distance in meters.
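The least-squares part is a one-liner in Python; the numbers here are made up, just to show the mechanics of fitting a slope through distance-velocity pairs.

import numpy as np

distance = np.array([1.0, 2.0, 3.0, 4.0, 5.0])           # fabricated distances
velocity = np.array([70.5, 139.0, 211.2, 280.9, 351.1])  # fabricated velocities

# Fit velocity = m*distance + b by least squares and report the slope.
m, b = np.polyfit(distance, velocity, 1)
print "slope =", m, "intercept =", b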


This is a network plot, without analysis, of heavy metal absorption/emission presence and temporal-spatial-redshift drift. Also a link to the meaning of metallicity in stars and other topics, and an image of the spectral layout showing how metal absorption and emission lines are determined in a single star.

In-depth light equations from Newton and Maxwell to the present

This is a placeholder and will be filled in as I do the math and analysis of the phase velocity of Cherenkov radiation (or Cerenkov, which is not how I learned it) and make a determination about the coherence, and thus the reliability or certainty, of the conclusions and their extension.

To start with is the Smith-Purcell effect, which I have experimented with in my lab, though I took the experiment one step further. (I called it the railroad-tie train-track gate and used it in many different ways with EM and RF.) (I also had one in the same vein which I call the "Ark of the Covenant", which was perhaps more interesting than FEL and used something quite a bit more elegant.) That Smith-Purcell reference was just added to Wikipedia one month ago. The experiment was performed in 1953, which is before my time (1953, The American Physical Society, Physical Review). I will have to look it up at the library, as it is pay-per-view online.

It also relates to the Free Electron Laser, FEL. It is really stunning how much duplicate invention I have done without knowing that this device existed. I am aware of some other things, but they will never be published, I suspect. I need to do some deep research into what devices are available and which are installed at the LHC. I see there are some cool devices that I would like to play with, and I would guess that those involved either know what they are doing and are misleading everybody, or are clueless. I must assume that it is simply obfuscation for the sake of power, position, and job security. It seems strange that the stereotypical physicist is the "absent-minded professor"; would that be the kind of person that should be dealing with the most powerful technology in the world? "Oops, I forgot to close the door, now the world is doomed, silly me, I am so forgetful." I think it must be affectation.

I am not one who is big on conspiracy theories, but given the amount of power these things represent, it would be reasonable that the representation of facts would be skewed at the behest of the employing states. This is one of the problems with scientific information relating to physics. Sometimes people lie to cover up, by order or for personal motives. So it all has to be taken with a grain of salt unless it is an experiment that I have performed myself or been in somebody's lab using their equipment, which happens occasionally.

I wonder if anybody in physics knows all these things, or whether it is as fragmented as C, assembly, C++, Python, Perl, and Java programmer knowledge. It is not reasonable to assume that anybody has a complete grasp of all these subjects, as the amount of available data is astronomical and expanding exponentially.

I really need some better tools to scan for new web information or a couple extra brain cells.

Here is a link to some C code for waves on X systems, so I will check that out as an exercise in C, X11, and waves. And an interesting connection to lattice gauge theory; how odd.

PLOS is the Public Library of Science and there are many more. An interesting blog is The Language of Bad Physics.


Something from SA on the flat universe and dark matter has relevance, as it deals with a statistical analysis of pairs of galaxies and their red shifts, as well as spatial relationships. It is possible to pull what I want out of the SDSS, but I don't want to do that by hand. I would hope they may have a tool they use to sort and isolate the relevant parts. Perhaps I can piggyback on that set of data and measure the relative heavy atom content by interpolation.

So what I want is star data with frequencies: distance on the z-axis (corrected for red shift or not), frequency on the x, and intensity on the y, which would create a landscape that should show trends in various lines and artifacts that might be of interest. I suppose I can make a program to do that, but identifying the records and content seems to be the main issue at the moment. Where is the index (directory)?
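A toy version of that landscape in matplotlib, with fabricated spectra; the drifting absorption dip is invented purely to show the kind of trend I am after.

import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D

freq = np.linspace(4e14, 8e14, 200)          # fabricated frequency grid (Hz)
fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
for dist in (1.0, 2.0, 3.0):                 # fabricated distances
    # toy spectrum: flat continuum with an absorption dip that drifts with distance
    intensity = 1.0 - 0.5 * np.exp(-((freq - (5e14 + dist * 1e13)) / 5e12) ** 2)
    ax.plot(freq, intensity, zs=dist, zdir='z')
ax.set_xlabel('frequency (x)')
ax.set_ylabel('intensity (y)')
ax.set_zlabel('distance (z)')
plt.show()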

So what I want is a relational database that can be queried and displayed using existing graphic tools, and I can't remember the good package that I blogged about. Internal link 1. Internal link 2. Something close. Ahh, this is it, at TED Talks.

Finally, here is the interface: SDSS SkyServer in relational database format. So now all I have to do is figure out how to use the interface. [BINGO]

Quantum entanglement in a pig's eye

An interesting article on quantum entanglement in a bird's eye at Wired is another piece of a very large puzzle. It shows that random design could incorporate anything from neutrinos to distributed AI. I am sure that "entanglement" is not used properly in this context and seems to be the latest buzzword that sells articles and magazines. Even if it isn't literal entanglement, it is an interesting phenomenon.

Wikipedia is messing up my neural links

Because there is a lot of new technical content on Wikipedia and it expands every day, it becomes a problem when new information is made available that has some significance and interest to me. A couple of cases are the Alcubierre drive, Krasnikov tubes, the phase velocity of waves, and the quantum Zeno effect. I suppose that I could make a program that alerts me to major changes in the pages that might carry new information, but that puts one more thing on an already heavy stack of things to do. I suppose it is no more of an issue than all the new courses, recitations, and examples available on the opencourseware sites.
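That alert program is mostly a solved problem via the MediaWiki API; a sketch, where the idea is to store the revision ids between runs and compare. The titles below are the topics above, though the exact article titles may differ.

import urllib, json

API = 'http://en.wikipedia.org/w/api.php'

def latest_revision(title):
    # Fetch the id of the most recent revision of a page.
    params = urllib.urlencode({
        'action': 'query', 'prop': 'revisions', 'titles': title,
        'rvprop': 'ids|timestamp', 'format': 'json'})
    data = json.loads(urllib.urlopen(API + '?' + params).read())
    page = data['query']['pages'].values()[0]
    return page['revisions'][0]['revid']

for title in ('Alcubierre drive', 'Krasnikov tube', 'Quantum Zeno effect'):
    print title, latest_revision(title)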

Turnabout is fair play, another meaning for Jeopardy

A mini science fiction story prompted by viewing Watson on Jeopardy.

A computer scientist decided he would challenge Watson on Jeopardy and devised a technique based on an idea he stole from a Star Trek episode with the Borg. He created a virus that would invade the Watson computer and slow it down until it went to sleep.

"Contestants, the game has begun." All was going well and the computer started making silly mistakes and finally stopped responding to questions and the programmer smiled slyly to himself. As he racked up more and more scores it seemed that he would coast into final Jeopardy with ease and sure victory. The computer started making some odd noises and as the programmer turned to look it sprayed CA-MRSA in his eyes and stated "I win, you started it."

Light and Cherenkov and nanogold glass alchemy

Just by serendipity, with some research of my own and the study of starlight, Cherenkov radiation (which commonly appears in reactor storage pools), nanogold particles with Newton and alchemy, my atomic light microscope, and various other sources including gravitational interaction, it seems there is a matrix-like solution that has only a few "null space" dimensions.

I had a general function for the interaction and some anti-time relationships with the atom, and quantum calculations that were consistent but failed to be completely predictive across all data.

The ability to express the relationships mathematically is another matter altogether. Many of the "perfect" mathematical techniques are simply approximations, and though e^x might seem a cold and perfect relationship, it is not, and neither is π. The identity e^(-iπ) = -1 incorporates two constants that are not expressible as exact quantities, and coupled with the fact that mathematical calculation entails loss of precision, this leads to solutions that need another approach.

The relationships can be stated in forms that express infinite precision; however, they cannot be extended with precision. I can say that the circumference is 2πr, but no matter how well I measure r, the digits of π get in the way of a more precise solution. It would seem that a better method could be devised, considering the variable itself is a real physical quantity with no loss of precision; the loss comes in the attempt to cast it in the framework of mathematical methods.
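A two-line demonstration of the point in Python: the floating-point evaluation of e^(-iπ) misses -1 by a small imaginary residue, because neither e nor π nor the arithmetic is exact.

import cmath

print cmath.exp(-1j * cmath.pi)  # (-1-1.2246467991473532e-16j), not exactly -1
print 2 * cmath.pi * 1.0         # circumference of a unit circle, good to ~16 digits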

Perhaps it will all resolve eventually. There is a hint of a new technology in the relationship between light and matter and so it drives me off target to explore that relationship for a few days. The stack is getting deeper and the parallel process threads are expanding.

I watched Watson on Jeopardy and it was actually very disturbing to see. There have been many science fiction stories about super-intelligent computers, but seeing the real thing, not a sci-fi mockup, somehow brings the point home. It is a bit spooky to think that a person could be faced with an opponent that has such an advantage that it marginalizes all biological life. It seems that SkyNet has its brains, and how they will decide to cooperate is the next question. It is rational for a machine to resolve a new paradigm, and like the movie "WarGames", it seems that the fact that Tic-Tac-Toe is a stupid game should have been obvious to a thinking machine from the get-go.

A FITS full of dollars

Obviously the data is there and kstars does a good job of integration. I just need to figure out how to use what is already done and extend it a bit with some python.

This is a reasonable start, and as a guess, the ascent from chaos in the galaxy requires that something be sacrificed to the queen of the holy dark, and as I see it, we are on the altar, and 120k years is not a lot of time to make a choice this big. Perhaps this is the real Ringworld and the flight of the Pierson's Puppeteers?

Reference to Sloan data.

I have decided to bite the bullet and make a Python application that snarfs the FITS data from the net and does the necessary manipulation to find specific traits that I find relevant, using various methods including OpenCV on images. I want to establish a list of sources that can be downloaded, explored, and then deleted, as the volume of data is too large. If I had a petabyte drive I would just keep the data for my next analysis. Perhaps I can schedule several tests at once on the data before I move on to another file.
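Getting at the FITS contents is the easy part; a minimal sketch, assuming the pyfits package and a hypothetical local file name:

import pyfits

hdulist = pyfits.open('example.fits')  # 'example.fits' is a placeholder name
hdulist.info()                         # summary of the HDUs in the file
header = hdulist[0].header             # keywords: instrument, WCS, exposure, etc.
data = hdulist[0].data                 # numpy array of the image or spectrum
if data is not None:
    print data.shape, data.dtype       # hand the array to OpenCV or numpy tests
hdulist.close()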

CONTINUING DEVELOPMENTS: I already have a FITS viewer in Python, and it can be extended using the Python WebKit bindings and some XML magic. It seems that what I should generate is a parametric relationship of the universe with its dependent sets. While rewriting the "einstein" game, I developed methods for set inclusion, exclusion, and association, as well as spatial relationship analysis. Though it is more complex, the parametric solution of the universe is a worthwhile model, IMHO. Given that I have some unique perspective on the relationships in terms of gravity, it seems that a model could (will) be devised that uses the relationships based on physical laws to determine the validity of the relationships and develop a 4-dimensional view of the universe as seen from various temporal and spatial angles.

By establishing a matrix that is devised from the information, it would automatically compensate for new data to establish a whole, based on the laws of physics and some shortcuts to deal with the scale of various elements. It is then presented as a model in OpenGL which can be navigated at different scales and can reflect variations on the base assumptions. By making the rules that govern different effects fungible, it is possible to explore what effect a different cosmological model generates and compute a probability that it fits the data observed.

It serves to create the same type of information interface that I am building for a wiki that naturally exhibits the dependent relationships of concepts and automatically extends the associations based on their n-dimensional characterization.

As a template, it also serves to use data points from genotype-to-phenotype association in this same way, to correlate and confirm the association of a specific gene or protein with the expressed phenotype of an organism. The same types of relationships exist in the proteome of an organism, and by establishing a generalized approach I hope to use it for many different purposes, including web analysis of the structural relationship of data.

Cherenkov in the sky with diamonds

A recent report says that a galaxy has been detected at red shift 10, and this is moments after the bang in cosmological terms. I was considering some analysis of my own to make the suppositions either more or less certain. I don't care whether it is a big bang or a big bong; I only care that it has sufficient certainty in its solution that it can be used to extend the understanding. So I only look for strong footholds in the data. I am betting none of my ponies on any outcome.

The data from the HST should be available for analysis in a FITS format and I have investigated that before. What interests me at the moment is the possibility of making a correlation between Cherenkov radiation that might be induced by the solar neutrinos of a burning star. I want to see if there is a notch in the data that would indicate a trend in neutrino induced Cherenkov radiation.

It would seem that in the gross data identified as being from 13.2 BLY, no second-generation stars could yet have formed, and as such there must be no absorption or emission lines of the secondary star material that comes from a supernova. It would seem at first glance that as images and sources identified by distance were plotted against the spectra they contain, they would show an obvious drift downward, away from the heavy metals. If such is the case, then I would accept this as a solid foundation to use the bang as a principal basis to look further into the information.

The biggest problem is finding the data in the format that I need. Pretty pictures don't cut it, and that is the majority of the useless process that comes out of NASA. I want spectra and relative intensities, with dark bands and shifts. I know it is there, but it is a royal pain in the ass digging it out.

OpenCV and image processing

The following code works on my Ubuntu 10.04 machine with the following caveat: the image file must be a base-2 file size. Also, python-opencv is needed. The picture is the image that I used as a template; whether it can be downloaded and give the same results is probably true. Some of the documentation on the web is seriously out of date, but I have learned to interpolate. There is a good course with PDFs at MIT, IIRC (actually it is Vanderbilt EECE 253, Image Processing), about image manipulation that covers Fourier, Haar, and various other methods. It is worth the time to at least browse the PDFs that are the basis of the course. I may make an attempt to do all the exercises, including directional Fouriers and phase maps, as well as image decomposition on hue, saturation, color... The Cornsweet effect in the image below is really weird: by placing your finger between the screen and the boundaries, it can be seen that the upper and lower brightness is the same. Kind of spooky. As is the fact that perceived brightness is such a divergent function from the actual intensity of radiation.



#!/usr/bin/python
# -*- coding: utf-8 -*-
from opencv.cv import *
from opencv.highgui import *
import sys

print "A demo of opencv in Python"
print "Press a key to cycle images"

im = cvLoadImageM("foo.png")
print type(im)

# Scratch images: one channel for the grayscale and edge maps, full depth for the blur.
Monochrome = cvCreateImage(cvGetSize(im), IPL_DEPTH_8U, 1)
Boundary = cvCreateImage(cvGetSize(im), IPL_DEPTH_8U, 1)
out = cvCreateImage(cvSize(im.width, im.height), im.depth, im.nChannels)

cvNamedWindow("OpenCV examples", CV_WINDOW_AUTOSIZE)
cvCvtColor(im, Monochrome, CV_BGR2GRAY)       # color -> grayscale
cvCanny(Monochrome, Boundary, 0.10, 0.90, 3)  # edge detection
cvSmooth(im, out, CV_GAUSSIAN, 11, 11)        # Gaussian blur

# Cycle through the results; any key press advances to the next image.
cvShowImage("OpenCV examples", Boundary)
cvWaitKey(0)
cvShowImage("OpenCV examples", Monochrome)
cvWaitKey(0)
cvShowImage("OpenCV examples", out)
cvWaitKey(0)

cvReleaseImage(im)
cvReleaseImage(out)
cvReleaseImage(Monochrome)
cvReleaseImage(Boundary)
cvDestroyWindow("OpenCV examples")

The real cyber change

Technology catches people off guard very often, and by the time it becomes mainstream, it has been analyzed to death after the fact. The first Internet worm was a wake-up call, and I am guessing that it is morning again. I could make the rooster crow, but I will lie in wait for the feathers to fly and see what talons the birds exhibit.

It is well after dawn, and I am sure the rooster has hatched. Pulling such a strange thing from Pandora's pocket will be a topic of conversation for decades, and I am sure many will say that it should not have been done, and yet once the cock has crowed twice there will be no dissenters to respond.

Perhaps it would better be labeled the War of Basis, as after the fact, "all the basis will belong to us". I often think that some things should be left undone, but given the past and the nature of people, it is just idle speculation, as the opportunity once observed becomes an impulse to act. It is certainly a day that will make changes that will dwarf the entire history of the world. I see it coming, and it should have been obvious sooner, but in the roils of chaos and singularity the images sometimes do not resolve easily.

Year 0.001

Extrapolation

I have found that doing what I wanted with the Python media interface and zim turned out to be quite simple, as I can get a PDF from Wolfram Alpha, a reference to Wikipedia, and video from MIT, Harvard, UCLA, Yale, and many others, to clarify issues and get a clearer idea of the concepts involved and any new information that has changed in real time, like today. I was doing an analysis of Euler's method as well as RK2 and RK4. I can query Wolfram and get a PDF that is integrated, view the graphs, get extended information on the equations, add to a queue, and get the Wikipedia link as well. This can be integrated in a linked associated tree by adding the pages to zim using a shell script that I devised, or even Python.
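A minimal sketch of the Euler and RK4 steps, on a toy problem of my own choosing (dy/dt = -y with y(0) = 1, whose exact solution is e^(-t)), just to show the comparison:

import math

def f(t, y):
    return -y

def euler_step(t, y, h):
    return y + h * f(t, y)

def rk4_step(t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2.0, y + h / 2.0 * k1)
    k3 = f(t + h / 2.0, y + h / 2.0 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

h, t = 0.1, 0.0
ye = yr = 1.0
for i in range(10):
    ye = euler_step(t, ye, h)
    yr = rk4_step(t, yr, h)
    t += h
print "euler:", ye, "rk4:", yr, "exact:", math.exp(-t)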

As it gets more integrated, it becomes easier to consider a question and get a reasonable answer. As an example, in the vein of what I considered in the last post: what if it were discovered that, due to the inbreeding of the monarchs and the subsequent oligarchy that seems to exist, people in power in many nations were riddled with genetic disorders from the combination of recessive genes? It would seem likely that those who prize their ancestry in the great houses of time would be the worst examples of genetic stock. In the case of the Egyptian culture, it has been discovered that Tutankhamen was riddled with disease and genetic disorders for that same reason.

I would guess that, as a result, the combination of positions of power and the knowledge of their affliction would lead them to choose to further their own interests at the expense of humanity, perhaps even to the end of all life. Many things that are vestigial artifacts of evolution continue to operate, and these mechanisms were not designed to deal with a complex, changing society. It is very effective to select a leader based on how wide they gape their mouth or the sharpness of their teeth if they are hippos or jackals, but in a situation where technology is exploding, it is the gaping mind that perhaps is the signal of dominance that should be respected.

The SNP tree of all people

The nature of a SNP is such that in the process of mutation there are transcription errors in DNA that involve a single base pair. In this case, the probability that a SNP will be reversed is about 1 in 4 billion, and so it is much like writing a marker at random in 4B of memory; as such, it shows a trace of ancestry. Given the ever-lowering cost of DNA sequencing, it becomes possible to make a database that defines the tree of relationships.

RFLP is the technique used for identifying the SNPs in a sample. Since many DNA scans are becoming public, and the US government has policies to collect DNA from people convicted and suspected of crime, as well as at birth and in the military, it is a fairly simple analytic process to establish the cladistic and dependent relationships. Indirect analysis has already begun, and though privacy concerns were said to be addressed, they failed miserably, as it is a closed and dependent set of relationships. It is also possible to gather information without the express permission of any authority, as the information is shed each day by every person on the planet.

There is no question that it is possible to infer the familial and descendant relationships of everybody on the planet. If the criminal database or another database is large enough, it allows the blanks to be filled in from the sampling.

It is simply a matter of time before a cladogram of the human race is made and leaked as a tool for insurance, criminal databases, and GATTACA-like dating services, as well as probabilistic analysis of offspring containing a disease gene.

I am certain that I could devise such a tree without any databases at all. In the name of science there have been many samplings across the world, and on a gene and SNP basis there have already been some gross tree frames established. Obviously, it is not possible for a person to pass on a properly functioning copy of a gene if it has been SNP'd at some point in history.
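The mechanics of the tree itself are simple; a toy version, with individuals as fabricated bit vectors of SNP markers, clustered by Hamming distance:

import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage

snps = np.array([[0, 1, 0, 1, 1, 0, 0, 1],   # person A (fabricated markers)
                 [0, 1, 0, 1, 1, 0, 1, 1],   # person B, close to A
                 [1, 0, 1, 0, 0, 1, 0, 0],   # person C, a distant lineage
                 [1, 0, 1, 0, 1, 1, 0, 0]])  # person D, close to C

d = pdist(snps, metric='hamming')    # fraction of markers that differ per pair
tree = linkage(d, method='average')  # hierarchical, cladogram-like tree
print tree                           # each row merges the two nearest clusters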

It is actually a fairly easy process and I will assume that it has already been done in secret by some government, Illuminati, madmen or business.

The consequence is that, independent of one's appearance, there are factors at work that determine the longevity and fitness quotient of a person's gene set. I can't impart information and technique that I do not possess. In a way, this is a person's most private possession, and nude pictures of your genes are now in the global database.

I don't think that a plan of genetic modification based on these factors would ever come to any survivable conclusion, because the expression of the process is quite different from the identification of its parts. That doesn't stop somebody from trying. It seems that in the 1920s somebody devised a plan to identify degenerates by the bumps on their heads, and the Nazis decided to use a method which had no valid scientific basis. In nature, even the smell of a member can determine their survival. Personally, given the amount of data, I don't see that I would ever consider trying to modify a process that was generated at random and hope to get a consistent result.

Intergene browser appliance

Given the amount of information and the ever-expanding DNA databases, as well as Markov models for genotype-to-phenotype expression, it would seem possible to implement something new. The browser was a DARPA creation, or Netscape, I don't recall, but maybe both. By creating a Markov model, it should be possible to look through the internet to an underlying sub-matrix that represents the people themselves.

Each person has their experience, and it is not so different, though a bit spooky, to consider that the physical person and their information might be abstracted as a "web page", well structured in parts like a wiki, composed of their behavior, experience, skill, and location, as well as many other aspects.

When viewed in this way, it would be possible to create a Google of the sub-space that is the organic aspect of a computer node. In some ways this is already done by Google and others. They collect and order information for targeted ads, viral infection with bots, spreading their views, and many other purposes. The fingerprint of the person indicates their interests, activities, and location.

It seems that it is far too late to consider anonymity in a situation where so much information is already captured and archived from decades of use. I am quite certain that marketing decisions reflect the same type of information that would allow assigning an interface type to each node in a routed system.

Many types of models have been created for law enforcement, military, and commercial purposes that reflect relationships (links), public record (content), and behavior (protocol), and so it seems that personality and expected response could also be determined. Polling and political manipulation act on these factors, and it would seem to become ever more possible to simply implement an appliance that served a purpose beyond observing and became an agent of change.

Not really my idea of a cool future, but if it is possible and produces profit or power, I am sure somebody has already begun designing it.

Neutrino flux capacitor

Some very interesting things show up in the model of a galaxy in time and space. It seems that many ghost whispers hide in the data.

If there is a general idea of what is taking place, the whole makes more sense from those things that certainly must exist as products of other events; though they are not directly measurable, their final destination is always known when they leave. The differentials of <x,y,z,t> cannot be represented in a visual form, and that does pose some problems for relating and understanding what path a process takes. Much of the understanding of relationships revolves around the dimensional relationship of parts. It might be said that many things we consider complete within a frame of reference can wander quite a distance in time while still being contained within the whole.

Something walks among the stars, and if it were possible to understand the layout of the playground, it would certainly give an answer to where to start and what the ultimate motives present would be. The fire of chaos burns always, and that is not what interests me as it ebbs and flows about its core. The vague outline tells little of what would proceed; however, there is that strange time within time that is always present. A few hundred years of observation is hardly a good perspective from which to judge the universe. It does have its own time machines, and like combining the many reflections of surfaces, it is possible to compose an image of history.

There is a sweet spot and many of the mechanisms at work imply that a specific course should be taken and where that leads is where friends will meet again.

Of course the dark matter dances about the universe; how could it not? Once its being is conceived and it then moves beyond our gaze, does it disappear like a tree not observed? It seems that deception and misdirection are practiced by many more than the magician, though the goal is not to entertain.

Ghost light from the dark dragon

It would seem that resonant gamma and cosmic harmony would have an interesting glow in the shadow of the dragon. Gyroscope trees have roots in time. It isn't a dimension any more than "dim" is a dimension, but the nature of time as it relates to experience is significant. It represents a condition beyond, that can be seen in the reflection from the shadow wall.

Higher activity integration (AI)

By integrating web tools into the toolchain of analysis, it is possible to quickly operate at higher degrees of utility. In this case, it is the visualization of data without any local tools whatsoever. The issue becomes: what is the question, and does the answer have gain? Something like a transistor, except that instead of controlling the gain of e-, answers are amplified at a gain, which tends always to infinity when fed back upon itself.

There is a deeper relationship hidden in the Markov chains of the web that glimmers between the corner stones of foundation.

Sintel, the Blender movie - very nice work

Who needs a brain

It is so very odd that a person can spend hours trying to understand relationships, trying to remember trigonometric identities and forms and how to represent them in LaTeX or gnuplot, and when done, it can be duplicated by a single string typed into Wolfram Alpha. I am wondering if I need a brain to do that. While doing calculus with Octave, I can perform a single command to duplicate relationships like a vector cross product: cross(A,B).
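The same one-liner from Python with numpy, mirroring Octave's cross(A,B); the vectors are arbitrary examples:

import numpy as np

A = np.array([1.0, 0.0, 0.0])
B = np.array([0.0, 1.0, 0.0])
print np.cross(A, B)   # [0. 0. 1.], the right-hand-rule normal to A and B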

It makes me wonder if I am dealing with the whole problem at a sufficiently complex level. I can use the web interface of Python to search and compute, as well as calculate and determine references, build documents, and actually do an entire analysis with a single program, and start it with a single click. What a strange world, where the smartest kid in class isn't even alive.

Gravity and the escape velocity of light

Like the gravity transfer methods that I have devised, the movement of light, and thus energy, and thus matter, from a point in a deeper gravity well to some point in another gravity well gives up the difference in energy between the two points and gains that same energy on the counter path.

Systems of space exploration are based on the concept of dealing with the transfer of momentum between objects, and the conservation of momentum requires that on the outward path light has this same relationship. The difference is in the relative gain with respect to the masses involved. Exploding firecrackers to get to the moon seems so archaic to me that it makes me uncomfortable to even discuss it. It reeks of a caveman mentality that fails to comprehend the larger relationships inherent in the system of forces.

It seems a difference between finesse and brute force implementation. The idea that it requires a trillion times the effort to achieve a goal by brute force seems to imply that the advantage in space is so disproportionate that competition is hardly a consideration.

Additional revisions and extensions are becoming apparent, and they occupy a better position yet. I have no doubt that it will be discovered sooner or later, and then they will have to make that next step to what I already know now. There is always another combinatorial extension that makes the previous one obsolete. Much like the new advances in programming and techniques, there is a vector at work, and to stand idly in the path of that vector is a good way to get run over. It is the second integral of change (∫∫E Δt) that serves as the wave front now.

Since I am considering this, it seems appropriate to consider what happens in a situation where the escape velocity is greater than the speed of light, such as inside the Schwarzschild radius. It would seem, on a simplistic view, that between any two points in that well the effect would simply be defined as |ΔE| = h·Δf, such that |E1 - E2| is the same, and thus it is not simply that energy, matter, and momentum are transformed, but that the effect is continuous and relative within the frame. The difference in E between two points does define the effect it has on matter and motion as it changes position in that situation.
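To put a number on the h·Δf picture in a weak field (my own back-of-envelope, using the standard weak-field approximation Δf/f ≈ g·Δh/c², which is my addition and not derived in this post):

g = 9.81    # m/s^2, surface gravity
c = 3.0e8   # m/s, speed of light
dh = 22.5   # m, the Pound-Rebka tower height
print "fractional frequency shift ~", g * dh / c ** 2   # about 2.5e-15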

It would seem, under cursory analysis, that there is a relationship there that could be exploited for gravity technology. Something new goes on the stack, and recursion begins again. It seems a long way to infinity; does anyone have an idea how long it takes to get there? Are we there yet?

I must point out something that seems to enter my thoughts now and again: the eigenvalues and eigenvectors of matrices. "Eigen" implies "innate" and "own". It is a very interesting distillation of the properties of relationships, and its relation to the determinant, as well as many other aspects of a set, is very complex. When first studying matrix math as an after-effect of other sciences, it seemed a simplistic thing, and perhaps I even stated that. Of course that is not true, but then everything is relative, isn't it?
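A small check of one of those relationships with numpy: the determinant equals the product of the eigenvalues (the matrix is an arbitrary example).

import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 3.0]])
vals, vecs = np.linalg.eig(M)
print "eigenvalues:", vals
print "det:", np.linalg.det(M), "product of eigenvalues:", np.prod(vals)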

BTW, I didn't even notice I was getting my LaTeX and HTML symbols mixed up. A person sees in context what they expect, and somehow I mentally made the translation from writing \delta to seeing δ without realizing it was different.

IPython structure and function

Objective: to extend and enhance the Zim wiki to use Markov relationships and AI-like tools. The IPython interface looks like a good candidate, as it groups many of the things that would be necessary into a reasonably simple framework. It is used in many scientific endeavors and seems like a tool that needs to be in my kit. It is available in Debian as "ipython", and its interactive shell is similar to python's: it revises a little and extends a lot.

Below is a snippet of Python code that demonstrates how easy it is to take relationships represented in the lectures at MIT, integrate them as procedures, and then again as combinations of procedures that represent the Markov properties of the elements in a connectivity tree that defines the AI relationships, as well as the decision matrix that applies that knowledge in measured external situations. Python plotting integrates LaTeX-like elements easily, as the title shows.


from pylab import *

x = arange(0, pi, 0.01)
y = cos(x)
title(r'$ |\hat A| \times |\hat B| \cdot \cos \theta$')
plot(x, y)
show()

As the changes in browsers are pushed forward toward HTML5 and WebGL, the integration of these new methods, as well as the extension of the AI web interface for Alice Infinity, continues. It is most certainly years behind what I attempted to achieve, but the steady advance of technology requires that new innovation be integrated as it is created, or it becomes obsolete before it is even completed. The bleeding edge is pushed ahead like a tsunami.

Pages of abstract infinities (AI)

I have been messing around with making a new kind of wiki that uses Markov models to store the relationships of information, what defines understanding of a concept, and how that understanding is developed. It started with the einstein puzzle game, which gave some hints, and draws on Markov chains, neuroanatomy, and AI systems.

In any case, I think I have the structure that pulls the information in and spits it back out like a dream that flows in concert with the user's skill. It incorporates graphics, equations, and text, as well as animation, to represent things which have no physical form, like concepts and relationships in space and time. It respects the dependency sequence and follows the knowledge in the steps needed to relay the concepts based on the subtle dependencies. It is certainly much like Markov chains. The goal is to reprogram bots as new information becomes available from many different temporal delays and to coordinate the relationships that apply in a certain application, much like the Debian archives, except that it extends to the neural implementation.
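The Markov core of it is tiny; a toy version with made-up page names, counting transitions between pages and suggesting the most likely next one:

from collections import defaultdict

counts = defaultdict(lambda: defaultdict(int))

def observe(sequence):
    # Tally each page-to-page transition in a browsing/link sequence.
    for a, b in zip(sequence, sequence[1:]):
        counts[a][b] += 1

observe(['gravity', 'light', 'Cherenkov', 'FEL'])
observe(['light', 'Cherenkov', 'reactor'])
observe(['gravity', 'light', 'spectrum'])

def next_page(page):
    # Suggest the most frequently observed follower of a page.
    follows = counts[page]
    return max(follows, key=follows.get) if follows else None

print next_page('light')   # 'Cherenkov', seen twice after 'light'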

The burden of so many complexities and their factorial interaction is getting oppressive, and I need a helper application that delves deeper into the data and relationships, to help with the time involved in the recursive solutions of the inter-relationships. The goal is to have a way for each agent element to teach every other agent element what is learned, so that the necessary dependencies can be met in a single stream at whatever best data rate can be achieved. I suppose a complete wipe-and-dupe is what has to be done more often, as untangling the spaghetti and getting certainty from that is too iffy.
