A quiet place to think between stars

DNA-induced superconductors. ADDED: It does not seem odd to me, though perhaps it does to others, that nature is aware of superconductivity and actually incorporates it in derivative works. It is my opinion that certain aspects of biological existence depend on a biological implementation of superconductivity. It has become clear to me that a deeper understanding of life and the function of the brain or of cells requires knowledge of the function of superconductivity. Like quantum entanglement bound in the structure, superconductivity in the form in which it is biologically implemented is crucial to understanding the whole organism. If I had not understood the nature and implementation of superconductivity, I would never have been able to see its effects in all life on the planet. It actually seems odd that nobody (including myself) even considered that perhaps nature has already figured this one out. I will take that tidbit and consider what other methods may already be better implemented in a randomly evolving, selective genetic system.

In addition to a direct programmable model of protein machinery, the model of the structure of the atom also presents the possibility of genetically constructed superconductor fibers. The ultimate computing limit is the signal time between molecular elements of a computing structure. If the scale of the transistor-switch equivalent is one nanometer, then the gate time is about 10^-17 seconds, roughly a hundredth of a femtosecond. As I have discussed before, there is a way to compute ordered lists without any programming at all, though it generates great amounts of heat in the process. If the computing material generates no heat except during a computational change of condition, that advantage too is enhanced by using superconducting materials.
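As a sanity check on the gate-time figure, here is a minimal back-of-envelope sketch. The 1 nm element size comes from the text; treating the light-crossing time as the lower bound on a transition is an assumption for illustration.

```python
# Back-of-envelope bound on gate time for a ~1 nm switching element:
# a signal cannot cross the element faster than light does.
C = 2.998e8             # speed of light in vacuum, m/s
ELEMENT_SIZE = 1.0e-9   # assumed switch scale: 1 nanometer

gate_time = ELEMENT_SIZE / C
print(f"light-crossing gate time: {gate_time:.2e} s")
```

This comes out to roughly 3 x 10^-18 seconds, consistent in order of magnitude with the hundredth-of-a-femtosecond figure above.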

The degree of parallelism is also increased. At nanometer scales, the cubic packing density is on the order of 10^27 elements per cubic meter. Certainly that is not reasonable due to the heat issue. There are design considerations that would reduce computational heat, but that would be an upper bound on atom-based computing. Thus, 10^44 transitions per cubic meter per second would certainly be the upper limit. There is no reason that computation should be constrained by energy density, however. Given the distribution of interconnections, it would seem that computational core and cooling space must both be considered. So the density could expand to any level of computational effectiveness by further extension in space.
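The density figures follow from simple cubing; a short sketch, where the 1 nm element pitch and the ~10^17 switches per second per element (the inverse of the gate time from the previous paragraph) are both assumptions:

```python
ELEMENT_PITCH = 1.0e-9   # assumed 1 nm spacing between elements
SWITCH_RATE = 1.0e17     # assumed transitions/s per element
                         # (inverse of the ~10^-17 s gate time)

elements_per_m3 = (1.0 / ELEMENT_PITCH) ** 3           # ~1e27 elements / m^3
transitions_per_m3_s = elements_per_m3 * SWITCH_RATE   # ~1e44 transitions / m^3 / s

print(f"{elements_per_m3:.0e} elements per cubic meter")
print(f"{transitions_per_m3_s:.0e} transitions per cubic meter per second")
```

Multiplying the element density by the per-element switch rate is what yields the 10^44 upper limit quoted above.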

It follows that this is another point of search in existing stellar databases: determine whether any structure with a distributed heat signature is associated with a position away from a star. It would be sensible to place a computational element well away from a source of ambient heat. Perhaps when we look for intelligence, we should not be looking at the shiny trinkets that draw our attention, and should instead focus on the space between.

It does create an interesting situation where the possibilities and the scale of life are bound not by the constraints of gravity but by the volume of space, which is astounding. The known space is about 10^78 cubic meters.

It has also made me think about and understand another level of this structure, and it is only with an actual understanding of the mechanics of the universe that one can philosophize about the true nature of the universe as a whole.

There is a strange connection between some parts of this and some devices and systems that have been created. It causes me some vague uneasiness that it provides the potential for something very dangerous to human continuation. Without an understanding of consequences, great and enduring problems can be created by fiddling with that which is far beyond one's understanding. The great power of nuclear energy was one of these; it seems the power it represented was too great for its ultimate consequence to be considered. It is my opinion that a system which has been created to extend itself into that search for power has a dark and disturbing ultimate consequence.

When reach exceeds intellectual grasp, there is always the risk of grabbing onto something that burns.

At this time, I think my efforts will be directed to the creation of a genetic system that produces superconducting computational arrays. In that process I am considering the implementation of a modeling system that allows prediction of the interacting functions of protein / DNA / RNA / molecule sequences. With sufficient computing speed and power and the appropriate mathematical technique, it seems reasonable that I can model the entire DNA and proteome, and its genotype-to-phenotype expression, in such a way that it becomes computational in form.

It would seem that this transition point is one that rivals the original rudimentary understanding of electricity and gravity. It is a completely new way of creating product, and the effects will shape the future for decades to come. I am sure there were some who realized, when the first radio wave was sent and received, that vast changes would come from it in human time. Now that we live in the age of femtoseconds and gigabytes, that change happens in a matter of days and not decades.

I am less convinced now that the objects assumed to be black holes at the centers of galaxies are that at all. If I am to assume that understanding and capability progress without limit, it would be reasonable to assume that what we perceive is a mere shadow of what is really present. I find it very difficult to imagine that I live in a universe of billions upon billions of dead rocks and burning stars. It also seems reasonable that it is my own lack of understanding that keeps me from grasping what is taking place in the universe.

The escape from tribal superstition has been very rapid and is ongoing. Like a molecular gas, it would seem that not all will evaporate, even under reduced pressure. :)


Automated Intelligence

Mission of the infinite LOL cats