Atomic lithography

I have been thinking about atomic lithography, and it seems possible to build an interface that does exactly what a standard model-building laser does, but with any material. The process looks much easier than I thought. The investment is virtually nil, since I already have many of the parts. I am considering making my first nano-scale computer from a Blender CAD image, repeated over and over, with a connectivity diagram from my CAD package.

I am not new to computers; I have worked designing hardware at the transistor level for 40 years. I have used toroidal magnetic-core memory, soldered transistor arrays together to make a simple computer, built designs from DIP-packaged CMOS, wire-wrapped card cages for computer control, designed CPUs with FPGAs, designed my own CPUs from scratch, and repaired design flaws in RISC and CISC chips from major manufacturers.

Developing a good parallel architecture is key to a good computer design. Cache is a major improvement to any computer, but multi-dimensional CAM (content-addressable memory) is vastly better for access to information. It is energy intensive and can produce a lot of heat, since it functions like a CPU that executes instructions on every piece of data simultaneously. A good CAM design can search a database of any size for a specific item in a single clock cycle, which means I can essentially neglect memory-access bandwidth to an external drive or memory if it is incorporated in the chip itself.
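The CAM idea above can be sketched in software. This is a minimal conceptual simulation, not hardware: a real CAM compares the key against every stored word in parallel and returns all match lines in one clock cycle, whereas this Python loop does the same comparison serially. The function name and the mask parameter are illustrative, not part of any real design here.

```python
# Conceptual sketch of a content-addressable memory (CAM) search.
# In hardware, every stored word is compared against the key at once
# and all match lines resolve in a single clock cycle; this software
# model does the same comparison serially for illustration.

def cam_search(memory, key, mask=0xFFFF):
    """Return the indices of every word whose masked bits equal the key."""
    return [i for i, word in enumerate(memory)
            if (word & mask) == (key & mask)]

memory = [0x1A2B, 0x3C4D, 0x1A2B, 0x5E6F]
print(cam_search(memory, 0x1A2B))  # -> [0, 2]
```

The mask is what makes the search "multi-dimensional": by masking off fields of the word, the same array can be queried on any combination of its fields without reorganizing the data.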

If the design is incorporated as NCAM, it becomes an analog of the human brain, and the memory itself is the CPU. Programming it is different from programming a single-threaded CPU, since each instruction acts on everything at once. I can, for instance, do a graphics operation on a 3D image to produce a projection of an object, and it would execute in a single cycle, independent of the object size or the amount of memory involved.
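The projection example can be sketched as a single map applied to an entire point set. This is plain Python standing in for the hardware: in the NCAM the one operation would complete in a single cycle no matter how many points are stored, while here the interpreter still iterates. The function name and the focal-length parameter are invented for illustration.

```python
# One "instruction" acting on every data element at once: a perspective
# projection applied to a whole 3-D point list in a single map step.
# In the NCAM this would take one cycle regardless of point count.

def project(points, f=1.0):
    """Perspective-project 3-D points onto the z=f plane: (x, y, z) -> (f*x/z, f*y/z)."""
    return [(f * x / z, f * y / z) for x, y, z in points]

cloud = [(1.0, 2.0, 2.0), (4.0, 8.0, 4.0)]
print(project(cloud))  # -> [(0.5, 1.0), (1.0, 2.0)]
```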

The primary limitation of a design like this is that it would generate enough heat to burn a hole in the floor if it is not temperature-throttled, cycle-spaced, or cooled in liquid nitrogen. In the dead of winter here I can just open the doors, turn off the heat, and let the machine run until it hits 75° inside. To compete with existing designs it would only have to run at a very slow clock, and as such would hardly generate more heat than a typical PC.

The real advantage comes when it is structured in a way that is consistent with human thought methods, and I don't have that completely figured out yet. I think the device has to be able to create a lithographed copy of changes in its connections in order to achieve a neural-like system. So it must analyze itself and determine which paths must be connected, by simulating its own circuit in the computer. It sounds a bit wicked to do, but I have used "magic" gate-array boxes to do this before, simulating CPU connectivity to alter a design for fabrication.
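The self-analysis step might look something like this toy sketch: the device holds a netlist of its own connections, simulates reachability through it, and emits the list of new connections needed before the next lithographed copy. Every name here is invented for illustration; a real implementation would simulate actual signal behavior, not just graph reachability.

```python
# Toy sketch of self-analysis: simulate the device's own netlist to
# decide which connections must be added in the next lithographed copy.
# Names and the reachability model are illustrative assumptions.

def reachable(netlist, start):
    """Nodes reachable from `start` by following directed connections."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(dst for src, dst in netlist if src == node)
    return seen

def patch_for(netlist, start, target):
    """Connections to add so `target` becomes reachable from `start`."""
    if target in reachable(netlist, start):
        return []
    return [(start, target)]

netlist = [("a", "b"), ("b", "c")]
print(patch_for(netlist, "a", "d"))  # -> [('a', 'd')]
```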

The nice thing about this is that I have no overhead cost and nobody to complain about the engineering cost, or how it will screw up the company's business model. I really don't care if IBM, Intel, and AMD go out of business; if I can build my own from scratch without damaging the environment, then I am actually doing the world a service.

This is some serious fun for me, because companies have not let me get down to the design-implementation level for years, and I miss wiring my own stuff and seeing if I can push the boundaries of possibility. It is just fun for me. The biggest problem has been that it is simply easier to have a chip CAD-designed, fabbed, inserted by pick-and-place, flow-soldered, tested with ATE, burned in, and packaged by robots.

This way I can see the immediate results of design decisions and bypass the cost involved, which is astronomical if you have ever been in the business. It is possible to drop $50K in a hundred places in a typical design and not even think about the fact that it is $5M just to plant a chip on a board. It means you never do "onesies" unless you are Harvard, MIT, or the DoD.

I doubt I will have any physical hardware in less than a couple of weeks; I have enough experience to know that even a good idea is going to hit unexpected bumps. I will probably do a few test lithos and run them through the atomic force microscope to see how precise the process is. I really think that if Newton or Benjamin Franklin or Archimedes were alive, they would be as goofy as I am at the scale of what is available. Just the idea that you can get electricity from the wall would probably be enough to make their eyes bug out.

I was planning to do the whole process as step manufacturing, but this really leapfrogs my plans. I forgot that I need to make a staged build so it can be duplicated for others, and that changes things a little. I need to make the first process a complete copy of itself only, so it can be given away to people as a starting point along with the software. It is very much like coordinating a software project where everybody has to have the source and diffs so they can patch the hardware. That will be a pain for a while.
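The source-and-diffs analogy can be made concrete: if a hardware revision is described by its connectivity list, then a "patch" is just the set difference between two revisions. This is a sketch under that assumption; the representation and function names are invented, and a real hardware diff would also carry placement and timing data.

```python
# Sketch of distributing hardware the way source + diffs distribute
# software: a patch between two revisions is the set difference of
# their connectivity lists. Representation is an illustrative assumption.

def netlist_diff(old, new):
    """Connections to add and remove to turn revision `old` into `new`."""
    old_s, new_s = set(old), set(new)
    return {"add": sorted(new_s - old_s), "remove": sorted(old_s - new_s)}

v1 = [("a", "b"), ("b", "c")]
v2 = [("a", "b"), ("b", "d")]
print(netlist_diff(v1, v2))  # -> {'add': [('b', 'd')], 'remove': [('b', 'c')]}
```

Everybody starting from the same self-copied baseline and applying the same ordered patches would end up with identical hardware, which is exactly the property a coordinated software project relies on.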


Automated Intelligence

Mission of the infinite LOL cats