Will humanity terminate itself?

The growth of autonomous mechanisms is a real threat to the human race as it currently exists. It seems very possible to build an autonomous system of war that models itself on the algorithms of life itself. Only the ability to be shielded from momentum provides any security against such a method. It is that higher level of existence (if you want to call it that) which can survive when war is automated. No individual can make the necessary decisions in the time frame achievable by a designed autonomous entity.

By taking the path of war, centralized control, and conflict, armed with ever more effective tools derived as extrapolations of greater and greater differential momentum, greater precision, and quicker resolution, we virtually guarantee the destruction of those who create the methods along with everyone else. IMHO.

I can generate a momentum-free space now, and I am seriously considering that as a possible course of action. It is not a given that a chaotic system will fall into one of its fatal modes, but it is certainly a real possibility, and one with a very bad weighting factor. The complete collapse of an entire organism system is about as bad as it gets when you are considering Gain:Risk ratios. Chaos is a very dangerous thing, and when planning a course it is not wise to choose a path that passes through a point of indifference, or indeterminacy, or whatever you want to call it, where the potential exit points include a fatal valley. It is pure and utter random nonsense for a person to step into a situation that is by its very nature neither predictable nor computable. It is like betting your lungs on a roulette game.
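The Gain:Risk argument above can be made concrete with a little arithmetic. The sketch below is purely illustrative (the probabilities and payoffs are invented assumptions, not figures from the text): it shows how even a small chance of landing in a "fatal valley" can swamp the expected value of a path that usually pays better.

```python
# Illustrative sketch: why a small chance of a fatal outcome can
# dominate a Gain:Risk (expected-value) calculation. All numbers
# here are assumptions chosen for demonstration only.

def expected_value(outcomes):
    """Sum of probability-weighted payoffs over (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

# A "safe" path: a modest but certain gain.
safe_path = [(1.0, 10)]

# A "chaotic" path: usually a larger gain, but one exit point is a
# fatal valley -- total collapse, modeled as an enormous loss.
chaotic_path = [(0.99, 20), (0.01, -1_000_000)]

print(expected_value(safe_path))     # 10.0
print(expected_value(chaotic_path))  # about -9980.2: the fatal mode dominates
```

Even at one percent probability, the catastrophic exit drags the chaotic path's expectation far below the safe one, which is the sense in which "complete collapse is about as bad as it gets" for this kind of ratio.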


Automated Intelligence

Mission of the infinite LOL cats