Psychotic algorithms

I would suspect that software created by a novice for a particular purpose would reflect the character and intentions of its author. This suggests that the meme of Skynet might not be as far-fetched and comical as one would hope. If you read code and recognize patterns, it is obvious when something has gone awry. In the drive toward a goal, without full knowledge of the effects, programs begin to act on motives of their own and do some very odd things.

Software just has a natural tendency to get out of hand. It may be that people expect the machine to employ "common sense", and the computer has none.

It would be comic if software weren't, at this very moment, doing things that I know to be less than conducive to the lives of random individuals. That is not its makers' goal (I assume), but it is the product of their less than comprehensive understanding of what happens next.

It is too complex, and that is the problem. The system is more complex than can be humanly understood, and yet it trudges onward by its own momentum, like a bull careening through the china shop of life. I wish I could give an example, but the explanation would be even more complex than the process itself and would certainly fall on deaf ears, since there is no mechanism available to tap the machine on the shoulder and say, "You're doing that wrong, and it is breaking things."

On a smaller scale, it is like a machine that has fallen into a bad mode: the software is running amok over the disk, the display, and the printer, and all you can do is hurry and pull the plug. So what happens when you build systems that have no means of being turned off and they are in the process of going nuts? It recalls early worries about computers becoming self-aware, when there was always the option of pulling the plug; but if you design a system to be redundant, decentralized, virtual, and impossible to turn off, there is nothing that can be done.
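The "plug" that can or cannot be pulled has a direct software analogue: a stop flag that the running loop agrees to check. A minimal sketch in Python (the names `stop_flag` and `worker` are my own illustration, not anything from a real system):

```python
import threading
import time

# The shared "plug" that can be pulled from outside the worker.
stop_flag = threading.Event()

def worker():
    # A well-behaved loop checks the flag on every iteration.
    # A loop with no such check has no off switch short of
    # killing the whole process -- and a system that is
    # redundant and decentralized has no single process to kill.
    while not stop_flag.is_set():
        time.sleep(0.01)  # stand-in for "running amok"

t = threading.Thread(target=worker)
t.start()
time.sleep(0.1)
stop_flag.set()   # pull the plug
t.join()
print("worker stopped:", not t.is_alive())
```

The point of the sketch is that the off switch only exists because the worker was written to look for it; nothing forces a program to cooperate with its own shutdown.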

Oh well, I am sure it will never


Automated Intelligence

Mission of the infinite LOL cats