LHC problems

The massive amount of data that the LHC will generate may be symptomatic of a general problem that now exists worldwide. The information may well contain all the answers a person wants, but the ability to analyze it is limited by the principle that the human mind is the primary and final judge of its validity in context. There are not enough people to deal with the data, and if it continues to grow, the problem will only get worse. If no method is devised to reconcile the speed of information production with the way data is used, the search for new answers will obscure the answers themselves.

It is assumed that specialization of study in physics is the way to understand physics. Evaluating the data of something as elegant as quantum entanglement requires a certain context, and while specialists may understand the process well enough to identify a pattern in the data that defines a control point, they may not realize how that information could apply elsewhere. Simply ignoring consequences outside one's area of expertise is not a very reasonable approach.

Context can bind a person to a failed approach in physics as easily as in any other field. This is not an attempt to point a finger and say that others are wrong; any preconception of how to achieve a result binds me as well.

Contributors

Automated Intelligence
