Complexity gaps

This is something I’ve been thinking about a lot lately, something science covers poorly because it’s cross-discipline, and cross-discipline work is frowned upon these days.

And I have no credentials, so even if my ideas are 100% right, no one will care. That leaves me free to do whatever the hell I want, just as I like it.

But what I’ve been pondering is the complexity gap, as I’ve been calling it. There’s an idea pervasive in science that more information leads to better predictive ability. This seems obviously true, right? Yes? Well, but is it? (And there’s a follow-on idea that everything can be measured and has a mathematical solution, even if only in principle.)

Just because something seems true means nothing; it actually has to be true.

But there are and always will be enormous complexity gaps in reality. Computer science majors will probably know what I am getting at right away without me having to spell it out, but that’s one of the problems: many scientists I’ve met (save a few like a friend of mine) know almost nothing about any other field!

For problems of a certain complexity, say those whose cost blows up exponentially in n, adding even a billion or a trillion times more information will not change your predictive ability at all. (Computer scientists are probably laughing now; others might be puzzled.)
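
To make that concrete, here’s a rough sketch with made-up numbers: if a question requires brute-forcing 2^n possibilities, the largest n you can reach is roughly log2 of your budget, so multiplying the budget by a trillion only buys you about 40 more.

```python
import math

# Rough sketch, made-up numbers: suppose answering a question requires
# brute-forcing 2**n possibilities. The largest n you can handle is
# about log2 of your operations budget.
def largest_solvable_n(ops_budget: float) -> float:
    """Largest n for which 2**n brute-force steps fit in the budget."""
    return math.log2(ops_budget)

today = 1e15              # pretend budget: operations you can afford now
boosted = today * 1e12    # a trillion times more information/compute

print(round(largest_solvable_n(today)))    # ~50
print(round(largest_solvable_n(boosted)))  # ~90 -- the trillion-fold boost buys only ~40 more
```

A trillion-fold increase in resources barely moves the needle on what the problem lets you reach.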

For certain phenomena, more information might actually lead to worse results.
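
One way this can happen, to pick just one mechanism, is overfitting: feed a model more and more columns of “information” that are really just noise, and its predictions on fresh data get worse. Here’s a toy sketch (the setup and numbers are purely illustrative):

```python
import numpy as np

# Toy setup: y depends on one real signal, but we keep appending irrelevant
# noise columns as extra "information". Ordinary least squares fits the
# training data better and better while predictions on fresh data typically
# get worse.
rng = np.random.default_rng(0)
n_train, n_test = 40, 1000

signal_train = rng.normal(size=n_train)
signal_test = rng.normal(size=n_test)
y_train = 2.0 * signal_train + rng.normal(size=n_train)
y_test = 2.0 * signal_test + rng.normal(size=n_test)

for n_junk in (0, 10, 30, 38):
    X_train = np.column_stack([signal_train] + [rng.normal(size=n_train) for _ in range(n_junk)])
    X_test = np.column_stack([signal_test] + [rng.normal(size=n_test) for _ in range(n_junk)])
    coefs, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)
    test_mse = np.mean((X_test @ coefs - y_test) ** 2)
    print(f"extra noise columns: {n_junk:2d}   test MSE: {test_mse:.2f}")
```

As the junk columns pile up, the error on new data typically climbs, even though every added column is technically “more information.”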

More information is better than not having it, but it doesn’t magically make the truth fall out.