Wednesday, February 29, 2012

Google PageRank Objectivity and Truth

This morning I found a TED talk that connects to the Google algorithm, PageRank.

This TED Talk shows what appears to me to be the application of that algorithm to tailor the search results presented to the person doing the search.
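For context, here is a minimal sketch of the PageRank idea as it is commonly described: power iteration over a link graph. The toy graph, damping factor, and iteration count are illustrative assumptions, not Google's actual implementation.

```python
# Minimal PageRank sketch: power iteration over a toy link graph.
# The graph, damping factor, and iteration count are illustrative assumptions,
# not Google's production system.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}           # start with a uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Toy example: three pages linking to each other.
toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
print(pagerank(toy_web))
```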

One thought that I expressed yesterday was the possible contribution of a soft social science to the world of hard science in revealing quantum relationships.  Hard science is knowledge where information is revealed, not filtered to the extent that the filtering presents an unscientific view of the physical world.

Eli Pariser is (filtered, biased?) a journalistic truth seeker.  (A paradox?) His TED talk presents evidence of Google selecting and presenting search results based on what it thinks the searcher wants to see or is looking for. 

The subjective product of a social science application of a scientifically objective research algorithm is the antithesis of science when it presents skewed information based on ideological preference.  Google determines an individual's ideological preference from assumptions inferred by analysis of the linkages in the person's search history.
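To make that point concrete, here is a hypothetical sketch of the kind of personalization being described: the same candidate results re-ranked differently for different users based on an inferred preference profile. The profile format and scoring scheme are my assumptions for illustration, not Google's actual method.

```python
# Hypothetical sketch of personalized re-ranking: the same candidate results are
# ordered differently depending on a profile inferred from a user's history.
# The profile format and scoring are illustrative assumptions only.

def personalize(results, profile):
    """results: list of (title, topic, base_score); profile: dict of topic -> affinity."""
    def score(result):
        title, topic, base_score = result
        return base_score * (1.0 + profile.get(topic, 0.0))  # boost preferred topics
    return sorted(results, key=score, reverse=True)

candidates = [
    ("Climate report", "science", 0.8),
    ("Campaign rally", "politics", 0.7),
    ("New bike review", "shopping", 0.6),
]

# Two users searching the same thing see the same candidates ranked differently.
print(personalize(candidates, {"politics": 0.9}))
print(personalize(candidates, {"science": 0.9}))
```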

If a person chooses to watch Fox News they have selected their poison of preference.  We all get to choose.  However, if we are presented with a subset from which to make an objective choice, thinking that we are looking at a truthful representation of the total population of choices available, then truth has been manipulated.

Very unscientific of you if you present only the data that your boss wants to see.  Have you ever seen that done in the scientific community?  It does not really fly well, does it?  Unless the boss gives you a promotion, of course.

An algorithm used to reveal objective truth and accuracy in science has equal value in obscuring true reality in the social sciences.

Is that really a revelation?  Or just a scientific confirmation of a known fact?

It is like school lunch.  Who determines the mix of fats and vegetables that is good for us and then presents their selection to us?  I would like to say that we ourselves make the choice from all possible choices, based on informed, intelligent decisions.

What happens when we depend on Google for our informed intelligence?

Probabilities are expressions of confidence in the world of science.  I think Eli Pariser is probably right (not filtered or biased) about the danger of filtering and manipulating information, sub-optimized on whatever anybody wants to sell us and influence us to buy: conceptual ideas, or the things they relate to, like bikes, boats, guns or candidates.  I would depend on him to give me a better big picture of truth than I would get from Google.

That's life. 

That is the difference between hard scientific life and social life.  In science, accuracy and clarity are a function of distance from the truth, to the extent that intervening variables are absolutely known or are expressed as a degree of objectively quantified confidence that they are known.

In social life, clarity and accuracy are influenced by filters that add increasing distortion (which could be quantified by a good algorithm) to what we see, starting beyond the tip of our nose or perhaps, more accurately, the second brain synapse.

Do no evil.
