Manage Your Algorithm
You don’t believe you have to manage your algorithm? Click on the link and watch the video.
http://idorosen.com/mirrors/robinsloan.com/epic/
A few weeks ago in the New York Times, Evgeny Morozov wrote about the perils of personalization. He meant that Internet sites increasingly keep records of your personal click preferences and then find web addresses that match those preferences. In other words, Google’s algorithm is such that when you click on something, it searches your previous clicks in order to find links consistent with them. For example, if you click on this blog (http://www.middleeastmirror.com/peace_and_conflict/) and then later search the Huffington Post blogs for information on the Middle East, links to my blog and to subjects related to it (Israel-Palestine, media, and democracy) will be statistically favored in the algorithm. Google keeps track of your click history and then tries to tailor results to that history.
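The basic mechanism can be sketched in a few lines of Python. Everything below is invented for illustration — the topic labels, scores, and the simple multiplicative boost are a toy model, not Google’s actual (and secret) algorithm:

```python
from collections import Counter

def rerank(results, click_history):
    """Re-rank search results so that topics the user has clicked on
    before are statistically favored (illustrative toy model only)."""
    topic_counts = Counter(click_history)          # past clicks per topic
    total = sum(topic_counts.values()) or 1

    def score(result):
        # Boost a result in proportion to how often its topic was clicked.
        boost = topic_counts[result["topic"]] / total
        return result["base_score"] * (1 + boost)

    return sorted(results, key=score, reverse=True)

# A user who mostly clicks Middle East stories...
history = ["middle-east", "middle-east", "sports"]
results = [
    {"url": "a.example", "topic": "sports",      "base_score": 0.9},
    {"url": "b.example", "topic": "middle-east", "base_score": 0.8},
]
# ...sees the lower-scored Middle East link pushed above the sports link.
ranked = rerank(results, history)
print([r["url"] for r in ranked])
```

Even this toy version shows the cocoon effect: the personalization boost overrides the underlying relevance score, and the user's past clicks start to determine what they see next.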
Now, at first blush this might seem pretty cool, and it is. And it is certainly sensible to assume that your click history is a good indication of your interests and preferences. Amazon has recorded my interest in clicking on books that are categorized as “spy novels.” So when I open up Amazon’s webpage, the algorithm still assumes that I’m interested in spy novels and lists some new ones. This can certainly be convenient.
But what Morozov points out is that this can result in an “information cocoon.” I will keep receiving information consistent with my click subjects, and I will be locked into a pattern of regularized information. I will exist in a sort of information enclave that will, over time, make me even more remote from my friends and from other information enclaves. We are not managing the algorithm; the algorithm is managing us.
This is not a cute oddity of modern technology. The threat of over-customizing our information world is real enough. If you are a political liberal, and an aggregator on your computer delivers a series of websites and blogs each morning, and Google delivers information after you initiate a search based on your past history of site choices, then you are slowly evolving into a narrower information world. Over time you could classify the information you receive as propaganda: it is, after all, information delivered to you by an authoritative source (the algorithm) designed to manage and control knowledge and the availability of information for desired (corporate) reasons.
Cass Sunstein has written persuasively about this phenomenon and argues that it ends in “enclave polarization.” Enclave polarization is the tendency to talk mostly to people who already agree with you, and therefore to have your positions reinforced, resulting in an even stronger and more intense sense of being “correct.” In short, liberals tend to read liberal information and matters consistent with a liberal ideology; conservatives read conservative information and matters consistent with a conservative ideology. People increasingly receive personalized information consistent with beliefs they already hold, and they never engage in heterogeneous deliberation. Sunstein cites our uncivil and contentious political culture, exemplified by raucous talk shows and polarizing talk radio, as evidence of enclave polarization.
Enclaves and the tendency to selectively expose oneself to information are well enough understood; they are a natural human tendency. But living in an information cocoon and joining enclaves of like-minded people means the loss of exposure to oppositional information. That exposure is difficult for many people, but its advantages are important. Exposing oneself to non-like-minded political points of view creates greater awareness of what other people are thinking, greater awareness of their rationales and perspectives, and increased tolerance. Diana Mutz, in her book Hearing the Other Side, reports data on and explains the value of exposure to non-like-minded political views.
But back to the problem of algorithms and big-brother-like infrastructures for the Internet: there are other issues of concern unrelated to the disadvantages of information cocoons and polarized enclaves of talk show hosts and political pundits screaming at each other about nothing. First, there is the matter of privacy. We don’t have access to the algorithm and we cannot shut it off. We cannot tell Google to change its algorithm, or at least to be more open about how it works. This issue will probably be taken up in future cyber law. Second, algorithms might contribute to creativity and new insights if they were programmed to maximize differences, forcing the user to make new connections between information. And finally, we might ask how the algorithm can encourage education. Why can’t we program the algorithm to bring us high-quality sources of information? New sources with ideas we would never have thought of!
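That idea of programming the algorithm to maximize differences can itself be sketched. The toy greedy re-ranker below prefers results from topics not yet shown to the user, trading a little relevance for novelty; the field names and scores are hypothetical, chosen only to illustrate the principle:

```python
def diversify(results, k=3):
    """Greedily pick up to k results, preferring topics that have not
    yet appeared among the picks (a toy diversity-maximizing re-rank)."""
    chosen = []
    remaining = list(results)
    while remaining and len(chosen) < k:
        seen_topics = {r["topic"] for r in chosen}
        # Prefer the highest-scoring result whose topic is new to the user;
        # fall back to plain relevance if every topic is already represented.
        novel = [r for r in remaining if r["topic"] not in seen_topics]
        pick = max(novel or remaining, key=lambda r: r["base_score"])
        chosen.append(pick)
        remaining.remove(pick)
    return chosen

results = [
    {"url": "a", "topic": "politics", "base_score": 0.95},
    {"url": "b", "topic": "politics", "base_score": 0.90},
    {"url": "c", "topic": "science",  "base_score": 0.60},
    {"url": "d", "topic": "arts",     "base_score": 0.50},
]
# Instead of three politics links, the user gets politics, science, and arts.
print([r["url"] for r in diversify(results)])
```

The same infrastructure that builds a cocoon could just as easily be pointed the other way; the choice of objective is an editorial decision, not a technical inevitability.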
When media become dominant – whether it be oral, written, print, or
electronic – they always assume new responsibilities along with certain moral
and political obligations. With the advent of print the “book” became
an object of adoration, the subject of study and analysis, a repository of
ideology and political implications, as well as a new site of legal implications.
Google’s algorithm will be no different.
Posted on July 3, 2011, in Media and politics and tagged Media.