Tuesday, February 28, 2006

The Ethics and Politics of Search Engines

Last night I attended an interesting panel, "The Ethics and Politics of Search Engines", sponsored by the Markkula Center for Applied Ethics and the Center for Science, Technology, and Society at Santa Clara University. There were three panelists:

Peter Norvig: Director of Machine Learning at Google
Terry Winograd: Professor of Computer Science (Human-Computer Interaction) at Stanford
Geoffrey Bowker: Executive Director of the Center for Science, Technology, and Society at Santa Clara University

The panel was moderated by Kirk Hanson, Executive Director of the Markkula Center for Applied Ethics, and a brief introduction was given by Susan Star, Senior Scholar at the Center for Science, Technology, and Society.

One of the most interesting discussions centered on objectivity and bias in search results. Google claims its algorithmic ranking is objective and impartial, which (of course, in a center for ethics and social science research) invited the concern that the bias simply lives in the algorithm itself, which is created by humans to work in a certain way, rather than in direct human manipulation. Still, the panelists praised Google for not hand-manipulating search results (in comparison with some other search engines), and while they understood the reasons for keeping the algorithms secret ((i) because advertisers try to 'game' the ranking mechanism, and (ii) for competitive reasons), they would like some third party to oversee the algorithm design to ensure 'fairness'.

Another interesting discussion concerned the legality and transparency of results. Two of the panelists showed what happens if you run the same searches on google.com and google.cn. Peter Norvig's example was searching for 'bird flu', which brought in 94 million hits on google.com and 52 million hits on google.cn. Geoffrey Bowker's example was searching for 'democracy', which brought in 145 million hits on google.com and 139 million hits on google.cn.

Peter's reason for showing this was that while, yes, things are censored in China and Google has no choice in the matter, at least it is bringing quite a good service to its users there. Google feels it is more important to provide search, even if it has to compromise it. Geoffrey's reason for his example was different: he is concerned about which 6 million results are missing, and about the role censorship is playing.

While Peter claimed that chillingeffects.org enables Google to be transparent about what is missing, Geoffrey noted that while chillingeffects.org goes some way towards providing transparency, it should in no way be mistaken for transparency.

Geoffrey pointed out that Google's default setting for users is "use moderate filtering". At the moment, if you search Google for 'breast cancer' with filtering turned off (i.e. you change the default), the top hit is 'The Breast Cancer Site', the site where users' clicks fund free mammograms for women who may not ordinarily get them - go on - go click that button now! Search for the same terms with moderate filtering enabled (i.e. the default) and that site does not feature in any of the results. Of course Google would not really intend or want that to happen; but it raises the question: who should determine what is OK to filter?

We learned at the panel (if we didn't know it already) that Google does not filter search results (unless filtering is enabled, as it is by default), but it has put ethical choices into which ads it will accept. So, in the famous Playboy interview that the Google founders gave in 2004, Sergey Brin commented: "...we don't accept ads for hard liquor, but we accept ads for wine. It's just personal preference. We don't allow gun ads, and the gun lobby got upset about that. We don't try to put our sense of ethics into the search results, but we do when it comes to the advertising."

Another concern raised about the Google search algorithm is that it rewards the popular. As Terry Winograd put it, "People who get more attention, get more attention because they got more attention." This is how Google's search algorithm works, since its essence is PageRank: pages with many incoming links rank higher, which in turn earns them yet more links and attention. What can Google do to support the less popular and circumvent some of this self-fulfilling popularity? The question of personalization of search was raised. Peter Norvig admitted that while there is much interest in this, it is currently difficult to see how it will work in the wide domain of search. The example he provided is that it works well in smaller domains: if I buy jazz at amazon.com, I'm probably likely to buy jazz again and unlikely to switch to rap, so Amazon can have a fairly high degree of confidence that if they suggest some more jazz to me, I might well be interested. Search is different in that people are typically doing searches to find out about something new, which makes it difficult to benefit effectively from personalization approaches.
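
To make Terry's point concrete, here is a rough Python sketch of the basic PageRank idea: a toy power iteration over a made-up four-page web, illustrative only and nothing like Google's production system. Rank flows along links, so the page everyone already links to keeps accumulating rank, and therefore attention, on every pass.

    # Toy PageRank sketch: illustrative only, not Google's actual algorithm.
    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}          # start with equal rank
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outlinks in links.items():
                if not outlinks:
                    # A page with no outgoing links spreads its rank evenly.
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
                else:
                    share = damping * rank[page] / len(outlinks)
                    for target in outlinks:
                        new_rank[target] += share
            rank = new_rank
        return rank

    # Hypothetical four-page web: three pages link to 'popular', nobody links to 'niche'.
    toy_web = {
        'popular': ['a', 'b'],
        'a': ['popular'],
        'b': ['popular'],
        'niche': ['popular'],
    }
    for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
        print(f"{page}: {score:.3f}")

Running this, 'popular' ends up with by far the highest rank while 'niche' is stuck at the baseline: the rich get richer, exactly the self-fulfilling dynamic the panel was worried about.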

The questions that the panelists have posed:

- What is fair?
- To whom do you pledge allegiance?
- What role should self censorship play?
- Questions of intellectual property
- Questions of storage of information
- What is 'truth'?
- What role should governments play?
- etc.

These will be debated and tested for a long time to come...


Thursday, February 16, 2006

Yahoo! opens pattern library.

Yahoo! have written about the process of creating a pattern library, and have now released it under a BSD license. I'm sure this probably has a lot to do with attracting developers and generating goodwill, but I also hope it will reduce duplication of effort and poor practices, and speed up the development process. The design patterns all have accessibility sections - which is great to see. Yahoo! aren't the first to release DHTML / AJAX widget libraries like this, though; also see:

- Google's open source AJAX libraries
- ActiveWidget
- Prototype
- Blueshoes
