Can a ranking system be transparent, inclusive, and successful? That was the topic of a long conversation last week with Lijit’s senior developer Derek Greentree. We kept coming back to questions about transparency and the cost of acquiring votes. In the end we arrived at something like a rule:
The maximum success possible for a system is a function of the transparency of the algorithms and the cost of acquiring votes.
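To make the rule concrete, here’s a deliberately toy formalization. The function name, the inputs, and the weighting are entirely made up for illustration; the only claim carried over from the rule above is the direction of the relationship (open algorithms plus cheap votes invite gaming):

```python
def max_success(transparency: float, vote_cost: float) -> float:
    """Toy model of the rule. Both inputs in [0, 1]; returns [0, 1].

    A transparent system is only sustainable when votes are expensive;
    when the algorithm is public AND votes are cheap, spammers can
    game it, which caps how successful the system can get.
    """
    gaming_risk = transparency * (1.0 - vote_cost)  # open + cheap = gameable
    return 1.0 - gaming_risk

# Digg-like: fairly transparent, nearly free votes -> low ceiling (~0.19)
print(max_success(transparency=0.9, vote_cost=0.1))
# Democracy-like: transparent, costly votes -> high ceiling (~0.91)
print(max_success(transparency=0.9, vote_cost=0.9))
```

Note the design choice: the penalty is the *product* of transparency and cheapness, so a system can buy back success either by hiding its algorithm or by raising the cost of a vote, which is exactly the trade-off the chart below illustrates.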
Consider this rough chart:
| System | Transparency | Cost/Exclusivity |
|---|---|---|
| *Online* | | |
| Digg | Low | Low – pass a CAPTCHA |
| Google | Low | Lower – create a web page with links |
| SomethingAwful.com Forums | High | Med – $10 cover charge |
| *Offline* | | |
| Political Democracy | High | High – become a citizen |
| Academy Awards | High | High – become a member of the Academy |
| American Idol | High | Med – cost of a text message |
I hear you asking, “Why do Digg and Google get ‘Low’ marks for transparency?”
Digg ranks news stories by the number of members who vote for (“digg”) each candidate. It’s pretty much a pure democracy, with an added time component: old articles are worth less. On the other hand, Google ranks pages by a more complicated algorithm known as PageRank, which treats links on web pages as “votes” for other pages, with some pages’ votes worth more than others. It’s a bit like the electoral college, with an added semantic component: pages not related to the search query are worth less.
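Both models, as originally described, are simple enough to sketch in a few lines. These are rough illustrations, not Digg’s or Google’s actual code: the decay half-life, damping factor, iteration count, and example graph are all assumptions I’m making for the demo.

```python
def digg_score(votes: int, age_hours: float, half_life: float = 24.0) -> float:
    """Digg-style ranking: one vote per member, discounted as the
    story ages (half-life is an illustrative guess)."""
    return votes * 0.5 ** (age_hours / half_life)

def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iters: int = 50) -> dict[str, float]:
    """Power-iteration PageRank: a link is a vote, and votes from
    highly ranked pages are worth more."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # Everyone gets a small baseline share...
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # ...plus a share of each linking page's own rank.
                share = rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += damping * share
            else:
                # Dangling page: spread its rank evenly everywhere.
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # "c" — it gathers the most weighted votes
```

Notice the contrast: the Digg sketch is trivially transparent (count votes, decay by age), while even this stripped-down PageRank already has tunable knobs (damping, dangling-page handling) that make the real system harder to reason about from the outside.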
Do those descriptions sound about right? The thing is, neither is true these days. PageRank is now only one small ingredient of a page’s search ranking. Anyone who pays attention to their page in search listings is familiar with the “Google Dance,” when rankings can change unpredictably and sometimes unfairly. Google has become a black box. Digg’s newfound popularity has it struggling to deal with spammers, and it has also begun to shroud its algorithms in secrecy. The most recent issue of Wired has an article, “Herding the Mob,” that quotes Digg founder Kevin Rose as saying there are anti-hacking techniques he can’t talk about:
Jay and Kevin said they couldn’t explicitly detail how Digg’s ranking algorithm works because it would be used by those who want to game the system (the “aiding the enemy” defense is popular these days), but they gave enough information to understand the basics of how Digg’s version of a democracy works.
So what we see is that these two popular online ranking systems began with public algorithms, but have retreated into secrecy.
On the other hand, systems like the US election process remain part of the public record. Of course, in a democracy it costs a lot to get a vote. For one thing, you have to be born. And if you really want to cheat, you have to mess around with getting the IDs of dead people and other very messy activities. In the recent Iraq elections they took the extra measure of dipping each voter’s finger in permanent ink to prevent double voting.
Is this trend necessary? What are the underlying principles?
The trend seems to be that to thwart spammers in popular systems, transparency must go down or cost must go up. And in the online world, costs are dropping so low that transparency is being forced down as well.
The web has seen a lot of systems that begin with low costs and high transparency. That’s the very definition of openness. But as these systems experience success, they have three choices:
- Raise the costs. E.g., SomethingAwful.com added a $10 cover charge to participate in voting; Metafilter added a $5 cover charge.
- Obscure the algorithms. E.g., Digg added secret “anti-gaming” algorithms.
- Become irrelevant. E.g., Usenet forums were overrun with spammers.
The most popular choice seems to be obscuring the algorithms.
Should we be alarmed at this? Imagine if the US government took the same approach: they will tell us who won the election, but the exact algorithm used to determine the winner can’t be revealed! One can argue that getting on the front page of a Google search or the front page of Digg is not nearly as important as an election. But such positioning is only increasing in value, and the bad guys are already trying to rig these elections!
I would argue that low transparency is a form of editing. When Digg or Google says that they must keep their algorithms secret, they are in effect saying, “Our algorithms are fair, but we can’t reveal them. You can trust us.” But do we really trust them? Should we? If some quirk of Google’s algorithms somehow helps a company they have a partnership with, how motivated will they be to fix it?
Anyway, those are some beginning thoughts on the subject. Any ideas from you would be appreciated, as I feel there is a lot more to explore here.
Only global ratings suffer from it. Subjective metrics are a lot more attack resistant – http://www.sigcomm.org/sigcomm2005/paper-CheFri.pdf
Yes! The problem is when you want a global “objective” ranking system. When you want ratings with a known bias (e.g. the academy awards, your friend’s delicious tags) then these problems go away.
I guess I should do a follow up post about how Lijit does search in a subjective way, thus avoiding the google problems.
Thanks for the paper reference, Joe…I’ll read it tonight.