Reputation system

A reputation system computes and publishes reputation scores for a set of objects (e.g. service providers, services, goods, or entities) within a community or domain, based on a collection of opinions that other entities hold about the objects. The opinions are typically passed as ratings to a central place where all perceptions, opinions and ratings can be accumulated. A reputation center uses a specific reputation algorithm to dynamically compute the reputation scores based on the received ratings. Reputation is a sign of trustworthiness manifested as testimony by other people.[1] New expectations and realities about the transparency, availability, and privacy of people and institutions are emerging. Reputation management – the selective exposure of personal information and activities – is an important element of how people function in networks as they establish credentials, build trust with others, and gather information to deal with problems or make decisions.[2]
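
The aggregation step can be sketched concretely. The following Python class (all names hypothetical) collects ratings at a central place and recomputes a score on demand, with a plain arithmetic mean standing in for whatever reputation algorithm a real center would use:

```python
from collections import defaultdict

class ReputationCenter:
    """Minimal sketch of a reputation center: it accumulates
    ratings and dynamically recomputes scores from them."""

    def __init__(self):
        self._ratings = defaultdict(list)  # target -> list of (rater, score)

    def submit_rating(self, rater, target, score):
        """Record one entity's opinion of a target (e.g. 1-5 stars)."""
        self._ratings[target].append((rater, score))

    def reputation(self, target):
        """Recompute the target's score from all received ratings."""
        ratings = self._ratings[target]
        if not ratings:
            return None  # no opinions collected yet
        return sum(score for _, score in ratings) / len(ratings)

center = ReputationCenter()
center.submit_rating("alice", "seller42", 5)
center.submit_rating("bob", "seller42", 4)
print(center.reputation("seller42"))  # 4.5
```

Real systems replace the mean with algorithms that weight ratings by rater trustworthiness, recency, or transaction value.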

Reputation systems are related to recommender systems and collaborative filtering, but differ in that reputation systems produce scores based on explicit ratings from the community, whereas recommender systems use some external set of entities and events (such as the purchase of books, movies, or music) to generate marketing recommendations to users. The role of reputation systems is to facilitate trust, and they often function by making reputation more visible.[3][4]

Types of reputation systems

Online reputation systems

Howard Rheingold states that online reputation systems are 'computer-based technologies that make it possible to manipulate in new and powerful ways an old and essential human trait'. Rheingold contends that these systems arose from the need of Internet users to gain trust in the individuals they transact with online. The innate human trait he notes is that a social function such as gossip 'keeps us up to date on who to trust, who other people trust, who is important, and who decides who is important'. He argues that Internet sites such as eBay and Amazon seek to serve this trait and are 'built around the contributions of millions of customers, enhanced by reputation systems that police the quality of the content and transactions exchanged through the site'.

Reputation banks

The emerging sharing economy increases the importance of trust in peer-to-peer marketplaces and services.[5] Users can build up reputation and trust in individual systems, but do not have the ability to carry them into other systems. Rachel Botsman and Roo Rogers argue in their book What's Mine is Yours (2010)[6] that 'it is only a matter of time before there is some form of network that aggregates your reputation capital across multiple forms of Collaborative Consumption'. These systems, often referred to as reputation banks, try to give users a platform to manage their reputation capital across multiple systems.
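
No standard scheme for such aggregation exists; the sketch below assumes only that each platform exposes a score, its scale, and a rating count (the platform names and weighting scheme are invented for illustration):

```python
def aggregate_reputation(profiles):
    """Combine per-platform reputation into one portable score in [0, 1].

    `profiles` maps platform name -> (score, scale_max, rating_count).
    Each score is normalized to [0, 1], then weighted by its rating
    count so that thinly rated profiles count for less."""
    total_weight = sum(count for _, _, count in profiles.values())
    if total_weight == 0:
        return 0.0
    return sum(
        (score / scale_max) * count
        for score, scale_max, count in profiles.values()
    ) / total_weight

# A hypothetical user with reputation on three platforms:
profiles = {
    "marketplace": (4.8, 5.0, 120),   # 4.8 of 5 stars over 120 ratings
    "qa_site": (2300, 5000.0, 40),    # 2300 points of a nominal 5000 cap
    "ride_sharing": (4.6, 5.0, 300),
}
print(aggregate_reputation(profiles))  # blended score, roughly 0.89
```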

Notable examples of practical applications

Reputation as a resource

High reputation capital often confers benefits upon the holder. For example, a wide range of studies have found a positive correlation between seller rating and asking price on eBay,[7] indicating that high reputation can help users obtain more money for their items. Favorable product reviews on online marketplaces can also help drive higher sales volumes.

Abstract reputation can be used as a kind of resource, to be traded away for short-term gains or built up by investing effort. For example, a company with a good reputation may sell lower-quality products for higher profit until its reputation falls, or it may sell higher-quality products to increase its reputation.[8] Some reputation systems go further, making it possible to explicitly spend reputation within the system to obtain a benefit. For example, on Stack Overflow, reputation points can be spent on question "bounties" to incentivize other users to answer the question.[9]
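
As a sketch of such a spending mechanism, the toy ledger below escrows reputation for a bounty and pays it to the answerer; the class and its methods are invented for illustration and do not reflect Stack Overflow's actual implementation:

```python
class ReputationLedger:
    """Toy ledger in which reputation is a spendable balance."""

    def __init__(self, balances):
        self.balances = dict(balances)   # user -> reputation points
        self.escrow = {}                 # question id -> (sponsor, amount)

    def open_bounty(self, sponsor, question_id, amount):
        """Spend reputation up front to fund a bounty on a question."""
        if self.balances.get(sponsor, 0) < amount:
            raise ValueError("insufficient reputation to fund bounty")
        self.balances[sponsor] -= amount
        self.escrow[question_id] = (sponsor, amount)

    def award_bounty(self, question_id, answerer):
        """Release the escrowed reputation to whoever answered."""
        _, amount = self.escrow.pop(question_id)
        self.balances[answerer] = self.balances.get(answerer, 0) + amount

ledger = ReputationLedger({"asker": 500, "answerer": 100})
ledger.open_bounty("asker", "q1", 100)
ledger.award_bounty("q1", "answerer")
print(ledger.balances)  # {'asker': 400, 'answerer': 200}
```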

Even without an explicit spending mechanism in place, reputation systems often make it easier for users to spend their reputation without harming it excessively. For example, a driver with a high ride-acceptance score (a metric often used for driver reputation) on a ride-sharing service may opt to be more selective about their clientele, decreasing their acceptance score but improving their driving experience. With the explicit feedback provided by the service, drivers can manage their selectivity to avoid being penalized too heavily.
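
A hypothetical version of such an acceptance metric, computed over a rolling window (the window size and formula below are assumptions, not any particular service's rules), shows how a driver could budget the reputation cost of declining a request:

```python
from collections import deque

class AcceptanceScore:
    """Ride-acceptance rate over the driver's last `window` requests."""

    def __init__(self, window=100):
        self.decisions = deque(maxlen=window)  # True = accepted

    def record(self, accepted):
        self.decisions.append(accepted)

    def score(self):
        if not self.decisions:
            return 1.0  # assume new drivers start unpenalized
        return sum(self.decisions) / len(self.decisions)

    def score_if_declined(self):
        """Projected score if the next request were declined, so a
        driver can see the cost of a refusal before making it."""
        hypothetical = list(self.decisions) + [False]
        recent = hypothetical[-self.decisions.maxlen:]
        return sum(recent) / len(recent)

s = AcceptanceScore(window=10)
for _ in range(9):
    s.record(True)
print(s.score())              # 1.0 -- currently perfect
print(s.score_if_declined())  # 0.9 -- one refusal costs a tenth
```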

Attacks on reputation systems

Reputation systems are in general vulnerable to attacks, and many types of attacks are possible.[10] A typical example is the so-called Sybil attack where an attacker subverts the reputation system by creating a large number of pseudonymous entities, and using them to gain a disproportionately large influence.[11] A reputation system's vulnerability to a Sybil attack depends on how cheaply Sybils can be generated, the degree to which the reputation system accepts input from entities that do not have a chain of trust linking them to a trusted entity, and whether the reputation system treats all entities identically.
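
A toy simulation of a system that weights every identity equally (names invented) illustrates how cheaply minted pseudonyms can swamp honest opinions:

```python
def naive_reputation(ratings):
    """Naive score: every rater counts equally, whoever they are."""
    return sum(ratings.values()) / len(ratings)

# Ten honest users rate a low-quality target poorly...
ratings = {f"user{i}": 1 for i in range(10)}
print(naive_reputation(ratings))  # 1.0

# ...then an attacker mints 40 pseudonymous Sybil accounts that all
# rate it 5, drowning out the honest ratings.
for i in range(40):
    ratings[f"sybil{i}"] = 5
print(naive_reputation(ratings))  # 4.2
```

Defenses correspondingly make identities costly to create or weight input by chains of trust, as noted above.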

References

  1. Slee, Tom (September 29, 2013). "Some Obvious Things About Internet Reputation Systems".
  2. Lee Rainie and Barry Wellman, Networked: The New Social Operating System. MIT Press, 2012.
  3. Resnick, P.; Zeckhauser, R.; Friedman, E.; Kuwabara, K. (2000). "Reputation Systems" (PDF). Communications of the ACM 43 (12): 45–48. doi:10.1145/355112.355122.
  4. Jøsang, A.; Ismail, R.; Boyd, C. (2007). "A Survey of Trust and Reputation Systems for Online Service Provision" (PDF). Decision Support Systems 43 (2): 618–644. doi:10.1016/j.dss.2005.05.019.
  5. Tanz, Jason (May 23, 2014). "How Airbnb and Lyft Finally Got Americans to Trust Each Other".
  6. Botsman, Rachel (2010). What's Mine is Yours. New York: Harper Business. ISBN 0061963542.
  7. Ye, Qiang (2013). "In-Depth Analysis of the Seller Reputation and Price Premium Relationship: A Comparison Between eBay US And Taobao China" (PDF). Journal of Electronic Commerce Research 14 (1).
  8. Winfree, Jason A. (2003). "Collective Reputation and Quality" (PDF). American Agricultural Economics Association Meetings.
  9. http://stackoverflow.com/help/bounty
  10. Jøsang, A.; Golbeck, J. (September 2009). "Challenges for Robust Trust and Reputation Systems" (PDF). Proceedings of the 5th International Workshop on Security and Trust Management (STM 2009). Saint Malo, France.
  11. Lazzari, Marco (March 2010). An experiment on the weakness of reputation algorithms used in professional social networks: the case of Naymz. Proceedings of the IADIS International Conference e-Society 2010. Porto, Portugal.
