Reputation system

Reputation systems are programs or algorithms that allow users to rate each other in online communities in order to build trust through reputation. Common uses of these systems can be found on e-commerce websites such as eBay, Amazon.com, and Etsy, as well as online advice communities such as Stack Exchange.[1] These reputation systems represent a significant trend in "decision support for Internet mediated service provisions".[2] With the popularity of online communities for shopping, advice, and the exchange of other important information, reputation systems are becoming vitally important to the online experience. The idea behind reputation systems is that even if the consumer can't physically try a product or service, or see the person providing information, they can still be confident in the outcome of the exchange through the trust built by recommender systems.[2]

Collaborative filtering, used most commonly in recommender systems, is related to reputation systems in that both collect ratings from members of a community.[2] The core difference between reputation systems and collaborative filtering lies in how they use user feedback. In collaborative filtering, the goal is to find similarities between users in order to recommend products to customers. The role of reputation systems, in contrast, is to gather a collective opinion in order to build trust between users of an online community.
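The difference can be made concrete with a minimal sketch over hypothetical star ratings (the data and function names below are illustrative assumptions, not drawn from any cited system): collaborative filtering compares users with one another, while a reputation system aggregates the community's feedback about a single entity.

```python
from math import sqrt

# Hypothetical 1-5 star ratings given by users to sellers.
ratings = {
    "alice": {"seller_a": 5, "seller_b": 2},
    "bob":   {"seller_a": 4, "seller_b": 1, "seller_c": 5},
    "carol": {"seller_b": 5, "seller_c": 4},
}

def cosine_similarity(u, v):
    """Collaborative filtering: how alike are two users' tastes?"""
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    dot = sum(ratings[u][i] * ratings[v][i] for i in common)
    norm_u = sqrt(sum(ratings[u][i] ** 2 for i in common))
    norm_v = sqrt(sum(ratings[v][i] ** 2 for i in common))
    return dot / (norm_u * norm_v)

def reputation(seller):
    """Reputation system: one collective score per seller."""
    scores = [r[seller] for r in ratings.values() if seller in r]
    return sum(scores) / len(scores)

print(cosine_similarity("alice", "bob"))  # similarity between two raters
print(reputation("seller_a"))             # community-wide opinion of one seller
```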

Types

Online

Howard Rheingold states that online reputation systems are "computer-based technologies that make it possible to manipulate in new and powerful ways an old and essential human trait".[3] Rheingold says that these systems arose as a result of the need for Internet users to gain trust in the individuals they transact with online. The trait he notes in human groups is that social functions such as gossip "keeps us up to date on who to trust, who other people trust, who is important, and who decides who is important". Internet sites such as eBay and Amazon, he argues, seek to make use of this social trait and are "built around the contributions of millions of customers, enhanced by reputation systems that police the quality of the content and transactions exchanged through the site".

Reputation banks

The emerging sharing economy increases the importance of trust in peer-to-peer marketplaces and services.[4] Users can build up reputation and trust in individual systems but usually don't have the ability to carry those reputations to other systems. Rachel Botsman and Roo Rogers argue in their book What's Mine Is Yours (2010)[5] that "it is only a matter of time before there is some form of network that aggregates reputation capital across multiple forms of Collaborative Consumption". These systems, often referred to as reputation banks, try to give users a platform to manage their reputation capital across multiple systems.

Maintaining effective reputation systems

The main function of reputation systems is to build a sense of trust among users of online communities. As with brick-and-mortar stores, trust and reputation can be built through customer feedback. Paul Resnick, writing for the Association for Computing Machinery, describes three properties that are necessary for reputation systems to operate effectively.[2]

  1. Entities must have a long lifetime and create accurate expectations of future interactions.
  2. They must capture and distribute feedback about prior interactions.
  3. They must use feedback to guide trust.

These three properties are critically important in building reliable reputations, and all revolve around one important element: user feedback. User feedback in reputation systems, whether it be in the form of comments, ratings, or recommendations, is a valuable piece of information. Without user feedback, reputation systems cannot sustain an environment of trust.
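As an illustration only, the three properties can be mapped onto a very small data structure; the class name and scoring rule below are assumptions for exposition, not taken from any cited system.

```python
from collections import defaultdict

class ReputationLedger:
    """Toy reputation store illustrating the three properties:
    long-lived identities, feedback capture, and feedback-guided trust."""

    def __init__(self):
        # Property 1: entities are keyed by a persistent identifier.
        self.feedback = defaultdict(list)

    def record(self, entity_id, rating, comment=""):
        # Property 2: capture and retain feedback about past interactions.
        self.feedback[entity_id].append((rating, comment))

    def trust_score(self, entity_id):
        # Property 3: use accumulated feedback to guide future trust.
        ratings = [r for r, _ in self.feedback[entity_id]]
        if not ratings:
            return None  # no history yet, so no expectation can be formed
        return sum(ratings) / len(ratings)

ledger = ReputationLedger()
ledger.record("seller_42", 5, "fast shipping")
ledger.record("seller_42", 2, "item not as described")
print(ledger.trust_score("seller_42"))  # 3.5
```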

Eliciting user feedback presents three related problems.

  1. The first is users' willingness to provide feedback when doing so is not required. If an online community has a large stream of interactions but gathers no feedback, an environment of trust and reputation cannot form.
  2. The second is obtaining negative feedback from users. Many factors contribute to users' reluctance to give negative feedback, the most prominent being fear of retaliation when feedback is not anonymous.
  3. The final problem is eliciting honest feedback. Although there is no concrete method for ensuring the truthfulness of feedback, if a community of honest feedback is established, new users will be more likely to give honest feedback as well.

Other pitfalls of effective reputation systems described by A. Jøsang et al. include changes of identity and discrimination. These, too, come back to regulating user actions in order to obtain accurate and consistent user feedback. When analyzing different types of reputation systems, it is important to examine these specific features in order to determine the effectiveness of each system.

Standardization attempt

The IETF proposed a protocol to exchange reputation data.[6] It was originally aimed at email applications, but it was subsequently developed as a general architecture for a reputation-based service, followed by an email-specific part.[7] However, the workhorse of email reputation remains DNSxLs, which do not follow that protocol.[8] Those specifications don't say how to collect feedback; in fact, the granularity of email-sending entities makes it impractical to collect feedback directly from recipients. They are concerned only with reputation query/response methods.
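For comparison, a DNSxL query is an ordinary DNS lookup: per RFC 5782, the client reverses the octets of an IPv4 address, appends the list's zone name, and interprets an A record in 127.0.0.0/8 as "listed". A minimal sketch using Python's standard library, with a placeholder zone name (dnsbl.example.org is not a real list):

```python
import socket

def dnsbl_listed(ip: str, zone: str = "dnsbl.example.org") -> bool:
    """Check an IPv4 address against a DNS blocklist (DNSxL).

    Per RFC 5782, the address 192.0.2.99 is looked up as
    99.2.0.192.<zone>; an A record in 127.0.0.0/8 means "listed",
    while NXDOMAIN means "not listed".
    """
    query = ".".join(reversed(ip.split("."))) + "." + zone
    try:
        answer = socket.gethostbyname(query)
    except socket.gaierror:
        return False  # NXDOMAIN or lookup failure: treat as not listed
    return answer.startswith("127.")

# RFC 5782 test addresses: a conforming DNSxL lists 127.0.0.2
# and never lists 127.0.0.1.
print(dnsbl_listed("127.0.0.2"))
print(dnsbl_listed("127.0.0.1"))
```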

Notable examples of practical applications

  • Search web: see PageRank
  • eCommerce: eBay, Epinions, Bizrate, Trustpilot
  • Social news: Reddit, Digg, Imgur
  • Programming communities: Advogato, freelance marketplaces, Stack Overflow
  • Wikis: increase contribution quantity and quality[9]
  • Internet security: TrustedSource
  • Question-and-answer sites: Quora, Yahoo! Answers, Gutefrage.net, Stack Exchange
  • Email: DNSBL and DNSWL provide global reputation about email senders
  • Personal reputation: CouchSurfing (for travelers)
  • Non-governmental organizations (NGOs): GreatNonProfits.org, GlobalGiving
  • Professional reputation of translators and translation outsourcers: BlueBoard at ProZ.com
  • All-purpose reputation systems: Yelp, Inc.
  • Academia: general bibliometric measures, e.g. the h-index of a researcher

Reputation as a resource

High reputation capital often confers benefits upon the holder. For example, a wide range of studies have found a positive correlation between seller rating and selling price on eBay,[10] indicating that a high reputation can help users obtain more money for their items. Positive product reviews on online marketplaces can also help drive higher sales volumes.

Abstract reputation can be used as a kind of resource, to be traded away for short-term gains or built up by investing effort. For example, a company with a good reputation may sell lower-quality products for higher profit until its reputation falls, or it may sell higher-quality products to increase its reputation.[11] Some reputation systems go further, making it explicitly possible to spend reputation within the system to derive a benefit. For example, on the Stack Overflow community, reputation points can be spent on question "bounties" to incentivize other users to answer the question.[12]

Even without an explicit spending mechanism in place, reputation systems often make it easier for users to spend their reputation without harming it excessively. For example, a ridesharing company driver with a high ride acceptance score (a metric often used for driver reputation) may opt to be more selective about his or her clientele, decreasing the driver's acceptance score but improving his or her driving experience. With the explicit feedback provided by the service, drivers can carefully manage their selectivity to avoid being penalized too heavily.
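As an arithmetic illustration of that trade-off (the figures and the 80% penalty threshold below are hypothetical, not taken from any particular ridesharing service):

```python
# Hypothetical week of ride requests for one driver.
offered = 100
declined_short_trips = 15  # rides the driver chooses to skip

acceptance_score = (offered - declined_short_trips) / offered
print(f"acceptance score: {acceptance_score:.0%}")  # 85%

# A driver can be selective only as long as the score stays above
# whatever threshold the platform penalizes (assumed here to be 80%).
penalty_threshold = 0.80
print("penalized" if acceptance_score < penalty_threshold else "within limits")
```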

Attacks and defense

Reputation systems are in general vulnerable to attacks, and many types of attacks are possible.[13] Because a reputation system tries to generate an accurate assessment in the face of various factors, including but not limited to an unpredictable number of users and potentially adversarial environments, attack and defense mechanisms play an important role in reputation systems.[14]

Attacks on reputation systems are classified by identifying which system components and design choices are their targets, while defense mechanisms are drawn from existing reputation systems.

Attacker model

The capability of the attacker is determined by several characteristics, e.g., the location of the attacker relative to the system (insider attacker vs. outsider attacker). An insider is an entity that has legitimate access to the system and can participate according to the system specifications, while an outsider is any unauthorized entity in the system that may or may not be identifiable.

Because outsider attacks closely resemble attacks in other computer-system environments, insider attacks receive more attention in the study of reputation systems. Some common assumptions are usually made: attackers are motivated by either selfish or malicious intent, and they can work either alone or in coalitions.

Attack classification

Attacks against reputation systems are classified based on the goals and methods of the attacker.

  • Self-promoting Attack. The attacker falsely increases their own reputation. A typical example is the so-called Sybil attack, in which an attacker subverts the reputation system by creating a large number of pseudonymous entities and using them to gain a disproportionately large influence.[15] A reputation system's vulnerability to a Sybil attack depends on how cheaply Sybils can be generated, the degree to which the reputation system accepts input from entities that do not have a chain of trust linking them to a trusted entity, and whether the reputation system treats all entities identically (a numerical sketch of this effect follows the list).
  • Whitewashing Attack. The attacker exploits a system vulnerability to refresh their reputation. This attack usually targets the reputation system's formulation, which is used to calculate the reputation result. The whitewashing attack can be combined with other types of attacks to make each one more effective.
  • Slandering Attack. The attacker reports false data to lower the reputation of the victim nodes. This can be achieved by a single attacker or by a coalition of attackers.
  • Orchestrated Attack. The attacker coordinates their efforts and employs several of the above strategies. One well-known example of an orchestrated attack is the oscillation attack.[16]
  • Denial of Service Attack. The attacker prevents the calculation and dissemination of reputation values in the reputation system by using denial-of-service methods.
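The effect of a Sybil attack on a naive reputation formula that weights every identity equally can be shown with a small numerical sketch (all figures are hypothetical):

```python
# Honest community: 20 users rate a mediocre seller 2 out of 5.
honest_ratings = [2] * 20

def average_reputation(ratings):
    # A naive system that trusts every identity equally.
    return sum(ratings) / len(ratings)

print(average_reputation(honest_ratings))  # 2.0

# Sybil attack: the seller cheaply creates 80 fake accounts that
# each leave a 5-star rating for the seller's own account.
sybil_ratings = [5] * 80
print(average_reputation(honest_ratings + sybil_ratings))  # 4.4

# Defenses make identities costly to create or weight them by a chain
# of trust to known-good entities, so extra pseudonyms add little weight.
```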

Defense strategies

The following strategies help prevent the above attacks.[17] A sketch of one such defense appears after the list.

  • Preventing Multiple Identities
  • Mitigating Generation of False Rumors
  • Mitigating Spreading of False Rumors
  • Preventing Short-Term Abuse of the System
  • Mitigating Denial of Service Attacks
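One common way to mitigate the generation and spread of false rumors is to weight each rating by the rater's own standing rather than counting every identity equally. The following minimal sketch shows the idea; the weights and data are illustrative assumptions, not drawn from any cited system.

```python
# Each rating is (score, rater_weight). Established, trusted raters
# carry full weight; new or low-reputation identities carry little.
ratings = [
    (2, 1.0), (2, 1.0), (3, 1.0),      # long-standing community members
    (5, 0.05), (5, 0.05), (5, 0.05),   # freshly created accounts
]

def weighted_reputation(ratings):
    # Weighted average: low-weight raters barely move the result.
    total_weight = sum(w for _, w in ratings)
    return sum(score * w for score, w in ratings) / total_weight

print(round(weighted_reputation(ratings), 2))  # about 2.46, near the honest consensus of ~2.3
```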

See also

  • Collaborative filtering
  • Collective influence algorithm
  • Commons-based peer production
  • Defaulted executee
  • Government by algorithm
  • Honor system
  • Influence-for-hire
  • Karma
  • Online participation
  • Online presence management
  • Reputation capital
  • Reputation management
  • Sesame Credit
  • Sharing economy
  • Social Credit System
  • Social currency
  • Social media optimization
  • Social profiling
  • Social reputation in fiction
  • Social translucence
  • Subjective logic
  • Trust metric
  • Web of trust
  • Whuffie

References

  1. ^ "What is reputation? How do I earn (and lose) it? - Help Center". Stack Overflow. Retrieved 2022-11-15.
  2. ^ Josang, Audun (2000). "A survey of trust and reputation systems for online service provision". Decision Support Systems. 45 (2): 618–644. CiteSeerX 10.1.1.687.1838. doi:10.1016/j.dss.2005.05.019. S2CID 209552.
  3. ^ Books in Print Supplement. R. R. Bowker Company. 2002. ISBN 978-0-8352-4564-7.
  4. ^ Tanz, Jason (May 23, 2014). "How Airbnb and Lyft Finally Got Americans to Trust Each Other". Wired.
  5. ^ Botsman, Rachel (2010). What's Mine is Yours. New York: Harper Business. ISBN 978-0061963544.
  6. ^ Nathaniel Borenstein; Murray S. Kucherawy (November 2013). An Architecture for Reputation Reporting. IETF. doi:10.17487/RFC7070. RFC 7070. Retrieved 20 April 2017.
  7. ^ Nathaniel Borenstein; Murray S. Kucherawy (November 2013). A Reputation Response Set for Email Identifiers. IETF. doi:10.17487/RFC7073. RFC 7073. Retrieved 20 April 2017.
  8. ^ John Levine (February 2010). DNS Blacklists and Whitelists. IETF. doi:10.17487/RFC5782. RFC 5782. Retrieved 20 April 2017.
  9. ^ Dencheva, S.; Prause, C. R.; Prinz, W. (September 2011). "Dynamic self-moderation in a corporate wiki to improve participation and contribution quality" (PDF). Proceedings of the 12th European Conference on Computer Supported Cooperative Work (ECSCW 2011). Aarhus, Denmark. Archived from the original (PDF) on 2014-11-29.
  10. ^ Ye, Qiang (2013). "In-Depth Analysis of the Seller Reputation and Price Premium Relationship: A Comparison Between eBay US and Taobao China" (PDF). Journal of Electronic Commerce Research. 14 (1). Archived from the original (PDF) on 2017-08-08. Retrieved 2015-04-30.
  11. ^ Winfree, Jason A. (2003). "Collective Reputation and Quality" (PDF). American Agricultural Economics Association Meetings.
  12. ^ "What is a bounty? How can I start one? - Help Center". stackoverflow.com.
  13. ^ Jøsang, A.; Golbeck, J. (September 2009). Challenges for Robust Trust and Reputation Systems (PDF). Proceedings of the 5th International Workshop on Security and Trust Management (STM 2009). Saint Malo, France.
  14. ^ Hoffman, K.; Zage, D.; Nita-Rotaru, C. (2009). "A survey of attack and defense techniques for reputation systems" (PDF). ACM Computing Surveys. 42 (1): 1–31. CiteSeerX 10.1.1.172.8253. doi:10.1145/1592451.1592452. S2CID 2294541. Archived from the original (PDF) on 2017-04-07. Retrieved 2016-12-05.
  15. ^ Lazzari, Marco (March 2010). "An experiment on the weakness of reputation algorithms used in professional social networks: the case of Naymz". Proceedings of the IADIS International Conference e-Society 2010. Porto, Portugal. Archived from the original on 2016-03-07. Retrieved 2014-08-28.
  16. ^ Srivatsa, M.; Xiong, L.; Liu, L. (2005). "TrustGuard: countering vulnerabilities in reputation management for decentralized overlay networks" (PDF). Proceedings of the 14th International Conference on World Wide Web. doi:10.1145/1060745.1060808. S2CID 1612033. Archived from the original (PDF) on 2017-10-18.
  17. ^ Hoffman, Kevin; Zage, David; Nita-Rotaru, Cristina (2009-12-14). "A survey of attack and defense techniques for reputation systems". ACM Computing Surveys. 42 (1): 1:1–1:31. CiteSeerX 10.1.1.172.8253. doi:10.1145/1592451.1592452. ISSN 0360-0300. S2CID 2294541.
  • Dellarocas, C. (2003). "The Digitization of Word-of-Mouth: Promise and Challenges of Online Reputation Mechanisms" (PDF). Management Science. 49 (10): 1407–1424. doi:10.1287/mnsc.49.10.1407.17308. hdl:1721.1/1851.
  • Vavilis, S.; Petković, M.; Zannone, N. (2014). "A reference model for reputation systems" (PDF). Decision Support Systems. 61: 147–154. doi:10.1016/j.dss.2014.02.002. Archived from the original (PDF) on 2017-07-13. Retrieved 2014-06-03.
  • D. Quercia, S. Hailes, L. Capra. Lightweight Distributed Trust Propagation. ICDM 2007.
  • R. Guha, R. Kumar, P. Raghavan, A. Tomkins. Propagation of Trust and Distrust WWW2004.
  • A. Cheng, E. Friedman. Sybilproof reputation mechanisms. SIGCOMM workshop on Economics of peer-to-peer systems, 2005.
  • Hamed Alhoori, Omar Alvarez, Richard Furuta, Miguel Muñiz, Eduardo Urbina: Supporting the Creation of Scholarly Bibliographies by Communities through Online Reputation Based Social Collaboration. ECDL 2009: 180-191
  • Sybil Attacks Against Mobile Users: Friends and Foes to the Rescue, by Daniele Quercia and Stephen Hailes. IEEE INFOCOM 2010.
  • J. R. Douceur. The Sybil Attack. IPTPS02, 2002.
  • Hoffman, K.; Zage, D.; Nita-Rotaru, C. (2009). "A survey of attack and defense techniques for reputation systems". ACM Computing Surveys. 42 (1): 1. CiteSeerX 10.1.1.172.8253. doi:10.1145/1592451.1592452. S2CID 2294541.
  • Rheingold, Howard (2002). Smart Mobs: The Next Social Revolution. Perseus, Cambridge, Massachusetts.
  • Cattalibys, K. (2010). "I could be someone else - social networks, pseudonyms and sockpuppets". Schizoaffective Disorders. 49 (3).
  • Zhang, Jie; Cohen, Robin (2006). Trusting Advice from Other Buyers in E-Marketplaces: The Problem of Unfair Ratings (PDF). Proceedings of the Eighth International Conference on Electronic Commerce (ICEC). New Brunswick, Canada.

External links

  • Reputation Systems - 2008 tutorial by Yury Lifshits
  • Contracts in Cyberspace - 2008 essay (book chapter) by David D. Friedman.
