ERIC Number: ED554082
Record Type: Non-Journal
Publication Date: 2013
Pages: 93
Abstractor: As Provided
ISBN: 978-1-3031-5525-3
Ending Conflicts and Vandalism in Knowledge Collaboration of Social Media
Zhao, Haifeng
ProQuest LLC, Ph.D. Dissertation, University of California, Davis
Social media provide a multitude of opportunities for knowledge contribution and sharing. However, the issue of content reliability has drawn widespread attention, especially on platforms expected to be credible, such as Wikipedia. Despite Wikipedia's success with the open editing model, unreliable content arises for two reasons: vandalism from malicious users and conflicts over controversial topics. Vandalism is any addition, removal, or change of content in a deliberate attempt to compromise the integrity of articles. Conflicts are disagreements among contributors who repeatedly remove each other's modifications, which makes articles unstable. A huge amount of administrative effort is required for large wiki systems to produce and maintain high-quality pages; nevertheless, vandalism and bias remain common. From our perspective, current Wiki systems, as well as other social media systems, are deficient in both their editing models and their knowledge representation models. Motivated by the insufficiency of existing Wiki systems, this dissertation endeavors to solve the unreliable content problem in both free Wikis and access-controlled Wikis. Although it uses the Wiki system as its research platform, the concepts and approaches could be extended to other social media. Our study shows that social context (including background, trust relationships, interests, and interactions) plays a significant role in online collaboration and knowledge recognition. However, it is missing or improperly used in most major social media systems, such as Wikipedia. Moreover, current social media lack an understanding of readers' psychological needs. Since an absolutely Neutral Point of View (NPOV) may not be possible for some controversial topics, it is more important to attend to different readers' needs before presenting knowledge to them. 
In this dissertation, we applied social context analysis techniques, together with data mining and semantic web technologies, to address the content reliability issue. We proposed three Wiki models: SocialWiki, TrustWiki, and SmartWiki. SocialWiki focuses on solving vandalism through an improved editing model within access-controlled Wiki systems. TrustWiki and SmartWiki concentrate on the conflict problem, but their approaches can handle vandalism as well. They question the necessity of uniform content for all readers on the basis of psychological evidence. TrustWiki presents a new knowledge representation model to provide readers with personalized and credible knowledge. SmartWiki differentiates two types of readers: "value adherents," who prefer compatible viewpoints, and "truth diggers," who crave the truth. It provides two different knowledge representation models to cater to both types of readers. Experiments show that, with reliable social context information, TrustWiki can efficiently assign readers to their compatible editor communities and present credible knowledge derived from those communities. Its extended version, SmartWiki, is more versatile, providing compatible knowledge as well as revealing the truth. [The dissertation citations contained here are published with the permission of ProQuest LLC. Further reproduction is prohibited without permission. Copies of dissertations may be obtained by telephone: 1-800-521-0600. Web page:]
ProQuest LLC. 789 East Eisenhower Parkway, P.O. Box 1346, Ann Arbor, MI 48106. Tel: 800-521-0600; Web site:
Publication Type: Dissertations/Theses - Doctoral Dissertations
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A