Bukhari, Sarah; Hamid, Suraya; Ravana, Sri Devi; Ijab, Mohamad Taha – Information Research: An International Electronic Journal, 2018
Introduction: The understanding of the information-seeking behaviour of international students using social media is limited. Therefore, the purpose of this research is to model the information-seeking behaviour of international students when they use social media to find information. Method: A mixed method approach was employed to collect data…
Descriptors: Foreign Countries, Information Seeking, Foreign Students, Social Media

Samimi, Parnia; Ravana, Sri Devi; Webber, William; Koh, Yun Sing – Information Research: An International Electronic Journal, 2017
Introduction: Despite the popularity of crowdsourcing, the reliability of crowdsourced output has been questioned since crowdsourced workers display varied degrees of attention, ability and accuracy. It is important, therefore, to understand the factors that affect the reliability of crowdsourcing. In the context of producing relevance judgments,…
Descriptors: Reliability, Value Judgment, Competence, Individual Characteristics

Rajagopal, Prabha; Ravana, Sri Devi – Information Research: An International Electronic Journal, 2017
Introduction: The use of averaged topic-level scores can result in the loss of valuable data and can cause misinterpretation of the effectiveness of system performance. This study aims to use the scores of each document to evaluate document retrieval systems in a pairwise system evaluation. Method: The chosen evaluation metrics are document-level…
Descriptors: Information Retrieval, Documentation, Scores, Information Systems