Peer reviewed
ERIC Number: EJ1020209
Record Type: Journal
Publication Date: 2013
Pages: 26
Abstractor: As Provided
ISBN: N/A
ISSN: 1547-9714
EISSN: N/A
Systematically Evaluating the Effectiveness of an Information Systems Capstone Course: Implications for Practice
Gill, T. Grandon; Ritzhaupt, Albert D.
Journal of Information Technology Education: Research, v12 p69-94 2013
The information systems capstone experience for undergraduate majors is a topic frequently discussed in academic circles within the field. Though little consensus exists on the topic, core themes emerge in the design, development, and delivery of undergraduate capstone courses. Our analysis of prior literature reveals four components often used in these courses: 1) a focus on real-world, complex information systems design, development, and integration; 2) the use of the case method to engage students in critical thinking, problem-solving, and decision-making; 3) the use of ePortfolios to help students integrate knowledge across the discipline and showcase their learning to potential employers; and 4) an emphasis on undergraduate information systems majors interacting with seasoned professionals in the field. These dimensions were carefully considered in the implementation of a capstone course at a large research-extensive university in the southeastern United States. The project was funded by a "National Science Foundation" (NSF) grant program. The paper presents a novel information systems undergraduate capstone course built around two major design themes: case pedagogy and individual projects. The course is described in terms of its student learning objectives, salient features, and philosophy. Further, the course was systematically evaluated using a multi-method approach involving five distinct data sources: the "Critical Thinking Assessment Test" (CAT), the "Student Assessment of Learning Gains" (SALG), the "Student Assessment of Instruction" (SAI), faculty review of mini-cases, and student and instructor reviews of final projects. These data were examined both descriptively and inferentially. Results indicate that students were generally satisfied with the course design, although some negative comments were made. Students did not show significant increases in critical thinking skills as measured by the CAT; however, faculty review of the pre- and post-mini-case submissions shows that students made significant gains in evaluating information, creative thinking, and learning and problem-solving. Implications for practice are provided in light of the findings.
Informing Science Institute. 131 Brookhill Court, Santa Rosa, CA 95409. Tel: 707-531-4925; Fax: 480-247-5724; e-mail: contactus@informingscience.org; Web site: http://www.informingscience.us/icarus/journals/jiteresearch
Publication Type: Reports - Evaluative; Journal Articles
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A