Peer reviewed
ERIC Number: EJ848184
Record Type: Journal
Publication Date: 2004
Pages: 8
Abstractor: ERIC
Reference Count: 22
ISSN: ISSN-0818-8068
Shaping University Teaching towards Measurement for Accountability: Problems of the Graduate Skills Assessment Test
Chanock, Kate; Clerehan, Rosemary; Moore, Tim; Prince, Anne
Australian Universities' Review, v47 n1 p22-29 2004
Over the last decade, in both Australia and Britain, universities have been under increasing pressure to make themselves accountable for the extent to which they cultivate in their students transferable skills and "attributes" that will prepare them for the rapidly-changing world of employment. The Government's "Striving for Quality" document begins by pointing out that "Australia is actively positioning itself within an international 'knowledge-based economy', which has placed new demands on higher education." The Government has questioned the effectiveness of higher education in this regard, in view of the Nielsen survey on employer satisfaction with graduates conducted in 2000. In that survey, employers reported that graduates often had weaknesses in the areas of "creativity and flair"; oral (and, to a lesser extent, written) communication; "interpersonal skills"; and "understanding of business practice." Much consideration has been given to the question of what role universities should play in preparing students for work and, concomitantly, to ways of measuring the outcomes of their efforts. This article examines a testing instrument commissioned by the Department of Education, Science and Training (DEST) and developed by the Australian Council for Educational Research (ACER). This "Graduate Skills Assessment" (GSA) Test is still at a pilot stage, but the Government has shown some interest in it as a method of quality assurance, and appears to be leaving it open as a future policy option. It is timely, therefore, to look closely at what this test promises and how it works. On its website, ACER has made available a sample set of questions representing the types of questions that might be asked in any iteration of the GSA.
It has also posted an example of the assessment report that candidates receive, showing them their score, how it compares to "the distribution of the results for the middle 60% of students from fields of study similar to [theirs]," and what the scores at each level indicate about the candidate's skills in each category. In this article, the authors quote from the sample questions and from ACER's descriptions of what can be inferred about a candidate who scores at Level 3 (the highest level). The authors consider the extent to which particular sample questions seem able to test what they purport to test, at the level at which they claim to test it. The authors comment only on those questions that are publicly available.
National Tertiary Education Union. PO Box 1323, South Melbourne 3205, Australia. Tel: +61-3-92541910; Fax: +61-3-92541915.
Publication Type: Journal Articles; Reports - Evaluative
Education Level: Higher Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Identifiers - Location: Australia