ERIC Number: ED196792
Record Type: RIE
Publication Date: 1980-Nov
Pages: 22
Abstractor: N/A
Reference Count: 0
Evaluation Capacity-Building in Social Work: A Comparison of Professional Education and In-Service Training.
Johnson, Paul L.
This paper examines the relative impact of a professional education program compared to an inservice training program for evaluating the effectiveness and efficiency of social work. There is controversy regarding the appropriate place of professional education and inservice training in program evaluation capacity building. Advocacy for inservice training is sometimes associated with a perspective that evaluation is a job, while advocacy for formal training is more closely tied to the view of evaluation as a profession. Capacity building is operationally defined in two respects: (1) the student's or trainee's perception of increased capability or confidence in performing evaluation tasks, both managerial and technical, and (2) the student's or trainee's later involvement in agency evaluation activities. Questionnaires were sent to two groups: the first group consisted of graduates of the master's program at the Syracuse University School of Social Work, and the second group had undergone inservice evaluation training by the New York State Department of Social Services. The Syracuse graduates praised the blend of administrative and casework practice skills in their curriculum. In comparison, the workshop participants said that their evaluation training was most instructive at the on-site training sessions, but that too much knowledge was presented too fast and the material tended to go over their heads. The Syracuse graduates expressed high self-confidence in their capability of designing and implementing data collection instruments and their ability to apply statistics using computer statistical packages. In contrast, the workshop participants said their stronger confidence was in taking on the tasks of organizing an evaluation study, defining the evaluation question, interpreting the findings, and writing the evaluation report. They saw the technical areas of instrument design, sampling, and statistics as problem areas for which they would seek assistance from specialists.
Regarding their present evaluation activities, the variations in degree and kind of involvement for both graduates and workshop trainees were large. (Author/RM)
Publication Type: Reports - Evaluative; Speeches/Meeting Papers
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Note: Paper presented at the Annual Meeting of the Evaluation Research Society (Washington, DC, November 19-22, 1980).