50 Years of ERIC
The Education Resources Information Center (ERIC) is celebrating its 50th birthday! First opened on May 15, 1964, ERIC continues a long tradition of ongoing innovation and enhancement.

Learn more about the history of ERIC here (PDF).

Showing all 7 results
Peer reviewed
Stufflebeam, Daniel L. – Studies in Educational Evaluation, 1985
Point of entry problems are faced by evaluators when asked to start an evaluation at an inappropriate time or to perform an inappropriate study. The fundamental purpose of evaluation is to help improve services. Guidelines are presented for choosing when to do a context, input, process, or product evaluation. (GDC)
Descriptors: Elementary Secondary Education, Evaluation Methods, Evaluation Utilization, Evaluators
Peer reviewed
Scheerens, Jaap – Studies in Educational Evaluation, 1985
To better understand the influence of the organizational setting on evaluation, this conceptual framework was developed and tried out in a meta evaluation of innovatory educational programs in Holland. Four components are explained--contingency factors, organization structure, policy-making, and evaluation research--and a checklist is…
Descriptors: Evaluation Criteria, Evaluation Utilization, Meta Analysis, Meta Evaluation
Peer reviewed
Schott, Franz; And Others – Studies in Educational Evaluation, 1984
The relationship between instruction and a test is defined as a parallel content-valid relation. This article describes the PLANA procedure, which approaches the problem of content validity by applying constructional rules for producing or judging the content validity relationship between the instructional objective and the items. (BW)
Descriptors: Criterion Referenced Tests, Educational Objectives, Instructional Design, Teacher Education
Peer reviewed
Shepard, Lorrie – Studies in Educational Evaluation, 1980
The success of state assessment programs depends on how well the results are reported to their various audiences. This paper presents 11 guidelines for improving reporting practices, including: plan ahead; develop different reports for different audiences; and field test report formats, language, and content. (BW)
Descriptors: Audiences, Audiovisual Aids, Educational Assessment, Field Tests
Peer reviewed
Plomp, Tjeerd; Van Der Meer, Adri – Studies in Educational Evaluation, 1980
The explicit statement of course objectives and a retrospective analysis of student test results are used in the evaluation of a mathematics course for engineering students. (BW)
Descriptors: College Mathematics, Course Evaluation, Course Objectives, Data Analysis
Peer reviewed
Newfield, John – Studies in Educational Evaluation, 1980
When the basic unit being studied is a grouping of individuals, both students and items can be sampled. When this method is applied to the use of log sheets to measure fidelity of program implementation, the normal limitations of self-reporting are reduced. (BW)
Descriptors: Item Sampling, Measurement Techniques, Program Implementation, Questionnaires
Peer reviewed
Smith, Denise M.; Smith, Nick L. – Studies in Educational Evaluation, 1981
Evaluation audiences can understand evaluation reports when the reports are written in plain language, are well organized, and include information that gives the audience a context for understanding the findings. Suggestions for achieving this are offered. (Author/RL)
Descriptors: Evaluation, Evaluation Criteria, Research Reports, Technical Writing