Search results for author:"Randy Elliot Bennett"
Total records matched: 19 · Search took: 0.153 secs
Education Policy Analysis Archives Vol. 9, No. 5 (2001)
Describes the many causes of pressure to change large-scale assessment in the United States and suggests that the largest factor facilitating change will be technological, especially the use of the Internet. The Internet will help revolutionize the...
Journal of Special Education Technology Vol. 8, No. 2 (1986) pp. 44–52
The paper presents a framework for studying the use of computers in special education. The framework consists of two dimensions, one describing the service delivery areas that comprise special education and the other, general questions concerning...
Teachers College Record Vol. 116, No. 11 (2014)
Background/Context: There is little question that education is changing, seemingly quickly and in some cases dramatically. The mechanisms through which individuals learn are shifting from paper-based ones to electronic media. Simultaneously, the...
The Relationship of Expert-System Scored Constrained Free-Response Items to Multiple-Choice and Open-Ended Items
Applied Psychological Measurement Vol. 14, No. 2 (1990) pp. 151–162
The relationship of an expert-system-scored constrained free-response item type to multiple-choice and free-response items was studied using data for 614 students on the College Board's Advanced Placement Computer Science (APCS) Examination....
Journal of Educational Measurement Vol. 28, No. 1 (1991) pp. 77–92
The relationship of multiple-choice and free-response items on the College Board's Advanced Placement Computer Science Examination was studied using confirmatory factor analysis. Results with 2 samples of 1,000 high school students suggested that...
State Education Standard Vol. 3, No. 3 (June 2002) pp. 23–29
This article discusses some of the advantages of computer-based testing and highlights efforts by several states and organizations to introduce electronic assessment. It also describes the challenges policymakers face in planning and implementing...
Review of Research in Education Vol. 39, No. 1 (March 2015) pp. 370–407
On the surface, this chapter concerns the evolution of educational assessment from a paper-based technology to an electronic one. On a deeper level, that evolution is more substantive. In the first section of this chapter, those stages are briefly...
Three Response Types for Broadening the Conception of Mathematical Problem Solving in Computerized Tests
Applied Psychological Measurement Vol. 24, No. 4 (2000) pp. 294–309
Describes three open-ended response types that could broaden the conception of mathematical problem solving used in computerized admissions tests: (1) mathematical expression (ME); (2) generating examples (GE); and (3) graphical modeling (GM)....
Applied Measurement in Education Vol. 5, No. 2 (1992) pp. 151–169
New developments in the use of automatically scorable constructed response item types for large-scale assessment are reviewed for five domains: (1) mathematical reasoning; (2) algebra problem solving; (3) computer science; (4) architecture; and (5)...
Generalizability, Validity, and Examinee Perceptions of a Computer-Delivered Formulating-Hypotheses Test
Journal of Educational Measurement Vol. 32, No. 1 (1995) pp. 19–36
Examined the generalizability, validity, and examinee perceptions of a computer-delivered version of 8 formulating-hypotheses tasks administered to 192 graduate students. Results support previous research that has suggested that formulating...
Applied Measurement in Education Vol. 9, No. 2 (1996) pp. 133–50
Four human judges agreed highly among themselves about the presence of errors committed by 60 adults solving algebra word problems, but were in less agreement about categorizing faults. An expert system agreed with judges about correctness of...
Journal of Educational Measurement Vol. 34, No. 1 (1997) pp. 64–77
A computer-delivered problem-solving task based on cognitive research literature was developed and its validity for graduate admissions assessment was studied with 107 undergraduates. Use of the test, which asked examinees to sort word-problem stems...
Graphical Modeling: A New Response Type for Measuring the Qualitative Component of Mathematical Reasoning
Applied Measurement in Education Vol. 13, No. 3 (2000) pp. 303–22
Investigated the functioning of a new computer-delivered graphical modeling (GM) response type for use in a graduate admissions assessment using two GM tests differing in item features randomly spiraled among participants. Results show GM scores to...
Validity and Fairness in Technology-Based Assessment: Detecting Construct-Irrelevant Variance in an Open-Ended, Computerized Mathematics Task
Educational Assessment Vol. 8, No. 1 (2002) pp. 27–41
Evaluated whether variance due to computer-based presentation was associated with performance on a new constructed-response type, Mathematical Expression, that requires students to enter expressions. No statistical evidence of construct-irrelevant...
Journal of Technology, Learning, and Assessment Vol. 8, No. 8 (June 2010)
This paper describes a study intended to demonstrate how an emerging skill, problem solving with technology, might be measured in the National Assessment of Educational Progress (NAEP). Two computer-delivered assessment scenarios were designed, one...
Journal of Educational Measurement Vol. 35, No. 3 (1998) pp. 250–267
Examined alternative-item types and section configurations for improving the discriminant and convergent validity of the Graduate Record Examination (GRE) general test using a computer-based test given to 388 examinees who had taken the GRE...
Does It Matter if I Take My Mathematics Test on Computer? A Second Empirical Study of Mode Effects in NAEP
Journal of Technology, Learning, and Assessment Vol. 6, No. 9 (June 2008)
This article describes selected results from the Math Online (MOL) study, one of three field investigations sponsored by the National Center for Education Statistics (NCES) to explore the use of new technology in NAEP. Of particular interest in the...
Evaluating an Automatically Scorable, Open-Ended Response Type for Measuring Mathematical Reasoning in Computer-Adaptive Tests
Journal of Educational Measurement Vol. 34, No. 2 (1997) pp. 162–76
Scoring accuracy and item functioning were studied for an open-ended response type test in which correct answers can take many different surface forms. Results with 1,864 graduate school applicants showed automated scoring to approximate the...
Psychometric and Cognitive Functioning of an Under-Determined Computer-Based Response Type for Quantitative Reasoning
Randy Elliot Bennett; Mary Morley; Dennis Quardt; Donald A. Rock; Mark K. Singley; Irvin R. Katz; Adisack Nhouyvanisvong
Journal of Educational Measurement Vol. 36, No. 3 (1999) pp. 233–252
Evaluated a computer-delivered response type for measuring quantitative skill, the "Generating Examples" (GE) response type, which presents under-determined problems that can have many right answers. Results from 257 graduate students and applicants...