Citation
State educational testing practices

Material Information

Title:
State educational testing practices
Added title page title:
OTA background papers
Creator:
United States. Congress. Office of Technology Assessment.
United States. Congress. Office of Technology Assessment. Science, Education, and Transportation Program
Publisher:
U.S. Congress. Office of Technology Assessment
Publication Date:
Language:
English
Physical Description:
279 p. : ill. ; 28 cm.

Subjects

Subjects / Keywords:
Educational tests and measurements -- United States ( LCSH )
Genre:
federal government publication ( marcgt )

Notes

General Note:
This report discusses state educational testing practices. Testing as an indicator of educational attainment is a characteristic of the American educational system. While there are many questions surrounding the use of tests for various purposes, when American public policy turns periodically to focus on public education, tests tend to increase.

Record Information

Source Institution:
University of North Texas
Holding Location:
University of North Texas
Rights Management:
This item is a work of the U.S. federal government and not subject to copyright pursuant to 17 U.S.C. §105.
Classification:
Y 3.T 22/2:12/2003012553 ( sudocs )

Aggregation Information

IUF:
University of Florida
OTA:
Office of Technology Assessment

Full Text

PAGE 1

State Educational Testing Practices December 1987 NTIS order #PB88-155056

PAGE 2

STATE EDUCATIONAL TESTING PRACTICES

John Andelin, Assistant Director
Science, Information, and Natural Resources Division

Nancy Carson Naismith, Program Manager
Science, Education, and Transportation Program

Contractors

Northwest Regional Educational Laboratory

Susan M. Bennett and Dale C. Carlson, California
Keith L. Cruse, Texas
Thomas E. Fisher, Florida
Steven Koffler, New Jersey
Winsor A. Lott, New York
Wayne Martin, Colorado
Wayne Neuburger, Oregon
Edward D. Roeber, Michigan

PAGE 3

Table of Contents

INTRODUCTION ................................................. 1
ANALYSIS OF OTA SURVEY OF STATE TESTING ...................... 3
State Assessment Programs .................................... 5
  Table I: Authorization and Purposes of State Assessment Program ... 5
  Table II: Program Characteristics ......................... 14
  Table III: Uses of State Assessment Data .................. 23
  Table IV: Variables Used to Aid Interpretation of Data .... 42
  Table V: Test Construction ................................ 54
  Table VI: Reporting Test Scores ........................... 59
  Table VII: Effects of Program ............................. 64
  Table VIII: Functions of Technical Staff .................. 70
  Table IX: Staffing and Expenditures for Program, 1984-85 .. 75
  Table X: Testing Time Required (Minutes per Student) ...... 82
  Table XI: Changes in State Assessment Program ............. 87
Minimum Competency Testing Programs ......................... 99
  Table I: Characteristics of Programs ..................... 102
  Table II: Testing Programs ............................... 116
  Table III: Reporting Practices of Testing Programs ....... 130
  Table IV: Examples of Changes in State and Local Educational Programs and Practices Resulting from State Minimum Competency Programs ... 136
  Table V: Functions of Technical Staff and Failure Rates .. 141
  Table VI: Testing Time Required (Minutes per Student) .... 147
  Table VII: Changes in Minimum Competency Program ......... 153
TESTING SNAPSHOTS OF EIGHT STATES .......................... 161
California ................................................. 163
Colorado ................................................... 164
Florida .................................................... 192
Michigan ................................................... 202
New Jersey ................................................. 217
New York ................................................... 230
Oregon ..................................................... 237
Texas ...................................................... 267

PAGE 4

INTRODUCTION

Testing as an indicator of educational attainment is a characteristic of the American educational system. While there are many questions surrounding the use of tests for various purposes, when American public policy turns periodically to focus on public education, tests tend to increase. We are currently in such a period.

To give an indication of the present level of activity, OTA has compiled information that offers two approaches to understanding the current climate for testing. First, OTA supported a survey of the states to identify the extent of two types of testing now in wide use: testing for assessment purposes and tests to determine minimum competency. The survey data was compiled by the Northwest Regional Educational Laboratory in 1985. Second, eight states were selected, and people active in testing were asked to describe, in their own words, the forces behind increased testing, and some of the results of those forces. Thus, this document offers two ways to observe trends.

A large number of states have incorporated minimum competency testing into their requirements, either for passage into a higher grade or for graduation from high school. The object of this testing is to establish certain standards of learning that should be mastered by all students and to ensure that objective criteria are used to measure basic achievement. A related effect is to influence curriculum through specifying certain material that by definition must be covered.

Testing for assessment, a less familiar term, has come into use as a method for understanding comparative achievement by groups of students, and by schools or school districts. Assessment testing is considered to be more insightful and to give more useful information to educators than comparison based simply on traditional achievement tests.

PAGE 5

As in any study of American education, aggregate data cover a wide variety of different circumstances. Most decisions on testing are still made at the level of the state or the school district. Increasingly, however, decisions are shifting to the state level. This trend is consonant with an increased belief by state legislatures and citizens that a broad responsibility for producing well-educated citizens requires state-level action. The trend is often coupled with increasing interest in competitiveness and a related belief that a state cannot do well in attracting employment without a strong educational base. Many of the state "vignettes" reveal this philosophy. Examination of the state vignettes, the explanatory notes on testing data, and the raw data will provide a snapshot of a certain type of testing in wide use in the mid-1980s. As with any survey data, exact numbers or figures, particularly dollar amounts, are difficult to compare across states. The tables should be read as general indicators of trends.

2

PAGE 6

ANALYSIS OF OTA SURVEY OF STATE TESTING

Introduction

The Office of Technology Assessment (OTA) commissioned a survey of state-mandated standardized testing programs in each of the 50 states and the District of Columbia. The purpose of the survey was to update information secured in earlier studies conducted by the Education Commission of the States and the Center for the Study of Evaluation at UCLA. The following is a list of the tables used to report the data received:

State Assessment
Table I     Authorization and Purposes of State Assessment Program
Table II    Program Characteristics
Table III   Uses of State Assessment Data
Table IV    Variables Used to Aid Interpretation of Data
Table V     Test Construction
Table VI    Reporting Test Scores
Table VII   Effects of Program
Table VIII  Functions of Technical Staff
Table IX    Staffing and Expenditures for Program, 1984-85
Table X     Testing Time Required (Minutes per Student)
Table XI    Changes in State Assessment Program

A telephone survey of the 50 state education agencies (SEAs) and the District of Columbia was conducted in June and July of 1985 under the supervision of Dr. Gary D. Estes, Director of the Assessment and Evaluation Program of the Northwest Regional Educational Laboratory. The difficulty of securing reliable and precise data by telephone on subjects as complex as these is apparent, but every effort was made to secure and report information that did not exceed the limitations of the method.

PAGE 7

Minimum Competency
Table I     Characteristics of Programs
Table II    Testing Programs
Table III   Reporting Practices of Testing Programs
Table IV    Examples of Changes in State and Local Educational Programs and Practices Resulting From State Minimum Competency Programs
Table V     Functions of Technical Staff and Failure Rates
Table VI    Testing Time Required (Minutes per Student)
Table VII   Changes in Minimum Competency Program

PAGE 8

STATE ASSESSMENT PROGRAMS

Table I
Authorization and Purposes of State Assessment Program

As of 1985, state legislatures had authorized state assessment programs in 19 states. In three of these states, state education agency authorization preceded the legislative mandate. The state education agency was the sole authorizing agency in three additional states. Three more states reported authorization without specifying whether it was legislative, state education agency, or some other source. In at least four states the state board of education was named as the authorizing body.

The movement to introduce or to improve state assessment programs has gained momentum recently. Between 1983 and 1985, six states (Alaska, Colorado, Indiana, Iowa, Idaho, and South Dakota) authorized new programs, and 19 additional states introduced major changes in existing programs. As of summer 1985, 13 states reported they had no state assessment program.

Not only do the authorizing bodies differ among states, but the stated purposes for which assessment programs were established differ from state to state and reflect little common content across states. The Connecticut program, for example, was authorized by the state board of education as a vehicle by which it could carry out its legislative responsibility for determining the efficiency and efficacy of education programs. This program, first implemented in 1971, was changed in 1985 using a testing program designed to reflect mastery of a uniform curriculum.

In most states, laws providing for the establishment of state assessment programs specify the type of students to be tested and the areas to be measured but often do not define the state's purposes for implementing the programs. Some do specify the purposes of state assessment. Indiana states its program is in place to identify students needing remediation so the state can allocate funds to assist schools having such students. Kentucky's program is designed to provide diagnostic and analytical information for use

5

PAGE 9

in improving curricula at local levels. Maryland collects normative data at the school, district, and state levels to insure accountability. Minnesota uses state assessment data in local district planning and evaluating, and in state education agency planning, evaluating, and reporting to the state legislature. Mississippi reports it uses state assessment data for decision making in education generally. South Carolina says the state assessment program provides data that school advisory councils use in developing improvement plans. Louisiana's program provides states, districts, and schools with data useful in the diagnosis of educational needs of individual students. South Dakota states the purpose of its program is to fulfill the need for information indicating the educational status of the state.

PAGE 10 - PAGE 16

[State Assessment Table I, Authorization and Purposes of State Assessment Program: a state-by-state table with columns for the authorizing body (legislature, state education agency, state board of education, or other), year authorized, year implemented, year of latest major changes, the wording of the authorizing law or rules, and comments. In this digitization the table's columns are interleaved and garbled beyond reliable reconstruction; only this description is retained. Source: data compiled for the Office of Technology Assessment by Northwest Regional Educational Laboratory, 1985.]

PAGE 17

Table II
Program Characteristics

Tabulation of the grade levels at which subjects are tested in the various states reveals little uniformity of practice. The subjects of reading, math, and language arts are most generally tested. Grade levels most often tested are 3 or 4, 8, and 11. Arizona tests students every year from first grade through twelfth; Kentucky tests K-12. Thirty-four states reported having an assessment program test in reading. Of these states, all but Wyoming, which requires a writing test, also have a math test. Twenty-four states include language arts in their testing programs. Writing is tested in 16 states. Somewhat less than half as many states administer science, social studies, and writing tests as administer reading, math, and language usage or language arts tests. A few states include subjects such as citizenship, critical thinking, personal or life skills, business and career education, art and music, reference skills, computer literacy, environment, energy, and health as part of the state assessment program.

A few states have multiple subject-area tests across several grade levels. Alabama, for example, tests reading in grades 1, 2, 4, 5, 7, 8, and 10; math at levels 2, 4, 5, 8, and 10; language arts at 1, 2, 4, 5, 7, 8, and 10; science at 2, 5, 8, and 10; and social studies in grades 2, 5, 8, and 10.

Sources of testing instruments used in the state assessment programs were the state education agency in 13 cases, the state education agency through a contractor in 8 cases, and a publisher's standardized test in 19 cases.

The majority of states administer tests to all students in the grade levels to be tested in a particular year rather than using sampling procedures. In most cases, testing of particular grade levels year after year is followed. However, in a few cases the tests are administered to different students in different subjects from year to year so that the impact of the program is spread over several years.

14
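The rotation just described is, in effect, a simple form of matrix sampling. The survey does not specify any state's actual assignment procedure, so the following Python sketch is purely illustrative (the student names and subjects are invented): it deals subjects across random student subsamples so that state-level estimates cover every subject while each student takes only one test.

    import random

    random.seed(1)
    students = [f"student_{i:02d}" for i in range(12)]
    subjects = ["reading", "math", "language arts"]

    # Shuffle once, then deal students to subjects round-robin, so each
    # subject is taken by a random subsample of roughly equal size.
    random.shuffle(students)
    assignment = {subject: [] for subject in subjects}
    for i, student in enumerate(students):
        assignment[subjects[i % len(subjects)]].append(student)

    for subject, group in sorted(assignment.items()):
        print(f"{subject}: {len(group)} students {group}")

In a real program the subsamples would be drawn to be representative (for example, stratified by district), but the basic trade is the same: broader subject coverage for the state in exchange for less testing time per student.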

PAGE 18 - PAGE 25

[State Assessment Table II, Program Characteristics: a state-by-state table listing subjects tested, grade levels, instrument source (SEA, SEA through contractor, or a published standardized test such as the SAT, CAT, CTBS, ITBS, or SRA), use of writing samples and their scoring method (holistic, analytical, primary trait), the approximate number of students tested in 1984-85, and notes. In this digitization the table's columns are interleaved and garbled beyond reliable reconstruction; only this description is retained. Source: data compiled for the Office of Technology Assessment by Northwest Regional Educational Laboratory, 1985.]

PAGE 26

Table III
Uses of State Assessment Data

Most of the 38 states that have assessment testing programs report multiple uses of them. The number of states reporting various uses of state assessment data is as follows, in order of frequency of use: public accountability (34), curriculum improvement at the state level (33), monitoring student achievement trends (30), making comparisons with national norms (28), informing educational policy (27), making comparisons among districts within the state (17), making comparisons among regions in the state (13), incentives and sanctions (8), and rating of schools (2), with another rating contemplated for the near future (Georgia).

There is little evidence that state assessment data is being used for purposes of giving or denying funds to school districts on the basis of student performance, but there are selective uses of this type in a few states. For example, California has established an educational improvement fund based on improvement of 12th grade scores over the previous year. Connecticut is phasing in a mastery testing program which will be used to identify schools needing additional money based on mastery level statistics. Michigan, which dropped a program in 1974 that withheld funds from districts not showing improvement in state assessment results, now bases funding for compensatory education on these results. South Carolina's 1984 law identifies districts where the quality of education is seriously impaired, and it is anticipated that sanctions may be used where such instances are found. These sanctions may not be monetary. Washington provides remedial assistance for percentages of students scoring in the lowest quartile in grade 4. Since 1980, Virginia has provided a system for allocating funds based on state assessment data. Florida employs a system of compensatory education funding for remedial education programs based on state assessment data.

23
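The report gives no formula for improvement-based funding rules like California's, so the following minimal Python sketch is hypothetical: the eligibility rule, school names, and scores are all invented, and it simply flags schools whose 12th-grade mean score rose over the previous year.

    # Invented illustration of an improvement-based eligibility rule.
    scores = {
        "School A": (251.0, 256.5),  # (last year's mean, this year's mean)
        "School B": (263.2, 262.8),
        "School C": (240.9, 249.1),
    }
    for school, (previous, current) in scores.items():
        gain = current - previous
        status = "eligible" if gain > 0 else "not eligible"
        print(f"{school}: change {gain:+.1f} points -> {status}")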

PAGE 27

In Alabama and New York, the legislature and the State Board of Regents, respectively, work with the state education agencies to see that deficiencies in the school systems, as revealed by state assessment data, are addressed using resources other than financial.

District-level curriculum improvement was the most frequently mentioned local use of state assessment information. Comparison of results among schools was also mentioned several times.

California and Pennsylvania have developed sophisticated systems of data analysis and reporting. California groups schools according to socioeconomic status (SES), aid to families with dependent children (AFDC), and English proficiency measures in an effort to make more justifiable the comparisons of performance among schools. A more complete accounting of the variables used by the different states in aiding interpretation of test results is found in the discussion of Table IV.

24
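The report does not publish California's banding formula. Purely as a hedged sketch of the general idea (the school names, SES index, band cut-points, and scores below are all invented), one might group schools by a background index and rank each school only against its own band:

    # Invented illustration of comparison banding: schools are grouped by
    # an SES index and ranked only within their own band.
    schools = [
        # (name, ses_index, mean_test_score)
        ("Adams", 0.21, 61), ("Baker", 0.35, 58), ("Cates", 0.42, 70),
        ("Dow", 0.58, 74), ("Eaton", 0.66, 81), ("Fulton", 0.71, 77),
        ("Grant", 0.83, 88), ("Hayes", 0.90, 85),
    ]

    def band(ses):  # three invented SES bands
        return "low" if ses < 0.4 else "middle" if ses < 0.7 else "high"

    groups = {}
    for name, ses, score in schools:
        groups.setdefault(band(ses), []).append((score, name))

    for label, members in groups.items():
        members.sort(reverse=True)
        print(f"{label} SES band:")
        for rank, (score, name) in enumerate(members, start=1):
            print(f"  {rank}. {name} (score {score})")

The design point is that a school serving a low-SES population is compared with its demographic peers rather than with the whole state, which is the rationale the narrative attributes to the California practice.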

PAGE 28 - PAGE 44

[State Assessment Table III, Uses of State Assessment Data: a state-by-state table. The scanned pages for this table are illegible in this digitization; only stray characters survive.]

PAGE 45

Table IV
Variables Used to Aid Interpretation of Data

Efforts to compare the performance of students, classes, schools, and school districts on tests lead naturally to questions regarding the validity of such comparisons. A number of states now collect student demographic data and school/district variable data in order to assist users of state assessment data in making more valid comparisons and judgments.

Student variable data now collected by states include the following, in order of the frequency of states collecting the data: sex (20), race/ethnicity (17), amount of homework (10), family income (9), type of handicap (8), television viewing time (7), number of parents (6), and validity of student performance as judged by the teacher (4). Other student variables reported include parental education, occupation of head of household, community type, access to libraries, number of times residence changed, number of siblings, order of birth, home reading materials, ESL/bilingual information, student/teacher/principal attitudes toward the testing program, textbooks used and teacher load (both relating to a specific subject), repeater status, migrancy, and a smattering of pupil/teacher attitudinal variables.

School/district variables, in order of the frequency mentioned by states, include: district and school size (17), Title I or socioeconomic status data (14), and urban/suburban/rural classifications (4). Other school/district variables mentioned include per capita income; per pupil costs; class size; pupil:teacher ratio; Chapter 1, remedial, compensatory, and bilingual status; dropout rate; attendance rate; pupil mobility data; participation in gifted child programs; and eligibility for free and reduced lunches.

42
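One common way such background variables aid interpretation is to compare each school's actual score with the score expected given its demographics. The report does not describe any state's computation, so the least-squares sketch below is a hypothetical illustration with invented data; a state system would use more variables and a more careful model.

    # Invented illustration: regress school mean scores on an SES measure
    # and report each school's residual (actual minus expected score).
    import numpy as np

    ses = np.array([0.20, 0.30, 0.45, 0.55, 0.70, 0.85])
    scores = np.array([58, 62, 66, 71, 78, 84])

    slope, intercept = np.polyfit(ses, scores, 1)  # one-variable fit
    expected = intercept + slope * ses
    for i, (e, a) in enumerate(zip(expected, scores), start=1):
        print(f"school {i}: expected {e:5.1f}, actual {a}, residual {a - e:+.1f}")

A positive residual suggests a school scoring above what its demographic profile would predict; interpreting residuals this way is exactly why the validity questions raised at the start of this discussion matter.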

PAGE 46 - PAGE 56

[State Assessment Table IV, Variables Used to Aid Interpretation of Data: a state-by-state table. The scanned pages for this table are illegible in this digitization; only stray characters survive.]

PAGE 57

Table V
Test Construction

The majority of states with assessment programs have employed formal procedures to avoid bias in test items for both race and sex. More than half of the states surveyed reported using pretested and statistically analyzed items.

Fourteen states reported tests that use item calibration related to item response theory (IRT). This is a significant development of the past several years that indicates growing acceptance of the value of IRT in test construction. Some of these states used IRT calibration on only part of the tests used. The movement toward IRT and the introduction of matrix sampling in a few states seemed to be the chief changes in test construction technology occurring in state programs.

Very little change was reported in norming practices, except for some movement toward criterion-referenced testing (CRT) measurement in the 1970s and a return to norm-referenced testing (NRT) or a combination of both CRT and NRT in the 1980s. Pennsylvania reported a move from district to school norming information.

Few changes in reporting practices were noted except for references to "more sophisticated" forms of reporting. This probably refers to the increased use of variables, as discussed under Table IV, for both students and schools in the reporting and interpretation of test results, and the continuing trend away from reporting grade level equivalents.

54
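The report does not detail the calibration methods behind these IRT-based programs. As a hedged illustration of what item calibration involves, the Python sketch below fits the simplest IRT model, the one-parameter Rasch model, to simulated right/wrong response data by alternating Newton updates for item difficulties and student abilities. The data, iteration count, and clipping bounds are all invented for the example; production calibrations use more robust estimation.

    import numpy as np

    rng = np.random.default_rng(0)
    n_students, n_items = 500, 20

    # Simulate responses from a Rasch model: P(correct) = sigmoid(theta - b).
    true_theta = rng.normal(0.0, 1.0, n_students)   # student abilities
    true_b = np.linspace(-2.0, 2.0, n_items)        # item difficulties
    p_true = 1.0 / (1.0 + np.exp(-(true_theta[:, None] - true_b[None, :])))
    responses = (rng.random((n_students, n_items)) < p_true).astype(float)

    # Joint maximum-likelihood calibration by alternating Newton steps.
    theta = np.zeros(n_students)
    b = np.zeros(n_items)
    for _ in range(25):
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        info = p * (1.0 - p)
        # Update item difficulties with abilities held fixed.
        b += (p - responses).sum(axis=0) / info.sum(axis=0)
        b -= b.mean()                               # anchor the scale
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        info = p * (1.0 - p)
        # Update abilities with difficulties held fixed; clip so students
        # with perfect or zero scores do not diverge.
        theta = np.clip(theta - (p - responses).sum(axis=1) / info.sum(axis=1),
                        -4.0, 4.0)

    print("true difficulties:     ", np.round(true_b, 2))
    print("estimated difficulties:", np.round(b, 2))

Once items are calibrated onto a common difficulty scale, different test forms can be equated and scores reported as IRT scale scores, which is the connection to the scale-score reporting noted under Table VI.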

PAGE 58 - PAGE 61

[State Assessment Table V, Test Construction: a state-by-state table with columns for formal procedures to avoid racial and sex bias, whether items were pretested and statistically analyzed, whether items were calibrated using IRT, and significant changes since the program began in construction, norming, and reporting. In this digitization the table's columns are interleaved and garbled beyond reliable reconstruction; only this description is retained. Source: data compiled for the Office of Technology Assessment by Northwest Regional Educational Laboratory, 1985.]

PAGE 62

Table VI
Reporting Test Scores

The methods for reporting assessment testing data vary widely from state to state. Assessment test scores are summarized for the entire state in 32 states, for individual schools in 31, by districts in 32, and by individual classes in 21 states. Individual student or group reports are prepared for state education agency curriculum personnel in 32 states, the media and public through a state education agency report in 32, principals and superintendents in 34, state boards of education in 33, students and teachers in 29, legislatures in 31, and the general public in 31 states. In addition, sample questions from the assessment instruments are made available to those requesting them in 20 states. Hawaii reported that this practice took place initially. Alabama reported that it made items available only to teachers and educators.

The formats for the score reporting also vary considerably from state to state. Some states report raw scores (21), some percentiles (23), standard scores (21), grade level equivalents (6), and IRT scale scores (4). Stanines and percent-correct data were reported by 5 states and NCE data by 7. In several cases, states indicated that they use different types of score reporting for different tests and/or more than one type for the same test. The diversity in methods of test score reporting in individual states is one of the things that makes across-state comparisons difficult even when the same tests are used.

59
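For reference, several of the score types named above are fixed transformations of one another. The sketch below is a hypothetical illustration (the raw scores are invented) of converting raw scores to percentile ranks, then to normal curve equivalents (NCE = 50 + 21.06z, where z is the normal deviate of the percentile) and to stanines (nine bands cut at percentiles 4, 11, 23, 40, 60, 77, 89, and 96).

    from statistics import NormalDist
    import bisect

    def percentile_rank(score, all_scores):
        """Percent of scores below this one, counting half of the ties."""
        below = sum(s < score for s in all_scores)
        ties = sum(s == score for s in all_scores)
        return 100.0 * (below + 0.5 * ties) / len(all_scores)

    def nce(pr):
        """Normal curve equivalent: an equal-interval rescaling of percentiles."""
        z = NormalDist().inv_cdf(min(max(pr, 1.0), 99.0) / 100.0)
        return 50.0 + 21.06 * z

    def stanine(pr):
        """Stanine (1-9) from a percentile rank."""
        cuts = [4, 11, 23, 40, 60, 77, 89, 96]
        return bisect.bisect_right(cuts, pr) + 1

    raw_scores = [12, 15, 15, 18, 20, 22, 22, 22, 25, 30]  # invented data
    for raw in sorted(set(raw_scores)):
        pr = percentile_rank(raw, raw_scores)
        print(f"raw {raw:>2}: percentile {pr:5.1f}  NCE {nce(pr):5.1f}  "
              f"stanine {stanine(pr)}")

Even with such standard transformations available, the diversity problem remains: two states reporting the "same" percentile may be norming against different reference populations.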

PAGE 63 - PAGE 66

[State Assessment Table VI, Reporting Test Scores: a state-by-state table showing how scores are summarized (state, district, school, class), who receives individual and group-summary results, and the types of scores reported (raw scores, percentiles, standard scores, grade equivalents, stanines, NCEs, IRT scale scores, pass/fail). In this digitization the table's columns are interleaved and garbled beyond reliable reconstruction; only this description is retained. Source: data compiled for the Office of Technology Assessment by Northwest Regional Educational Laboratory, 1985.]

PAGE 67

Table VII
Effects of Program

The changes reported in state educational policy that resulted from state assessment may be summarized as follows:

1. A move away from testing a sample of students to the testing of all students in the grade levels and subjects tested.

2. A trend toward identifying and providing assistance to school systems showing specific educational needs.

3. A move toward mandatory as opposed to optional or voluntary testing.

4. A tendency to expand the areas and grade levels covered by the state assessment tests.

5. The linking of state assessment programs to state school improvement programs.

Examples of changes in local programs and practices revealed that the state assessment program was affecting local curricula by bringing them into line with the objectives of the state assessment tests, by identifying skills needed to teach to state assessment objectives, by causing reexamination of certification requirements for teachers in areas tested, and by bringing increased attention to the teaching of writing.

64

PAGE 68

In general, state education agency personnel interviewed did not appear well informed regarding the effects of state assessment programs on local programs and practices. Pennsylvania's practice of having the state education agency survey and report on local uses of state assessment data is a noteworthy effort to enlighten state personnel and others on local uses of test results.

The development of state curricula was attributed to the state assessment program by a number of state personnel. A number of state curriculum guides have been changed to reflect inclusion of skills tested in the state assessment programs.

65

PAGE 69 - PAGE 72

[State Assessment Table VII, Effects of Program: a state-by-state table with columns for changes in state education policy, examples of changes in local programs and practices, and changes in state required curriculum. Individual cells are partially legible in this digitization, but the row and column alignment is lost, so the table is not reproduced here. Source: data compiled for the Office of Technology Assessment by Northwest Regional Educational Laboratory, 1985.]

PAGE 73

Table VIII
Functions of Technical Staff

Thirteen states reported they employ their own technical staffs who conduct and upgrade the assessment programs they use. The state assessment technical staff offers assistance to local school districts in interpreting scores in 32 states, and assistance in administering tests in 27 states. Most states also provide services to such individuals as local education agency administrators (30), principals (26), and teachers (22).

State Assessment
Table VIII: Functions of Technical Staff (state-by-state table)

Columns: whether technical staff are employed to upgrade tests; local assistance given in administering tests and in interpreting scores/using results; groups receiving assistance (teachers, principals, LEA administrators, and in some states counselors and test coordinators).

Legible notes include: Idaho, staff looks at technical specifications but does not upgrade tests; Illinois, regional workshops throughout the state; Missouri, assistance given initially, then decreased; Florida, combined with M.C.

SOURCE: Data compiled for the Office of Technology Assessment by Northwest Regional Education Laboratory, 1985.

Table IX
Staffing and Expenditures for Program, 1984-85

Extreme caution is advised in interpreting the information in this table. For many reasons it is not reasonable to compare costs among states: the differences in the size of programs, the numbers of students served, the number of areas tested, and the size of the population of the state itself. In some instances staffing costs could not be accurately reflected in the budget due to the complexity of the programs or departmental structure. In a few cases it appears that assessment total budget figures also include costs of the minimum competency program. Also, some states do their own scoring and did not count this cost; others have booklets already produced and in the schools and did not report these costs. And, finally, some districts reported unusually large budgets this year because they are involved in developmental work.

Perhaps the most useful statistic in the table is the one relating to the budgeted amount per pupil for the state assessment program. Since it is arrived at by dividing the total budgeted amount by the total number of students tested, it provides a basis for interpreting the state per pupil investment. Even here, factors not named above might also contribute to the wide differences in reported costs: 1) state use of its own tests, in which case the cost of development may not be reflected in the current budget; 2) administration of whole batteries of tests to the same students as compared with matrix sampling or rotation of subjects and grade levels from year to year; 3) size of the state, in which case the maintenance of the staff and program may be somewhat more costly than in states with larger numbers of students; 4) the use of outside contractors, when the entire testing process is simply reported in the contract costs, excluding state personnel costs; and 5) perhaps most important, the character and scope of the program itself. For example, programs with large writing components obviously have higher scoring costs.
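The per-pupil figure is a simple quotient of two reported values. The short Python sketch below illustrates the computation; the state names and dollar figures in it are hypothetical placeholders, not survey data.

    # Budgeted amount per pupil = total assessment budget / students tested.
    # All figures below are hypothetical, for illustration only.
    programs = {
        "State A": {"budget": 770_000, "students_tested": 385_000},
        "State B": {"budget": 190_000, "students_tested": 100_000},
    }

    for state, p in programs.items():
        per_pupil = p["budget"] / p["students_tested"]
        print(f"{state}: ${per_pupil:.2f} per pupil tested")

The same caveats listed above apply to any such quotient: a low per-pupil figure may simply mean that development costs were paid in an earlier year or that scoring was done in-house and never budgeted.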


Staffing of assessment offices is also variable, and is generally, but not always, related to the size and scope of the program offered. Size of staff varies considerably among states having comparable budgets. For example, Kentucky, with a budget of $1.5 million, has a staff of 1.5, whereas Michigan, with a budget of $1.25 million, has a staff of six. Another contrast is Mississippi, which administers a $200,000 budget with one staff member, and Missouri, which has six staff members administering a budget of $124,000. It would be difficult to evaluate the meaning of these differences without detailed information on the history and current status of these programs and the reasons money is budgeted as it is.

Wide differences in expenditures for scoring, purchasing, and developing tests were also encountered. This is to be expected in view of the fact that many states score their own tests and do not have this expenditure broken out. Apparently, accounting for the cost of development of tests in the states is difficult, for very few states were able to provide these costs unless they were in a development year, with a specific budget for this. New York and Michigan were the only states providing them for the 1984-85 school year.

In general, expenditures for state assessment have not changed radically over the past 4 years, or in the most recent 2 years. There are exceptions to this. For example, California has increased 250 percent in the past 4 years and 175 percent in the past 2 years, and Hawaii has increased 300 percent over the past 4 years. Minnesota showed an increase of 500 percent over a 7-year period. Washington increased its expenditures 100 percent over the past 2 years, while Oklahoma had an increase of 90 percent in that same period. Other states reported modest increases or budgets that remained the same or declined somewhat over these periods.


State Alabama A L as ka Arizona Arkansas CoLorado NO state proqram Conneticut Master y Proqram: Deleware Total S.A. budget, 1984-851 $770 ,000 S50 -60K S795 ,46 5 (Excludinq personnel $190,000 (Includes scoring; cost is mostly scoring since test booklets ALREADY In schools S1OO,OOO 1.4 MILLION over 3 years startING $140,522 (stdd) $36,000 (writinq) Total S A. staff B 1 2 4 n q early 11 1.5 1 1984 2 State Assessment Table IX Staffinq and Expenditures for Proqram, 1984-85 Total SEA curriculum staff 45 Separate but work closely) 3 0 35 50-65 for comparabe group) 2 2 N thus breakdown Total students tested 1984-853 385,000 15.000 461,000 100,000 1,1OO Million 7,500 40,000 60,000 (stdd) 7,500 (wr.) n of mete may Budgeted per pupil S2.00 S3.67 usinq 55K 14A $1.90 S2.73 NA NA $2.34 (stdd) $4.80 (wr.) inex ~ 1984-B Scorinq 385,000 S5,000 440,00 0 stdd) 9,500 wr.) Note column. 560,000 NA NA 71,900 stdd) Do not have figures dying or scor or S A Expenditures for: Purchasing/ Developing cost $385.000 N S274,000 (stdd) $500.00 (wr.) information in first N NA NA NA ndf4Cproqraamay b. Approximate Change in Expanditures for .980-81 to 1984-85 Increase 50% decrease. 18. 5 50% increase increase 10% year. Ne w funded separate L bAandMCproqrinmsyb.comMned, ~<. ,AA and MC proqraa may b~inod or one and the same, thus fiqure may reflect a combined SA and MC staff. S.A. 1982-83 t o 1984-85 90% increase. 50% decresae. 31. 6 Stayed same. 1759 increase Added 5th grade. Includes cash for CAP proctors.) Increase 10. New. s&me. Students tested, not nudxr of tests administered. 77


. ..-. ., 1 State Assessment Table IX Staffing and Expenditures for Proqram, 1984-85 State Florida Georgia Hawii Idaho Illnois Indiana Iowa No state proqram Kansas Kentucky Louisiana Maine Maryland Total S.A budget, 1984-85 $300,00 0 Combined $720,000 Including personnel) ,200,00 0 $21,000 200,000 229,900 $240,000 $830,000 Local system all costs. Total S A staff 11 wit h 3.5 2 .5 5 2 1 1.5 7 6 to pay 12 n all pograms. in this program.) 1 SA and MC proqram may be combined, t Total SE curriculum staff Not part discussion M.C. c 31 N 8 NA NR 2 15 45 17 35 D breakdc Total student tested. 1984-85 3 39,000 NRT 45,000 CRT comments next to 320,000 88,000 11,917 7,500 80,500 150,000 710,000 120,000 48,000 175,000 of costs may Budgeted per pupi l $2.00 ate. $1.80 $2.27 S1.76 $26.67 $3.69 NA $2.11 $2.00 10.40 N be inexact 1984Scoring $150,00( $l.50/ student N Note in 54,000 NA NA $500K NR Contract develop new test scoring high. /J 5 A Expenditures for: Purchasing/ Developinq cost $150,00 0 S250,000 $200,000 in first column. NA NR NA 1 Million NR includes test lt and scoring for Writing test costs are signifia Approximate xpenditure s for 980-81 1984-85 Same (Doing less.) % 3009 increase -7 Same o state thus figure may reflect a Students tested, 1982-83 1984Same with same ch e, s s c s crea: in 198 % reasc a 5( past r Its. aae.


State Massachusetts state program Michigan Minnesota M ississppi Missouri Montana No state program Nebraska No state program Nevada No State program New Hampshire No state program New Jersey No state program New Mexico 1 ,5A and MC pr c Total S.A. budqet, 1984-85 1 1.25 Mil. $265,000 NA total S A. 2 staff 6 7 1 6 7 am may be combined, . State Assessment Table IX staffing and Expenditures for Program, 1984-85 Total SEA curriculum staff 7 0 0 6 37 thus breakdownt Total students tested 1984-85 3 330,000 270,000 140,000 17,000 55,000 n of costs ma Budgeted per 1 pupil 3.79 1.10 Local assmt.. .98 [State assmt. cost is less.) 1.43 7.29 NA be inex 1984-8 Scoring $300K $ .98 per pupil 75/p Available $1.58 per NA t or SA Expenditures for: Purchasing/ Developing cost $150,000 N booklets. for grades 3 $ 4. Local costs. and MC program may be Approximate Change in Expenditures for 1980-81 to 1984-85 20 % 500 % over 7 ar s ye Deacrease Gone to M.C.T. N NA and the ~SA and MC proqram may be combined or one and the same, thus figure may reflect a crmbinad SA and MC staff. S.A 1982-8 3 1984-85 Increase Big increse in 985. Decrease Anticipate Increase 1985 NA same. Students tested, not numner of tests administered.


. . ,. . -s-. State New York North Carolina North Dakota No state program Oklhoma-N o state program oregon Pennsylvaina Rhode Island South Carolina total S.A budget, 1984-85 $21O ,000 $1 .1 Mil $100, 000 $550 $600 ,000 $45,000 S420K (1.2 Mil budget, combined SA&MC) Total S A staff 10 test develor s 4 prof. editors; 4 admis's spread over several programs. 1; prorated portio n 16 others for this testing program. 2 9 Also includes l.c. 1 14 Includes C. staff units in one. State Assesnent Table IX Staffinq and Expenditures for Program, 1984-85 Total SEA curriculum staff NA NA 8 NA o NA Total student: tested 1984-853 Info. available from LEAs only 475,00 0 25,000 150,00 0 428,000 (M.C. 578,000 Total 1,300 300,000 (M.C.) .75,000 (SA) SA and MC program may be combined, t 18 br.akdo~ Of mmtm may Budgeted per pupil NA NA $4.00 $3.04 $34.62 $2.18 1984-I Scoring Local cost 80 of total budget. $65K NA $1,200 $OOK or SA Expenditures for: Purchasing/ Developing cost $21O,OOO NA N NA $10,000 Admin. $20,000 $60K in 84/85 because of addition of 5th grade. and MC program may be Approximate Change in Expenditures for 1980-61 t 1984-85 Approx. 7 Same as inflation increase, decreased in price over year until added science writing. 25% Stayed the Same Same one and the ~SA and UC proqram may be combined or one and the came. thus figure may reflect a combined SA and MC staff. Students tested, not number of tests administered. A 1982-83 to 1984 Approx. 7 Same as inflation (increase) note comment in previous column. same same. Expected Increase 300 in ,985. Same with basic skills no part of program. Same. 8 0


. . State South Dakota Tennessee Not available for interview Texas No state program Utah Vermont No state program West Vigirnia Wisconsi n available for interview Wyoming .5A and MC proc Total S.A. budget, 1984-651 S70,000 S1OO,OOO $1,600,00( 3150,000 NR $ 1OOK Total S A. staff 2 1 1 6 1.5 1 0 may be combined, t Stat. Assessment Table IX Staffing and Expenditures for Program, 1984-85 Total SEA Curriculum staff 9 4 0 40 NA They play no role in assmt. 3 Total students tested_ 1984-85 3 21,00 0 7,500 200,000 110,000 115,000 8,0000 is breakdown of costs may Budgeted per Pupil $3.33 $3.08 NR $1. 36 NR $12.50 exac t 1984-[ Scoring NR 15,250 95,000 $100,000 NR 18K or SA Expenditures for: Purchasing/ Developing cost NR $10,000 (Special purchase in 1984-85.) N N $71K to ETS MC program may be Approximate Change in Expenditures for .980-81 to 1984-85 $70 K 15 1 5 Increase NR Increase 5-1 0 NR NA e and the ~SA and MC proqram may be combLned or nne and the game, thue fiqure may reflect a combined SA and MC staff. students tested, not n{utier t}f tests ~~lni~tered. A. ,982-83 to 1984-85 $70 K 5 Increas e NR. Increase 0 0 cover 8 grade cen s NR NA Budget will increase by 10 in 5/86. same.


Table X
Testing Time Required (Minutes per Student)

The information in Table X has been reordered in Table Xa to show a frequency distribution of testing times required by subject. States such as Hawaii that indicated a range of times are not included in the frequency distribution table, and states such as Delaware that show a range of times by grade levels are included but counted only once where times are duplicated for a frequency interval. Most of the indicated times are estimates. The mid-point and spread of the distribution for each subject are easily seen in Table Xa.

Time of testing seems to be about the same for reading, math, and language arts, probably because these subjects are included in batteries with each test in the battery taking approximately the same amount of time. For these subjects the mid-point of testing time is in the category of 50 to 59 minutes for math and language usage and 60 to 69 minutes for reading. There is greater variation in the times of the writing tests administered, and in general the time devoted to testing in writing is greater than in each of the other three basic skills subjects.

The shortness of the science and social studies tests is more a reflection of the poor definition of the curricular requirements of these fields than an indication of the amount of time required to test student knowledge in these subjects. It is unlikely that information of much value can be secured on student knowledge of these fields in the small amounts of testing time being devoted to them.
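The reordering from Table X to Table Xa is a simple binning of reported times into 10-minute intervals, applying the two rules just described. A minimal Python sketch of that tabulation follows; the times in it are hypothetical stand-ins for the survey entries.

    # Build a Table Xa-style frequency distribution: group reported testing
    # times into 10-minute intervals per subject. A state's duplicated
    # grade-level times count only once per interval; range-valued entries
    # are excluded before this step. Data below are hypothetical.
    from collections import Counter

    # One inner list per state; a state may report times for several grades.
    reported = {
        "reading": [[60], [45, 45], [125, 130]],
        "math":    [[50], [60, 64, 64], [135]],
    }

    counts = Counter()
    for subject, state_lists in reported.items():
        for times in state_lists:
            intervals = {t // 10 * 10 for t in times}  # dedupe within a state
            for lower in intervals:
                counts[(subject, f"{lower}-{lower + 9}")] += 1

    for (subject, interval), n in sorted(counts.items()):
        print(f"{subject:8s} {interval:>8s} {n}")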


State Assessment Table X Testing Time Required [Minutes per Student) Alask a Arizona Arkansas Californi a Colorado No state program Conneticut Mastery progrom Delaware D. c. Florida Georqia Hawi i Reading 4.5 grade : 60 10th grade: 30 60 60 Grd 1, 65 2 : 64 3 : 70 4 : 60 5-6, 60 7 -8 : 60 11, 60 60 60 min. 125 min 130 160 125 95 40 1 hour This subject subject Math 4th 5th 60 IOt h 30 60 Y 50minutes 60 Grd 1 : 34 2 : 44 3, 56 4 : i-, d 5-6, 6 4 7-0 : 64 11, 64 60 60 l 135 min. Gr. 2: 70 min 3 :75 6 :95 8:95 10: 40 40 1/2 hour varies from t area-to-! each year, Language Arts 4th 5th 60 10th 30 N Y 1 Y 60 Grd 1: 20 2: 46 3: 42 4: 47 5-6: 47 7-8: 47 11: 47 60 60 N 40 Writing N N Y N art of L.A. 40 Grd. 9: 2 45-rein. classe s N Did not know-just piloting 30 N 1 1/2 hour and (They cycle Science 4th ,5th 30 10th : 15 N N N N N Y Grd. 11: 40 min. 60 60 l N Gr.3: 20-2! 40 N social Studies 4th ,5th 30 10th : 15 N N 2 Class periods N Grd. 11: 40 min. N N Grd.3: 25 4 0 N Critical Thinking N N N N N N N N N l N NA N N Other/Notes Varies by grade level and specific test used; ranges from 2 h.s. to 4 elementary. -5 hours total time. 60 for all other tests. 1985 program. Ref. Spellinq skills Grd 2: 14 -o3: 13 -o4: 12 15 5-6: 12 15 7-8: 12 15 11: 12 15 Combined with M.C. Note comme nts under M.c Testing times for esthetics, P.E., l health not available. he test is a speed est. DO NOT QUOTE OR CITE WITHOUT PERMISSION OF THE U.S. OFFICE OF TECHNOLOGY ASSESSMENT


Indiana Iowa No state program Kansas Kentucky Louisiana Maine Maryland Massachusetts No state program Michigan Minsot a Mississipip Missouri Montana No state program Nebraska No state program Nevada No State program New Hampshire No state program New Jersey No State program New Mexico l New York . . . State Assessment Table X Tooting Time Required (Minutes por Student) Reading 7-I 70 NA 120 60 40 8 0 Untimed 45 80 75 5 0 Y Math 44 70 NA 120 60 40 180 Untimed 45 80 75 50 Y Language Ar t N N NA N N 4 0 N 45 80 N Writing 50 N NA 12 0 7 5 N 60 timed 135 N N N Y Science N N NA N 15 N NA 45 N N Not required 50 social studies N N NA N N N NA N N N No t required 50 L Y Critical Thinking N N NA N N N N N N N N Y 1 Other/notes Standard Regents exams-approximately 3 in length l ot h 1 1/2 hours. 84


. . State Assement Table X Testing Time Required (Minutes per Student) state North Carolina North Dakota No state program Ohio No state proqram Oklahoma NO state program Oreqon Pennsylvania South Carolina Tennessee Not available for interview Texas No state program Utah Vermont No state program Virginial l Reading Grd 1: 57 2: 59 3: 69 6: 45 9: 45 65 l 4 5 45* 30 50 Grd 468: : 4 5 5 7 math Grd 1: 44 2: 52 3: 55 6, 60 9: 60 50 l 45* 4 5 ) 5 50 Grd 4&8: 60 1: 44 Languageo Arts Grd 1: 12 2: 32 3: 31 6: 38 9: 38 N l 45* 45 95 50 rd 4&8: 8 11: 1: Writing 50 90 l N 45 N N N Science 50 N l N 45 30 N N social Studies N N N 45 30 N N Crica l Thinking N l N N N 50 N Other/Noters Matrix sampling total package grades 5,8,11: 2-2 1/2 hours. l 45 minute Iowa Test time. *Standard CTBS test times. *State uses SPA Test. 85


State Assessment
Table X: Testing Time Required (Minutes per Student) (continued)

West Virginia: Reading 50; Math 50; Language Arts 50; Writing N; Science 50; Social Studies 50; Critical Thinking N
Wisconsin: Not available for interview
Wyoming: Reading 60 minutes for reading and writing combined; Math N; Language Arts N; Writing, see reading; Science N; Social Studies N; Critical Thinking N

Table Xa
Frequency Distributions of Testing Time Required by Subject

Minutes    Reading  Math  Lang. Arts  Writing  Science  Soc. Studies  Crit. Thinking
10-19         -       -       -          -        2         1              1
20-29         -       -       -          -        1         1              -
30-39         1       2       1          1        1         2              2
40-49         5       7       6          2        4         3              1
50-59         4       8       4          1        2         1              1
60-69        10       5       4          1        -         -              -
70-79         3       1       2          1        -         -              -
80-89         1       1       1          1        -         -              -
90-99         1       1       3          2        -         3              -
100-109       -       -       -          1        -         -              -
110-119       -       -       -          -        -         -              -
120-129       1       -       -          1        -         -              -
130-139       1       1       -          -        -         -              -
140-149       -       -       -          -        -         -              -
160-169       -       -       -          -        -         -              -
170-179       -       -       -          1        -         -              -

Table XI
Changes in State Assessment Program

Major changes in assessment programs have occurred in this decade. Changes that occurred in the 1970s were mainly changes in tests (often switching from one standardized test to another) and changes in subjects and grade levels tested. Of special interest is the fact that several states moved from norm-referenced to criterion-referenced testing during this period, a trend which has been reversed in the 1980s. Although matrix sampling was introduced in California in the 1970s, it was not introduced until the 1980s in other states adopting this procedure. At this time, however, the shift is definitely away from sampling of any kind to testing all students in the subjects and grades to be tested.

In general, the movement appears to be toward increased use of standardized tests, accompanied by more sophisticated methods of reporting scores that enable comparisons to be made that take into account differences in socioeconomic levels, types of districts, racial composition of schools, etc. This may be contrasted with a few situations in which different approaches are being used that have some interesting features. For example, Minnesota has moved to a local option testing program backed by a strong program of technical assistance and availability of tests in a wide range of subjects. Oregon plans to make available a list of approved tests, requiring that districts select from among them, while using results of an equating study to accumulate results and make comparisons among districts. Kentucky is moving to mandatory testing of all students in all grade levels K-12, using custom designed tests that can produce both national norm and criterion-referenced information.
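Matrix sampling, mentioned above, spreads an item pool across students so that no one student takes every item, yet program-level results can still be estimated. The Python sketch below illustrates the idea with simulated data; the pool size, sample size, and response model are all invented for the example.

    # Matrix sampling in miniature: each student answers a random subset of
    # the item pool, and per-item percent-correct figures are aggregated
    # into a program-level estimate. All data here are simulated.
    import random

    random.seed(1)
    NUM_ITEMS, NUM_STUDENTS, ITEMS_PER_STUDENT = 60, 1000, 15

    # Invented "true" probability of a correct answer for each item.
    p_correct = [random.uniform(0.4, 0.9) for _ in range(NUM_ITEMS)]

    attempts = [0] * NUM_ITEMS
    rights = [0] * NUM_ITEMS
    for _ in range(NUM_STUDENTS):
        for item in random.sample(range(NUM_ITEMS), ITEMS_PER_STUDENT):
            attempts[item] += 1
            rights[item] += random.random() < p_correct[item]

    answered = [(r, a) for r, a in zip(rights, attempts) if a]
    estimate = sum(r / a for r, a in answered) / len(answered)
    true_mean = sum(p_correct) / NUM_ITEMS
    print(f"estimated mean percent correct: {estimate:.3f} (true {true_mean:.3f})")

Each item is answered by only about a quarter of the students here, yet the pooled estimate tracks the figure a census administration would produce, which is what makes the design attractive for program-level (rather than individual-level) reporting.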


Major Changes in the 1970s

California Moved from commercial to locally developed tests. Introduced comparison score bands (SES, etc.); matrix sampling.
Hawaii Introduced use of tests for certification as well as achievement; introduced technical support for schools, which doubled with new tests.
Michigan Added 10th grade tests; moved from sanctions to school improvement program; moved to CR testing; changed certification codes (to include competencies measured by SA tests).
Minnesota Based the hiring and assignment of new teachers on needs derived from test data; added subject tests.
Washington Changed from CTBS to CAT (1979).
Virginia Changed to SRA (1972); major changes responding to improper local administration of tests.
West Virginia Changed to CTBS (1973).
Utah Dropped science, added reading (1978).
Georgia Changed from NRT to CRT (1976).
Illinois Evaluation and Assessment programs merged (1978).

Major Changes in the 1980s

California Added social studies, grade 8; piloted writing, grade 8; more grades added; critical thinking added; Instruction and Improvement Fund incentive plan introduced.
Hawaii Introduced improved tests, expanded program.
Oregon Moved from sampling, grades 4, 7, 11, to census, grade 8, but using local option from state-approved list of tests; equating of test norms from approved list underway.
Alabama Tests changed, improved; needy systems identified for legislature, SEA assistance; GLE reporting eliminated; moved from sampling to census.
Alaska Moved from sampling to census.
Colorado Piloted new program for grades 3, 6, 9, 11 with standard tests.
Connecticut Mastery testing program added to SA program; matrix sampling introduced for SA program.
Indiana Moved to mandatory program; legislature provided funds for remediation in districts identified by SA as needing help.

Kentucky Changed from CTBS to CTB custom tests yielding both NR and CR information; testing at all grade levels K-12 introduced.
Maine SA tied to state improvement plan; matrix sampling introduced; technical support to local districts introduced; parent reports added; all students tested, grades 6, 8, 11.
Michigan None.
Minnesota Moved to local option testing with strong technical support; expanded tests available from department (personal skills, energy).
Missouri Moved to mandated program; language arts added.
Rhode Island Moved to mandated program; moved from sampling to testing all pupils in grades tested.
South Carolina School improvement plan introduced with SA; moved to mandatory programs; moved from sample to census testing of grade levels included; identification of districts where education is seriously impaired could lead to sanctions.
New Mexico Dropped grades 6, 11; added grade 3.
Virginia Introduced funding for remedial education based on SA results.
West Virginia Dropped cognitive abilities test.
Utah Change in SA funding from Title IV to state legislature.
Illinois Changes in areas tested; types of tests used in reading, writing, and science; types of scores reported (added norm scores).

Several states have introduced item response procedures that should result in improved test construction and scales for the interpretation of results. Connecticut has introduced a mastery testing program in addition to its state assessment program. Sanctions have not been extensively used, but where they have, the trend is to drop this approach in favor of tying state assessment results to systems of identifying needy school districts for purposes of state support, or tying results to state or local school improvement programs as in Michigan and Maine. Finally, in the 1980s there is a decided trend toward making state assessment testing mandatory (as opposed to optional) for local school districts.

. . *J -. Approximately half of the states reporting state assessment programs have now had them in effect for ten or more years, reflecting the tendency of programs to remain in place once established. However, major changes have been noted by most of these districts over a period of years, and even by a number of established for shorter times. State education agencies were asked in the OTA survey to indicate changes that are currently being contemplated in state assessment programs. Information submitted for the most part confirms the directions that have been established in the 1980s, including the movement toward norm-referenced measurement, expansion of subject and grade levels being measured, mandatory testing on the part of local districts, testing all students instead of samples of students in grade levels tested, introduction of more variables to assist in interpretation of test scores, and greater provision of technical assistance to local districts. Nothing submitted suggests that significant, innovative changes are being planned in the technology of testing, or in the philosophy, purposes or objectives of these programs.

PAGE 95

State Assessment Table XI Changes in State Assessment Program State Alabama Alaska Arizona Arkasas California Years program in Place 19 10 5 5 13 C U rrent Pro Y N u N -1 ram L N N Y Y Y D D o Switched from CAT to SAT in 1984, Emphasis on needy systems receiving attention of legislature and assistance from SE Eliminated grade equivalence in tee reporting in 1984, Increased fundin g 1981 from sample o census Areas tested and grade levels: change from had been in 197 writing added in grades 4, 8, 11 in 198 ( Grad. levels chanqed 1980 -3.6,8 1981 -4,5,6,8 1982-4,6, 7,8 1983-4,7, 10 eve loped tests; use matrix sampling. 975-Reporting. Use comparison Score Band push for quailty indicators and target dates for districts. 983-F34-More grades added; critical thin) added Agencies and Organ. c l Change other citize Currently Contemplated Charges Add grades 1,4, & 7 to science and social studies in 1986 .985 mandatory reporting by district Will change next yea to go from SRA to MAT; will keep grade levels the same: add science and social studies as mandatory (have been optional) change in contractor at end of 5 years built into program Is subtests. Science. 85/86 Grade 8 85/36 Writin g Grade 12 test, APP More critical think Add science and soc studies to grade 6 Agencies and Organ. nq for l chanqe Other Table II Table 111 Table V Tables IT, v SOURCE: Data Compiled for the Office of Technology Assessment by North~st Regional Educational Laboratory, 1985. 9 1


State Assessment Tabl e Changes in State Assessment ProgrAgencies and Organ. Agencies and Organ. That Y Working Chang Othe Contractor advanced system Current Pr Y N Y comments N Years Program in Plac e (Continued) 14 7 14 d with 14 Currently Contemplated Chanqes t Other State California (Con Major Changes 1984-85-Add social studies to grade 8; pilot writing, grad 8; introduceed. improvement Colorado No state program Pilot program for 1985-86 in grades 3,6,9 6 11, using standardized tests Nothing anticipated until pilot proqram underway I Connecticut 1984-New mastery proqram added different than stat assessment Entire program being rethought tor 1986-Grades 4, 6 & 8 added in Mastery Test Proqram ~-Matrix sampling Added writinq year; Changed tests CAT to CTBS Do not anticipate major chanqes may chanqe test (securit a biq issue) District of Columbia None About to chanqe N.R. Florida Combined Minimum Compet Minimum Compet Georqia Areas tested; Adding several grades of N.R.T. beqinninq next year addinq writing Chanqed in 1976 from N.R.T. to C.R.T. and have added grades Changed reportinq methods to reflect type of test 92


... . State Assesment Table XI Changes in State Assessment Program Illinois Indiana Iowa No stat program r 1nqa9 comb 1 i II 1 mum Compf qin lmum Compt Years program in Place 10 he 9 9 with n(. y, 9 ncy no I I Agencies and Organ. I 1Tt urrent ro J n 4 4 u M v N 9 I J r. W ) Major Changes ; Y F 1975-Tests obsolete, ~gh error rates, student att ltude poo 1 1 1979-Add competence : used tests for certi fication, not just achievement: include 1 technical support si ce 1979 has doubled he to new tests in I } 1981 1981-Added writing, affective domain, rade 3, dropped 4th: hew areas for qrade ~Id & science, social scle ce ec~s~onmakinq; atti d tests optional now. In 1978 changed everythlnq-evaluatlo and assessment merqed: o Areas tested 1983 0 Types of tests o Reporting methods changed, Originall t )uet reported p va U( 19134-Legislature 1 Y provided funds for remediation. Mandat r ke o b for c o c ange ther ofc.of Instruct Students Currently contemplated changes o Expansion of s o 0 funding (refused` for competency Want to add grades 8 $ 10 May shorten grade 3 testing (comp. 24 hrs. achiev. 7 hrs. Chanqes are anticipated after J U ly 1 additional grades to be tested in 1986. 10 other changes planned for this 3 year program (1984-87 no funds. Jan. 1987 models to be develop Agencies and Organ. Working $ Y Y Y for Change Other Statewide Comisslons


state Kentucky Louisiana Maine Maryland Michigan Minnesota The trend in th e state is for Legislature to support the SEA in providing lette r for Iocal accoun t State Assessment Table XI Changes in State Assessment Program Years program in Place 6 8 8 1 5 n 1 6 1 5 mechnism Current Major Changes 1984-State policy changed, all grades tested (K-12) ; required curricula; type of test change possible sanctions. None 1984-State improvement plan matrix sample technical support) report to parents all students in qrades 6,8 6 11 1972-Switched to C.R.T ,changes in certification code 1974-Until then san e were used, after 1974 school imp. plan 1977-qrade 10 added 1979-Law for funding added Increase use of testing Hiring and assignment of teachers based on needs from data Moved from NR program to classroom testing with 3 parallel samples Added new subject 1984-Moved to local option system with state technical support (See Table VII) Agencies and Organ. Work Chang( Other tions with Educat Y Currently Contemplated Changes 1986-A1l five areas W ill be tested, writing included None None rs ion) lone 1986-Plan to add science on every pupil basis; would like a cycle of 4 subjects on an every pupil basis Increasing of students passing te legislative fundinq for 1985-86 to fine ways to challenge students New legislation says t O continue what SEA is doing. New fo r .985 are item bank and technical assistance Agencies and Organ. n g c l for ange Other


State Assessment Table XI Changes in State Assessment Proqram Aqencies and Orqan. Aqencies and Orqan. it Wor k for Chanqq Workinq for Change i 1 c Other Othe r Current Years Program in Place 2 10 changes Currently Contemplated Chanqes State Mississippi Y N I o Early childhood ed added o Curriculum more For 1987: o Instruction will be changed o Add grades 6 & 8 with norm-reference tests o Subject areas tested o Analytic scorinq for those below 40 D Bias to be studied by committee precise Missouri I 1984-85-Random sampling added Add language arts assessment in 1985 1985-Mandated program regular assessment; lanquaqe arts assess chanqe in instruction cultural bias to be included Proposed by State Superintendent, mand, testinq at grades 3,6 8 & 11. Districts chose 1 of 6 tests, has not passed. Posible chances within 18 months Nebraska-No state program, no planned changes New Hampshir e No state program Considering testimg grades 4, 8 & 11, beginning 1985-86 New Jersey No State program, no problem changes 95


s -. -current State New Mexico -. New York NO rth carolinia North Dakota No state program no planned changes Ohio No State program, no planned changes O H OH 1 : Years programsran in P 1 ace 13 Since 1878 8 1, changes, qes There is no state Assessment Table XI Chanqes in State Assessment Program I Agencies and Organ. 1981-Dropped grade added grade 3 Regents Exam. Program in tests themselves r 0 # different subjects decreased over yea s 0 original were essay now use objective and essay question method of development o originally by SEA I staff, now classroom teachers develop t s Areas tested expande science 1984-85 writing 1983-84 Types of tests used changed Reportinq methods changed when type of test changed I madated state-wide Annual Still Commision co support Currently Contemplated Change Exit competencies are designated for minimum competency test. plan to add items to CTBS testing progress towards these competencies in grades 3, 5 & 8 Minor realigning of subjects nothinq of great significant on nor Irs it fall, M A S test qrades 3,5,7,9 and 11 At their optio n 66 percent of the students are tested. Most use SRA. State Education Department is beinq reorqanized. A new director with an emphisis in testing and curriculum development is coming in. Chanqes may occur then. Ohio apparently reuires LEAs to test 1-12 in reading, math and writinq each year. This began in 1983 from a State Board decision of 1982. Test results are used primarily for local curriculum development. No data are given to the State. The SEA does provide technical assistance in administration and interpretation. Two million students are tested at a cost of $5,000,000--all of which is appropriated by the legislature to go directly to the districts. Of that, $2,000,000 was spent to buy new tests this year. Each year there is a move in the legislature to begin collecting state-wide data. Chances look better each year, but It has yet Agencies an d Organ. b a I Y I .96 othe r Reqents to pass.


State Assessment Table XI Changes in State Assessment Program I Agencies and Organ. Currently contemplated Changes Agenciee and Organ. w n I Change Other content for C Change Other Current Years Proqrarn in Place 11 10 5 1 Not State Oreqon o 0 0 To add more grade levels (3,5,8 & 1 Chanqe tests to match state goals Make tests Initially reading Y and math. This assessment changed reading and math; tests currently specify appropriate available to dist for full district testing tests to district and gather data from all districts in reading and math. Changad testinq from grades 4,7 & 11 to grade 8 only. Pennsylvania Rhode Island South Carolina P A Y 1985-Every pupil Y Grade level shifts 3,6,8 & 10 tested across subject listed tested with a standardized I test. Y n show 1984-Ident~fles districts where .986-Drop 10th grade add grade, Seque will be 4,5,7 6 9 in reading, math, lanqu arts and social scie quality of education seriously impaired o Mandatory testing o Sample to universe o 5th grade reading o Could lead to sanctions not for districts not show improvement Next year mandatory for all LEAs; will add interest and aptitude tests South Dakota Brand new program r interview I Tenesse e I I PAl: 1985 variables to Student variables Sex Interpretation of data: ?Al: School variables Teacher questionaire Items: Relationship with parents Education level Supervision in school Class size Parents education Type of community Race Mobility-frequency of sch. chg. Students perception of parents interest in school TV viewing habits Parents expectations of education Reading materials in home Number times classroom observed for instructional purposes Perception of buuildint leadership Teacher initiated environment Freedom f rom disruption Perception of discipline Involvement in planning Students report how much time spent reading at home Students report how often required to write in school School variables Grade enrollment Low income Tuition School climate Condition variables Students perception of ability to do Students report amount of timee to do Students report how often tested homework math assiqnments Students report how quick tests returned to them (grades 8 $ 11) Students perception to classroom discipline (grades 8 11) Number hours students employed per week (grade 11) How often receive direct instruction for math, English, .science, social studies (grades 8 & 11) Percent of students taking math, English, science, social studies (grades 8 & 11) Interest in school all grades Percent academic college preparation students (grade 11) 97


Stat. Assessment Table XI Changes in State Assessment Program .S. . Stat e Texas No stat program Utah Vermont No state proqram, yo expected changes Washington West Virginia Year s Program i n Place 10 changes 35 9 23 no t Current N N Y N Y ram Y Y N Y for N Major Changes Exit level to be administered 1st to llth grades in 1985-86 1978-droppod science added reading 1984-added language critical think & other Title IV money until 1981, then Legislature appropriated funds 1972-Chanqed to SRA 1980-Began financial provision for remedi ed. First 3 years used CT B 1979-Changed to CAT 1984-Test all 8th grades vs. sample 1973-Changed to CTBS 1985-Dropped cognit abilitles test intervie w None Agencies and Organ. change Currently other Contemplated Changes 1966-will sample students and test wi a normed test to compared with new TEAMS test and provide a comparison base fo the future Desire to l xpand grades and subjects further, no firm plans I I None Appropriate for 198! o Census in 4,8 & 10 (4,8 6 10-FIAT vs CAT) Addinq more demographic dat a Pilot test 1985 for LEA's writing starting 1986 (analytic/holistic scorinq) None Agencies and Organ. Working for Change Other WA Roundtabl Committee LEA's


MINIMUM COMPETENCY TESTING PROGRAMS

Introduction

The peak growth period for statewide competency testing was 1975-77. As Figure 1 shows, this growth leveled off in 1982. Although a few states will be phasing out competency testing, most states are maintaining their current programs, with some of these states making changes. Typical changes are adding new skills to be tested or adjusting the cutoff score that students must exceed. Currently 11 states require high school students to pass competency tests in order to get a diploma. Four additional states have plans to add a competency test requirement for high school graduation. Figure 2 shows the different purposes of competency testing.

As is the case with assessment testing, minimum competency testing programs vary widely from state to state. Nine states reported their minimum competency programs were tied to the state assessment programs. Sixteen states reported responsibility for administering the minimum competency program rests with the state agency. Eighteen states said the program is mandated by the state but administered by the local districts, often with the local school district defining both the competencies to be measured and the standards to be met. The diversity of these programs is evident in the data in Table I, a summary of which follows.

Figure 1. Number of States Mandating Competency Testing, by Year of Mandate, 1973-83. SOURCE: OTA.

Figure 2. Purposes of State-Mandated Competency Testing Programs: elementary graduation standards, monitoring school standards, high school graduation standards, and remedial and diagnostic uses. SOURCE: OTA.

Table I
Characteristics of Programs

Responsibility for administering the minimum competency programs was found to be about evenly split between state education agencies and local education agencies. Broad areas of competence to be measured normally are defined by state education agencies, but responsibility for the specific definition of competencies is about evenly split between the two agencies.

The purposes states give for the competency testing are: remedial/diagnostic (27 states), standards for high school graduation (16 states, plus 4 more to be added in future years), monitoring of local education agencies' educational programs (11), and elementary graduation standards (1).

More states reported using state-produced tests for their minimum competency program than any other type of test. Seventeen reported using state-approved or prescribed tests, 9 reported that local education agencies were given the option of producing their own tests, and 6 reported that local education agencies were to produce their own tests by state mandate.

Most minimum competency testing is confined to the areas of reading, math, language arts, and writing. The even spread of the number of states reporting use of minimum competency tests at each grade level above grade 2 reveals that minimum competency programs have been designed to track student progress over a period of years so that any need for remediation can be identified at intervals along the way. Typically, the tests are administered periodically, as in grades 3, 6, 9, and 11 or some similar configuration. In a number of states, tests are administered in every grade within given ranges, and in two states, Kentucky and Vermont, they are administered in every grade, K-12.


Table II
Testing Programs

States rely more heavily on their own tests for minimum competency programs than is true for state assessment programs. Twenty-one states reported writing items for their own tests, sometimes using item banks. Some of these banks were built by the states themselves, and others were secured from test publishers. Criterion-referenced tests are most often used, with nationally standardized tests and national norms being used by relatively few states.

The task of setting standards for the minimum competency tests was undertaken by the state board of education in eight states, the state education agency in six, testing specialists or state education agency contractors in five, subject matter specialists in five, and educator/citizen committees in four states. In cases where the state education agency or state board of education set the standards, it was usually with input from groups mentioned above.

As would be expected with criterion-referenced programs, the type of standard normally set was a percent right of items attempted, sometimes by total test and sometimes by specific competencies, or the number correct of the number attempted, based on predetermined acceptable cut-off points set at the performance level desired. Five states reported use of an IRT scale in combination with professional judgment relating the scale scores to performance terms. Only two states reported use of norm-referenced scale cut-off scores. Seven states reported linking their standards to holistic writing ratings (e.g., New York specifies a 65 percent rating based on a model answer for a given topic).

Race and bias reviews are reported for tests used in all but a very few states. Statistical analysis of items used in tests is also reported by all but a few.
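A percent-right standard of the kind just described reduces to a simple comparison per competency. The Python sketch below illustrates one way such a rule could be applied; the objective names and cut scores are hypothetical, not drawn from any state's program.

    # Apply a percent-correct standard per objective, as criterion-referenced
    # minimum competency programs commonly do. Cut scores are hypothetical.
    CUT_SCORES = {"reading": 0.70, "math": 0.65}  # fraction correct required

    def mastery_report(responses):
        """responses maps objective -> (number correct, number attempted)."""
        report = {}
        for objective, (correct, attempted) in responses.items():
            fraction = correct / attempted
            passed = fraction >= CUT_SCORES[objective]
            report[objective] = "pass" if passed else "needs remediation"
        return report

    # One student's results: 18/25 on reading (72%), 12/20 on math (60%).
    print(mastery_report({"reading": (18, 25), "math": (12, 20)}))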


The fact that most states have developed their own tests, and that these tests are criterion-referenced measures employing standards arrived at by a variety of procedures, suggests that the rigor with which these tests have been constructed and the quality of the tests vary widely with the competence and experience of the state education agencies developing them, and with the procedures by which standards are set and student results evaluated.


Table III
Reporting Practices of Testing Programs

The methods of reporting minimum competency test results also reflect the diversity of testing practices in the states. Seventeen states report using pass/fail data, 13 use raw scores, and 15 use percent correct. Among states that report derived scores, 9 use IRT scale scores, 3 use percentiles, and 2 states report standard scores. Most states report a mix of these types of scores, and within a given state that mix may vary depending on the subjects being tested.

Reports of test results are distributed to teachers and students in 25 states, principals in 25, superintendents in 25, state education agency curriculum personnel in 22, state boards of education in 22, media and public through state education agency reports in 20, legislatures in 21, and the public on request in 20 states. In general, the reports to students and teachers are individual score reports, while the reports made available to the other parties named are summary reports.

The common use of minimum competency test information for remedial purposes suggests that most tests yield information on specific objectives within the tests, and a number of states explicitly point to the fact that pass/fail requirements were set for each objective within the tests. The trend, however, appears to be away from criterion-referenced standards for each objective toward pass/fail standards based on an overall IRT scale score, with added diagnostic information for specific objectives.
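Two of the derived scores named above, percent correct and percentile rank, can be computed directly from raw scores. The following Python sketch shows both conversions on invented scores; the figures are illustrative only.

    # Convert raw scores to percent correct and percentile rank.
    # All scores below are invented for illustration.
    def percent_correct(raw, n_items):
        return 100.0 * raw / n_items

    def percentile_rank(score, reference_scores):
        # Percent of the reference group scoring below the given score.
        below = sum(1 for s in reference_scores if s < score)
        return 100.0 * below / len(reference_scores)

    raw_scores = [42, 55, 61, 48, 59, 50, 44, 57]  # raw scores, 70-item test
    print(f"{percent_correct(55, 70):.1f} percent correct")      # 78.6
    print(f"{percentile_rank(55, raw_scores):.0f}th percentile")  # 50th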


Minimum Competency Table 111 Reporting Practices of Testing Programs State A labama Alaska No proper Arizona Arkansas California Colorodo No data District of Columbia N Y N of 1 N N N N Y N N Y N 1 N Rule N N N N N N N Derived scores N N Y N D. R. P N CRT obj mastered Each com petency must be passed. Y Y y. tile Law y N y. l y* Y Y* I y* Rule y* N y. l y* y. y Results Made Available to: y* N Y* N y* l N ,{* y. y* N y* N y+ l y* y* y* er f ~ y* N N y, l N y* performance N y* N y. ,* N y. y* Notes/ changes l l l N l l Did not lndlcate L f IS or GS Parents-Law; did not indicate if IS or GS. Did not Lndicate Lf IS or GS. not requl red; number I > f studerks at talnl nq minimum compete ncles requirement available to SEA. l Every LEA has 3 different pollcy. l Dld not lndlcate Lf IS or GS l Dld not lndlcate lf IS or GS. Dld not In{llcate if Data Compiled for the Office of Technolo~ Assessment >y Northwest Reqional Educational Laboratory, 1985. 131


. Minimum Competency Table III Reporting Practices of Testing Programs Reported Derived scores Results Made Available to: S c y y* GS y l y* y* Performace y y* Y* ;s y* l y, y. permance al N y Y* l Y* Y* Notes/ changee State Hiwiaa Y Y Y N Y ) N Y N Y N l Did not Lndicate IS or GS. l Dld not ~ndlcate Idaho IS Or GS. Illlnols Not applicable I Indiana No progr$ Iowa No program Y Y J Y N Y Y t Y Y Y .,, K,III>.i> .,,,. i k;, Did not KS or GS Did not Is or GS None nd nd cate cate Maine No program Maryland Massachusetts Did not indicate IS or GS. These are an LEA however, Leas re SEA: 1) standard: and 2) percentage students that do meet standard, I Minnesota No program Mississippi Y N Did not indicate is or GS. 132


Minimum Competency Table III Reporting Practices of Testing Program Montana No program Nebraska Nevada North Carolina NA N prog N N Y l North Dakota No prog Ohio NA for NA N N Y Y Y NA Reported Derived scores NA : High school only N Other scale Y N N NA y* Is Is GS Y NA I S GS y GS GS GS Y l NA Results Made Available to: N y GS GS GS Y NA mance y GS GS Gs y** NA Notes/ changes None Until Spring 1985 percent correct on number of items right. l Did not indicate if IS or GS. None None l Adjusted raw score to a common scale. l On sub-tests. l ** Did not IS or Results of tests are not provided to the state (including pass/fail rates on an annual basis. SEA evaluates 1/5 of all dirstricts each year for accreditation (All districts every 5 years.) Part of evaluation is to check to see that mimum standards of competency are in compliance. This evaluation includes examining test results. program is too new for any useful data from accreditation reviews. 133


IIII Minimum Competency Table 111 Reportinq Practices of Testinq Programs Pennsylvania Rhode South North Types of Infer] Oklahoma No program Oregon Y Island No program Carolina N Y t Dakota-No program I Not available interview Utah Vermont Y NA NA Y N NA NA { Y N 9th Graded Test NA NA NA NA Y N Repot N A I NA N flag objectives on which student needs work N NA NA Y IS NA NA Is 1 NA y* y* GS NA NA Gs Results Made Available to: I N A y* y* GS NA NA GS NA y* y* GS NA NA N Notes/ changes State does not C O data Test not used agair it is administered. l Did not Indicate IS or GS. l Did not indicate IS or GS. The state does not publish a state-wide report. Information provided to district school and district he data must be resented at an offi school board meetinq These meetings are p News media make a ha attendinq as many 10 board meetings as po and thereby forming own state-wide rep, 1A 134


Minimum Competency Table III Reporting Practices of Testing Programs State Washington N O West Virginia Wisconsin Not Wyoming No program I No program yet in place; see Table VIII I I I I I available for Interview. I state data; district Let required to assess. Results Made Available to: Notes/ changes 135

PAGE 140

Table IV
Examples of Changes in State and Local Educational Programs and Practices Resulting from State Minimum Competency Programs

Reports of changes in state education policy attributed to minimum competency programs range from the general comment of the Connecticut office that results have been used constantly to improve programs, to the listing of extensive changes by states such as Florida and Georgia. Florida attributes these changes to the minimum competency program: a 1976 Educational Accountability Act resulting in improvements in kindergarten through postsecondary education, including initiation of a state compensatory education program, a college sophomore testing program, increased high school graduation requirements, a new primary education program, a new middle school education program, and changes in the principal and teacher certification examinations. Georgia cites the adoption of policies dealing with changes in certification and staff development and the establishment of public school standards by the state board of education as direct consequences of this program. North Carolina states that students simply no longer graduate without minimum competencies.

Examples provided of changes in school programs and practices include greater emphasis on writing in the schools, examination and restructuring of curricula and programs, increased attention to remedial education, improved student performance as measured by achievement tests, use by school districts of state-developed support materials such as spelling lists, more local curriculum development and evaluation, and improved methods of diagnosing student needs in school systems.

The few states that report an impact of the minimum competency program on state curriculum and instructional support cited better definition of the basic skills and developmental skills required in the minimum competencies program and their incorporation into the curriculum frameworks and guides of state departments.

Minimum Competency Table IV Examples of Changes in State and Local Educational Proqrams and Practices Resulting From Stat. Minimum Competencies Program Type of Change Noted [ State State Education Policy School Programs, Practices Alabama First grade graduation Redeveloped curriculum often requirements in 1983 for 1985. becomes part of school policy. I Alaska No proqram Arkansas california SOURCE: Data N Y Law went into effect in 85% of students must be in a school achievement improvement. N N Constant use of resykts No N Y N 1983 and must be implemented by 1987-88; achieving mastery or need to be involved program; students have 2 years to show Y: Parent conference required to tie curriculum to assessment. N t O improvement of programs I N: Already tied to curriculum I Y: : n 1976 Education Accountability, Act; once implemented, I started a long-term series of improvements from Kinderarten I thru post-secondary e.g. initiation of a state compensatory education program, initiation of college sophomore testing program I Increased high school graduation requirements; new primary education proqram; new middle school education program; principals I certification exam: teachers certification exam. Y: Policies added dealing with Y: chanqes in certification and staff development based on need identfied by lower test scores in some I grades; pubilc school standards established by board added. I Schools having to meet new standards a S a result of test scores. Complied for the Office of Technology Y More curriculum development and evaluation. Assessment by Northwest State Curriculum, Instructional Support N.C. were incorporated into course of study. N Y: Course content guides required through educational standards; they specify core curriculum in all subject areas (Includes basic skills. developmental skills, and extensions for brighter students) N N Y: Curriculum frameworks which establish content for all h.s. courses. Y: Just adopted because of testing all qrade levels in all subjects specified a minimum of what objectives must be tauqht. Regional Educational Laboratory, 1985. 137


Minimum Competency Table IV Examples of Changes in Stat. and Local Educational Programs and Practices Resulting From State Minimum Competencies Program State Hawii Idaho State Education Policy undergoing serious review. Illinois Not applicable I Indiana -Noproqram Iowa No program Kansas N Kentucky Same as for state assessment Lousian a I Maine No program ram Massachusetts 1. 2. Minnesota No proqram Mississippi Missouri I N N N Montana No proqram Changes made in 1984 and 1985. 1986 grades will be withheld at 9th grade if failed. N Type of Change Noted School Programs, Practices N Y N N Consideration requirments. of proqram N teachinq of writing and cope and sequence of subject. Look at currculum Some spellinq proqrams now use liSt from state developed spellinq test. Schools report students. v d work from lower half of Morwre courses offered for remedial math, writinq. Writinq test has ifluenced writinq curriculum -better results. State Curriculum, Instructional Support N N N Chanqe reported, example not reco N N 138


Minimum Competency Table IV Examples of Changes in State and Local Educational Programs and Practices Resulting From State Minimum Competencies Program State State Education Policy I I New Hampshire No program New Jersey New Mexico New York North Carolina Notrh Dakota No Several policies changed. Type of Change Noted School Programs, Practices State certification based on results. Compulsory education funding based on results. Despite secure items, changed each year, scores have improved. This implies changed school practices. Teaching of writing now emphasized in schools as a result of competency test. Students no lonqer qraduate without minimum competencies Specific fundinq for remediation was provided (average $8 million I Oklahoma No prog Pennsylvania ram Pending a movement toward minimum N proqram 1984: Shifting of l1th to 10th grade in 1906. ion a year to work on progam) program I N: New program ) N N Y: Many districts have hired additional teachers in readinq and math since they had to create remedial programs (had to create new or different programs) ; some districs have creative preventive programs and others have begun to review readinq and math programs to see how they reflect objectives being tested. Because of funds for compensatory education and tests based on objectives defined by legislature, Specific objectives and skills are qiven by qrade to teachers and students with Sample test items. State Curriculum, Instructional Support Graduation requirements were revised. N N N 139

PAGE 144

A Minimum Competency Table IV Examples of Changes in State and Local Educational Program and Practices Resulting From State Minimum Competecies Program r State State Education Policy South Dakota No program Tennessee Not Texas Utah Vermont Virginia available for interview Legislature has changed l requirements. NR Emphasis used to be on pupils with lower scores, now shifting away from that. Washington No proqram I Type of Change Noted School Programs, Practices Accreditation change affected local programs. Remedial help increased due to test. Consequently bottom 50 has improved Many schools screen those their scores. NR give a pre-test to to receive special tutoring before 10th grade test. State Curriculum, Instructional Support Same bill that changed accreditat. chanqed state curriculum. N NR N 140

PAGE 145

Table V
Functions of Technical Staff and Failure Rates

The staffing of minimum competency offices in state education agencies follows the pattern of state assessment offices and often includes the same personnel. Thirteen states reported technical staff employed to upgrade tests, and 10 employed testing personnel to provide local assistance. Technical assistance is provided to local school districts in interpreting test scores and using the results by 26 states, and in the administration of tests by 22 states. Local education agency personnel receiving assistance from the state agency include principals (19 states), local education agency administrators (24 states), and teachers (17 states). The Texas Education Agency reported that its personnel give workshops to regional educational service center personnel, who in turn provide in-service training and other assistance to local education agency personnel.

Minimum Competency Table V
Functions of Technical Staff and Failure Rates

Part I of the table records, per state: technical staff employed to upgrade tests and to provide local assistance; local assistance given in administering tests and in interpreting scores and using results; and the groups receiving assistance (teachers, principals, LEA administrators). Part II records failure rates: overall 1984-85; initial 1984-85; minority; non-minority. Entries legible in the source:

Alabama: Assistance given primarily through test coordinators, principals, and counselors. Rates reflect a possible four attempts at the 11th grade test.

Alaska: No program.

Arizona or Arkansas (row label ambiguous): failure rates by grade (1983): 12th, 9%; 11th, 78%; 9th, 64%; 6th, 28%.

California: Workshops primarily during 1977-78, when the item bank first came out.

Colorado: No data.

Connecticut: Reading, 4%; Math, 17%; Language Arts, 6% (the writing figure is not legible).

Delaware: Does not provide suggestions on how to use the item bank in putting together a test.

Florida: Communication (reading and writing combined) and mathematics tested; figures reflect a new test with higher standards. Communication: 8% overall; by group, White 7%, Black 26%, Hispanic 20%. Math, by group: White 10%, Black 32%, Hispanic 22%. SEA staff may be hired only if the legislature authorizes positions; the legislature has authorized positions, but not with a specific charge to upgrade tests or to provide local assistance.

Georgia: Reading and math failure rates, in the source's column order (overall; initial; minority; non-minority): 8%/13%; 5%/11%; 16%/29%; 2%/4%.

Idaho: Counselors are among the groups receiving assistance.

Illinois: Not applicable.

Indiana: No program.

Iowa: No program.

Kansas: Will possibly collect failure-rate data next year; at present the state reports only the percentage of students who meet and exceed standards in two subject areas.

Louisiana: Changed.

Maine: No program.

Michigan: No program.

Minnesota: No program.

Mississippi: In the source's column order: 36% overall, writing 18%; 23% initial, writing 2-3%.

Missouri: Local assistance has tapered off as the need declined; assistance given through two conferences. No failure-rate data.

Montana: No program.

Nevada: Test reviewed by ACT and a panel of experts 5-6 years ago. Math rates as printed: "Math 3: 20%; Math 11: 10%." Reading has changed although the test is more difficult.

New Hampshire: No program.

New Jersey or New Mexico (row label ambiguous): overall, 24%; 14%. The minority figure is an unweighted average of figures for Blacks, Hispanics, and Native Americans (14%, 9%, and 21%, respectively); other minority groups' failure rate is 11%.

New York: Pupil Evaluation Tests: 25%. Regents Competency Tests: Reading 10%, Writing 20%, Math 30%.

North Carolina: 17%; 10%. Figures represent first-time test takers only.

North Dakota: No program.

Ohio: Results of tests, including pass/fail rates, are not provided to the state on an annual basis. The SEA evaluates one-fifth of all districts each year for accreditation (all districts every 5 years); part of the evaluation checks that minimum standards of competency are in compliance, including examining test results. The program is too new for any useful data from accreditation review.

Oklahoma: No program.

Pennsylvania: Rates available from the State Summary of Results 1984, Tables 7-18. Support materials and workshops (8-10 workshops); intermediate units also provided assistance; the SEA trains them. As printed: Grade 1: Reading 20%, Math 19%.

South Dakota: No program.

Tennessee: Not available for interview.

Texas: Workshops are given for regional education service center representatives, who are then available to help LEA personnel. Failure rates reported are for 9th grade only; other grades are not scored pass/fail. The minority figure is estimated by averaging Hispanic and Black rates across reading and math; minority rates for writing were not available. The average of reading and math failure rates in 1985 was 35% for Blacks and 29% for Hispanics. Steady improvement has been shown in all races, the greatest improvement being among Blacks: in 1980 Blacks scored 40% below Whites, and the difference is now 25%. Overall scores showed a drop in 1985, attributed to the simultaneous pilot testing of the next year's test (which is harder); the combined effects of a harder and a longer test probably resulted in lower scores on the TABS portion.

Utah: NA.

Vermont: NR.

Virginia (10th grade test): Overall, 18% (1984-85); initial, 5% (1984); minority, 10%; non-minority, 3%.

Washington: No program.

West Virginia: No program yet in place; see Table VIII.

Wisconsin: Not available for interview.

Wyoming: No state data; districts are required to assess.

SOURCE: Data compiled for the Office of Technology Assessment by Northwest Regional Educational Laboratory, 1985.

Table VI
Testing Time Required (Minutes per Student)

There is little uniformity of practice from state to state in the amount of time devoted to minimum competency testing. In general, the time devoted to these tests is greater than that devoted to state assessment for the pupils involved. Tests of 90 minutes in length are not uncommon, and few require less than an hour to perform. Whereas state assessment tests normally devote more time to writing than to the other basic skills, minimum competency tests tend to devote more time to reading and mathematics. New York's writing test, North Carolina's reading and math tests, and Georgia's reading and math tests require the greatest amounts of student time.

Minimum Competency Table VI
Testing Time Required (Minutes per Student)

Columns: State; Reading; Math; Language Arts; Writing; Science; Social Studies; Critical Thinking; Other/Notes. Entries legible in the source:

Alabama: Reading 90, Math 90, Language Arts 90, Writing 45. These figures are averages; testing may take longer at grade 11 and less time at grade 3.

Alaska: No state program.

Arizona: Up to each LEA; information not available.

Arkansas: Tests are not timed; the recommendation is to give them over 4 mornings for the total test.

California: Locally done.

Delaware: Tests are untimed; the estimate is 45 seconds per item. Tests are not the same length for each grade, although there are approximately 250 items per grade level.

Georgia: Reading 135, Math 135.

(State labels not legible): a 60-minute life-skills test; a comprehensive graduation test of 90 minutes, with performance testing of 150 minutes and a grade 3 test of 150 minutes; tests that are power tests, open-ended with recorded time constraints, the figures being recorded testing times.

Indiana: No program.

Iowa: No program.

Kansas: Reading 70.

Kentucky: 100 minutes at one grade; 150 minutes at grade 11.

Louisiana: 50 minutes; the Language Arts test covers reading, writing, and other language skills.

Maine: No program.

Maryland: Test untimed; varies greatly.

Massachusetts: Depends on the test selected; in general, 80 minutes total.

Michigan: No program.

Minnesota: No program.

Mississippi: Reading 70.

Missouri: Not a timed test.

Nebraska: Testing time depends on the test chosen by the LEA; state-developed tests take between 2 and 30 minutes per skill, and sections have no time limit.

New Hampshire: No program.

New Jersey: High school: Reading 75, Math 45, Writing 60; elementary: 45 minutes per area. Varies by LEA.

New Mexico: Reading, math, language arts, science, and social studies, 40 minutes each.

New York: Reading 90, Math 90, Writing 120-180.

North Carolina: Reading 150, Math 150; not a timed test, with at least 4 hours at each grade level for reading and math combined.

Oregon: District determined.

(State labels not legible): other areas totaling 90 minutes; test times are averages; one test is not timed.

Rhode Island: No program.

South Carolina: Reading 90, Math 90, Writing 90.

South Dakota: No program.

Tennessee: Not available for review.

Utah: Reading 60, Math 55, Writing 55.

Vermont: NA.

Virginia: Reading 60, Math 60; the 10th grade test has no time limit, and figures are estimated averages.

Washington: No program.

West Virginia: No program yet in place; see Table VIII.

Wisconsin: Not available for interview.

Wyoming: No state data; districts are required to assess.

Summary distribution: The source also tallies, for each subject, the number of states whose testing time falls in each 10-minute interval from 0-9 through 150-159 minutes. For reading: 40-49 minutes, 2 states; 50-59, 1 state; 60-69, 3 states; 70-79, 4 states; 90-99, 4 states. The counts for math, language arts, and writing are not legible enough to align with their intervals.

SOURCE: Data compiled for the Office of Technology Assessment by Northwest Regional Educational Laboratory, 1985.

Table VII
Changes in Minimum Competency Program

State minimum competency testing programs have been in effect for as long as 12 years (Oregon) and for as little as less than a year (Ohio). Four states have programs ten years old or more (Arizona, Florida, Nebraska, and Oregon). Most changes reported in minimum competency testing are simply additions of new subjects to be tested; shifts from norm-referenced testing to criterion-referenced testing and back; introduction of reporting that assists remediation efforts in the schools; shifting of emphasis from high school graduation standards to minimum standards covering a period of years and sometimes culminating at the eighth grade; and changes in the years in which tests are administered.

Plans for future changes in minimum competency testing programs were mostly the addition of new areas of testing and some changes in standards. Two states indicated they were considering moving to norm-referenced tests, and another is considering a move from a twelfth grade graduation emphasis to an eighth grade and fifth grade promotion emphasis. Connecticut has added a mastery testing program for grades 4, 6, and 8, and plans to phase out its minimum competency program in 1987. Addition of science is being considered by two or three states, and writing in two or three states. There is a trend away from norm-referenced tests, toward the use of criterion-referenced tests or criterion-referenced mastery tests, and toward the use of IRT scales in establishing cutoff standards.

Minimum Competency Table VII
Changes in Minimum Competency Program

Columns: State; Years Program in Place; Major Changes; Currently Contemplated Changes; Agencies and Organizations Supporting Change. Entries legible in the source:

Alabama: Grade 11 graduation test added; first class affected [year not legible].

Alaska: No program.

Arizona: In place 10 years. Major changes: none. Contemplated: anticipated to change to more stringent guidelines, due to legislation passed last year requiring promotion and retention guidelines; also developing an essential skills list that students in grades 9 and 12 must pass (it is not known when this will go into effect).

Arkansas: In place 3 years. Objectives added in science and language arts in certain grades; an overall test score added; a remedial component added.

California: In place 8 years. None added; the plan is to ensure students attain mastery.

Colorado: None; districts do not test, and the district which does is moving away from minimum competency (phasing out).

Connecticut: Phasing out minimum competency in 1987, substituting mastery testing at grades 4, 6, and 8.

Delaware: In place 5 years. When minimum competencies were first specified by the board, they were for graduation only; now only at grade 8. Because responsibility has always been at the LEA level, changes could have occurred without the SEA knowing specifics. The instructional department is putting together course requirements, which may be a spin-off of the item bank being developed for assessing them (a result of the Governor's Task Force requiring mastery testing). The department is against the concept of mastery testing at the SEA level and instead recommended the item bank being developed; both organizations agreed.

Florida: In place 10 years. The legislature asked for a full-scale evaluation of the program 2 years ago and produced a set of recommendations: 1) enhancement changes, and 2) merging SSAT I and II at grade 10. The initial legislation did not specify a competency level; in 1980 it did, and in 1980 the state assumed responsibility for testing (prior to that, the legislation was very vague). Contemplated: grade 10 passing scores will be raised, and a writing assessment added, beginning in the fall.

(State label not legible): Change in specifying student performance requirements in more detail, in terms of content and assessment (DOE).

Hawaii: Contemplated: add life skills.

Idaho: In place 6 years. Minimum standard levels adjusted in 1984 in preparation for a change to grade 8; types of cutoff scores changed for grade 8. The legislature did not require testing each year; it required tests in 1979, 1981, 1982, and 1984 (some LEAs may have opted to administer tests in other years). Legislation in 1984 stipulated that tests be given 5 consecutive years; beginning in 1984 and for the next 5 years, testing at grades 2, 4, 6, 8, and 10 with SEA support, with the endorsement of teachers' and administrators' organizations. Contemplated: looking at norm-referenced standardized achievement tests.

Illinois: Not applicable.

Indiana: No state program.

Iowa: No program.

Kentucky: In place 6 years. Same as for state assessment. The 1986 legislature is expected to make a recommendation regarding promotion.

Maryland: In place 4 years. The last program was not considered minimum competency testing, merely diagnostic; the 1984 program is now MCT. The original plan was to add a new grade each year; that was done from 1981 to 1984, then stopped. IRT models were adopted in 1982, and reading was added. Contemplated: upgrade standards; add 8th grade to the test; add a norm-referenced portion to the test; all to be implemented in 1986. Math and writing to be added in 1989; citizenship in 1988.

Massachusetts: In place 9 years. Development of a state framework. Contemplated: possibility of a statewide student test and standards (Governor).

Michigan: No program.

Minnesota: No program.

Mississippi: In place 2 years. Minimum standards to be adopted by fall; grade 12 graduation for 1982. More money is spent on minimum competency testing than on state assessment.

Missouri: In place 8 years. As of 1986, grades are withheld from 9th graders until the test is passed.

Montana: No program.

Nebraska: In place 10 years. None; none contemplated.

Nevada: In place 6 years. Tests made more difficult; in spring, a change in standards and in scoring, from percent correct to IRT.

New Hampshire: No program.

New Jersey: In place 8 years. No other changes expected. Contemplated: in 1986, add writing and new tests in reading and math (Governor).

New Mexico: In place 7 years. 1983: added language arts, reading, and math; 1984: added science and social studies. The hope is to have the test validly measure the Exit Competencies.

New York: Elementary: added 5th grade writing; went from norm-referenced to criterion-referenced tests. High school competency tests: changed from a life-skills test to an academic skills test; in 1979, introduced Degrees of Reading Power. Contemplated: the testing program will be changed, and additional areas will be tested, e.g., world and American history and science.

North Carolina: In place 7 years. Expanded the content and grade levels tested, in effect 1985-86; everyone supported it. Contemplated: none.

North Dakota: No program.

Ohio: First-year program. None; none contemplated.

Oklahoma: No program.

Oregon: In place 12 years. None.

Pennsylvania: Pending: a movement toward minimum competency testing.

South Dakota: No program.

Tennessee: Not available for interview.

Texas: In place 6 years. 1984 legislation for compensatory education provides funds for schools; grade 3 added in 1981. Contemplated: shift to testing in spring 1986 instead of fall, with a new instrument reflecting fall-to-spring content, still a math and reading test with the same objectives; science added at grades 3, 6, and 9; the test for the diploma will begin at 10th grade (now 11th), applying to 1989-90 graduation; a totally new test next year at grades 1, 3, 5, 7, 9, and 12, with grades 1-9 covering the same subjects and grade 12 covering language arts; all will be objective-based mastery tests.

Utah: In place 7 years. None. Contemplated: the class of 1989 will have to take a new test covering grades 3, 6, 10, and 12; not minimum competency but objectives-based; involves changes in the state curriculum as well as in testing.

Vermont: In place 8 years. In 1989 it will become an 8th grade promotion test (not a 12th grade graduation test); the ruling was in 1985; competencies will be rewritten.

Virginia: In place 7 years. 1978: other (reasoning) added. Contemplated: in 1980 began development of grades 7-12 objectives and assessment; the hope is to replace the grade 10 test with K-12 objective-based education, and subjects are being added to form a full curriculum.

Washington: No program.

West Virginia: See note WV2 below.

Wisconsin: Not available for interview.

Wyoming: No state data; districts are required to assess.

WV2: A lawsuit was brought in 1978 or 1979 against one county (school district), claiming that the school was not providing quality education. A 1983 court decision said that state formulas for funding were inequitable and required major changes. In 1984 the SEA developed a Master Plan in response to the court. Policy statements in the plan require learning outcomes K-12. Objectives were written to define the outcomes, and pupils are required to show progress toward 100 percent mastery of the objectives. Twelve or 13 areas have been defined for curriculum objectives; for example, math is one area. For K-12 math, 450 outcomes were written, with 1,400 objectives, and each objective has about 10 items for measurement. Items are being written by a large committee of teachers; in essence, a very large and widely varied item bank is being developed. Testing will be done by teachers selecting items appropriate to their curricula. Teachers are required to teach to the objectives, but may choose different objectives to reach an outcome.

SOURCE: Data compiled for the Office of Technology Assessment by Northwest Regional Educational Laboratory, 1985.

TESTING SNAPSHOTS OF EIGHT STATES

Over the past 10 years, forces seeking reform in education have worked to require increased state and/or local testing. In many places, this movement followed widespread dissatisfaction with the quality of education as personified by the perceived ability levels of graduates. In response, public and community leaders began to seek accountability from schools: specific statements of what is being attempted and specific measurements of what is being accomplished. Often, the Governor or the state legislature became a critical player in this movement. Concerned over the need for a well-educated work force in the national competition for jobs and industry, states have increasingly turned to testing. Educators, often initially alarmed by demands for increased testing, have in most instances moved from opposition to cooperation, and have worked to design tests and test environments conducive to learning.

Two forms of testing have increased: minimum competency testing and assessment testing. Minimum competency testing seeks to determine whether or not students are learning the information defined in that system as basic. Minimum competency testing normally comes in tandem with opportunities for help for those failing the tests and opportunities for re-testing. In time, pass rates for minimum competency tests rise substantially over initial levels.

Assessment testing is quite different, in that it seeks to measure the effectiveness of various school programs. Assessment testing is more informative to educators and cheaper than the traditional standardized tests. Using specific modern quantitative techniques, assessment testing can be accomplished using a relatively small number of students. Thus, money is saved on test instruments and processing, and substantial time is saved by leaving most students in class.

Assessment testing is generally thought to provide a useful basis for comparison between programs in different schools, because it is designed to measure program or school effectiveness, not simply the comparative ability levels of students.

In order to convey accurately the various forces behind the current testing movement, OTA asked individuals in eight states to describe, in their own words, the recent history of testing in their state. The following papers are presented unedited, and are intended to give a flavor of the many ideas and circumstances at work in different states, and of the various approaches that states have adopted.

A BRIEF HISTORY OF STATE TESTING POLICIES IN CALIFORNIA

Susan M. Bennett and Dale Carlson
California Assessment Program
California State Department of Education

January 1986

Prepared Under Contract With The Office of Technology Assessment
Congress of the United States

A BRIEF HISTORY OF STATE TESTING POLICIES IN CALIFORNIA

Origins of State Testing: 1961-1964

Statewide achievement testing in California originated in 1961 with the recommendation of a citizens advisory commission. The commission recommended that the Legislature "set a level of instruction through the State Board and the mandatory statewide examinations be utilized to establish this standard" (Joint Interim Committee, 1961, p. 38). The assessment program mandated in 1961 and implemented for the first time in 1962 embodied the concept of accountability, but did not set standards in a literal or uniform sense. More than a million students, the entire student population at grades 5, 8, and 11, were tested annually from 1962-1964 in reading, language, mathematics, and "aptitude" (scholastic intelligence). Districts selected standardized instruments from lists of state-approved tests for each grade level.

1965-1973

The establishment of a statewide reading improvement program in 1965 (Miller-Unruh Basic Reading Act) was accompanied by substantial modifications in the scope of content assessed and in the grade levels tested. The new legislation required districts to administer a uniform test to all students in grades 1, 2, and 3 to provide data for selecting those districts most in need of reading specialists. The legislation also instructed the State Board to adopt uniform tests at the upper grade levels; to change the grade levels tested from 5, 8, and 11 to 6 and 10; and to restrict achievement testing to a single content area: reading. An explicit proscription on public release of test results included in the 1961 testing law was reversed in 1968, when new legislation mandated that results be reported annually on a district-wide basis.

Further modification of the law in 1969 (California School Testing Act) changed the upper grade level to be tested from 10 to 12 and expanded the content tested to include basic skills in language and mathematics as well as reading. During this period, districts purchased, administered, and scored the standardized test adopted for each grade level by the State Board. They returned the results to the State Department of Education to be summarized and reported to schools, districts, and the State Board.

1973-1978

Widespread dissatisfaction with the statewide testing program, especially the resentment among district personnel of what they perceived as unfair comparisons based on commercially-produced tests that were poorly matched to the skills taught in California, led to a complete restructuring of the testing program. New law in 1973 incorporated detailed recommendations of a legislative advisory committee on testing chaired by Lee Cronbach. Foremost among the committee's recommendations was the separation of local and statewide testing into distinct programs, with the statewide program mandated to provide data for evaluating instructional programs at the school, district, and state levels, but not to provide data for individual students or classes. Multiple-matrix sampling was recommended to provide reliable data on a broad array of curricular objectives while reducing the time required for testing from three or four hours to approximately 35 minutes.

The new state-level testing program, the California Assessment Program (CAP), was first fully implemented in 1974-75, with all testing costs absorbed by the state. The design, development, and procedures of the new program were unique in the nation. CAP tests were developed for grades 1, 2, 3, 6, and 12 with the full participation of statewide committees of content area experts and classroom teachers. Each test was designed to assess specific objectives representing the full breadth of content that should be taught in each content area at the appropriate grade level.

The newly developed tests included a grade 1 entry-level test of prereading skills (to replace the end-of-year reading achievement test); a single test of reading achievement to be administered in grades 2 and 3; and tests of reading, mathematics, spelling, and language for grades 6 and 12.

Following the multiple-matrix design recommended by the legislative advisory committee, large numbers of items were distributed over 10-18 nonoverlapping forms for three of the new tests: the grade 2 and 3 reading test and the surveys of basic skills for grades 6 and 12. Each student at these grade levels completed a single form of the appropriate test, and the results were then aggregated to provide a wide variety of program diagnostic scores for each content area and for subskills within each content area. Scores were aggregated and reported at the school, district, and statewide levels.
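A minimal sketch of the matrix-sampling arithmetic described above may be useful. Everything specific in it is an assumption made for illustration: the pool size, the 12 forms, the number of students, and the simulated 70 percent success rate are invented, and the sketch simplifies away CAP's actual form construction and scoring rules.

    import random

    # Illustrative multiple-matrix sampling: a made-up pool of 360 items is
    # dealt into 12 nonoverlapping forms (CAP used 10-18 forms); each student
    # answers only one short form, yet the school score covers the whole pool.

    NUM_ITEMS = 360
    NUM_FORMS = 12
    ITEMS = list(range(NUM_ITEMS))

    def build_forms(items, num_forms):
        """Deal the item pool into nonoverlapping forms of equal length."""
        shuffled = items[:]
        random.shuffle(shuffled)
        size = len(shuffled) // num_forms
        return [shuffled[i * size:(i + 1) * size] for i in range(num_forms)]

    def administer(forms, num_students):
        """Each student answers one randomly assigned form; responses are
        simulated here with a fixed 70 percent chance of a correct answer."""
        responses = {item: [] for item in ITEMS}
        for _ in range(num_students):
            form = random.choice(forms)
            for item in form:
                responses[item].append(random.random() < 0.7)
        return responses

    def school_score(responses):
        """Aggregate over all items: the school-level percent correct uses
        every item in the pool, although no student saw more than one form."""
        total = sum(sum(r) for r in responses.values())
        count = sum(len(r) for r in responses.values())
        return 100.0 * total / count

    forms = build_forms(ITEMS, NUM_FORMS)
    responses = administer(forms, num_students=600)
    print(f"Estimated school percent correct: {school_score(responses):.1f}")

The point of the design is visible in the last step: the school-level score is computed over the entire item pool, even though each student answered only one short form, which is what allows broad curricular coverage in roughly 35 minutes of testing time per pupil.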

The new approach to statewide achievement testing, with its focus on the assessment of school-level programs rather than the needs or progress of individual students, relegated testing for other purposes to a variety of district-level testing programs. Thus, local districts assumed full responsibility for standardized achievement testing to satisfy program evaluation requirements, to compare local performance with national norms, and to report student-, class-, and school-level scores to parents and local school boards. Legislation in 1976 and 1977 also made districts responsible for conducting proficiency (minimum competency) testing in reading, writing, and computation, and for developing or selecting appropriate tests to do so. Performance indicators and examples of minimum standards for testing once between grades 7-9 and twice between grades 10-11 were set by the State Board, with minimal course requirements for graduation prescribed by law. Individual districts set their own graduation standards. (Further legislation in 1981 mandated that summer school be required for all students in grades 7 to 12 who failed to meet their district's standards.) District-conducted proficiency testing was also required once between grades 4-6 to identify students in need of remediation.

Legislation in 1975 also mandated an "early exit" proficiency test, the California High School Proficiency Examination (CHSPE). The CHSPE is an optional, four-hour examination that provides the opportunity for students who are 16 years old or second-term sophomores to verify their competency in basic reading, writing, and arithmetic skills. Candidates with passing scores are awarded a Certificate of Proficiency that is equivalent by state law to a high school diploma. Although the State Department of Education is officially responsible for the development and content of the CHSPE, it is administered by a private testing service. The CHSPE is related to CAP, the statewide testing program, only peripherally: normative data on the CAP twelfth-grade test are used as a partial basis for setting and monitoring the passing score (Carlson, 1979).

1979-1982

A number of changes to CAP recommended by the 1977 Assembly Advisory Committee on Statewide Testing became law in 1978. The most significant of the changes ended testing in grade 2 and shifted resources to grade 3 to measure skills in written language, mathematics, and reading, with a heavy emphasis on comprehension. The new Survey of Basic Skills: Grade 3 was developed by staff of the State Department of Education with extensive involvement by advisory committees of content area specialists and by teachers throughout the state. First administered in 1979-80, the new test consists of more than 1,000 items operationalizing objectives found in the statewide curriculum frameworks, state-adopted textbooks, and skill areas commonly taught in California schools. Following a multiple-matrix design, items in each content area were assigned to 30 unique forms, each comprised of 34 items and requiring no more than 35 minutes for a student to complete. A scaled score system based on item response theory was introduced for reporting the results of the new Grade 3 Survey. The new system permitted year-to-year comparisons independent of statewide performance or item changes, and also permitted direct comparisons of performance across content areas without translation into normative scores.
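The scaled-score idea rests on item response theory, which places students and test items on a common scale. The text does not say which IRT model CAP adopted, so the sketch below uses the simplest one, the one-parameter (Rasch) model, with invented item difficulties and an invented response pattern; it illustrates the general technique, not CAP's actual calibration procedure.

    import math

    # Rasch (one-parameter IRT) model: the probability that a student of
    # ability theta answers an item of difficulty b correctly.

    def p_correct(theta, b):
        """Both theta and b are on the same logit scale."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    def estimate_theta(responses, difficulties, steps=50):
        """Maximum-likelihood ability estimate via Newton's method: the
        gradient of the log-likelihood is sum(r - p); its curvature is
        -sum(p * (1 - p))."""
        theta = 0.0
        for _ in range(steps):
            probs = [p_correct(theta, b) for b in difficulties]
            gradient = sum(r - p for r, p in zip(responses, probs))
            curvature = -sum(p * (1 - p) for p in probs)
            theta -= gradient / curvature
        return theta

    difficulties = [-1.5, -0.5, 0.0, 0.8, 1.6]  # invented item difficulties
    responses = [1, 1, 1, 0, 0]                 # one student's right/wrong pattern

    theta = estimate_theta(responses, difficulties)
    print(f"Estimated ability: {theta:+.2f} logits")

Because an estimated ability depends on the difficulties of the items actually taken, two test forms or two years can be compared on the same scale even when the items differ, which is what makes the year-to-year and cross-form comparisons described above possible.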

Beginning in 1980, grade 3 school reports have included scale scores for each of the three content areas and 90 specific skill areas, presented in a program diagnostic format that encourages the use of information on relative strengths and weaknesses for modifying local instructional programs.

CAP staff began developing a new, more demanding Survey of Basic Skills: Grade 6 in 1980, following the same procedures as were followed in constructing the grade 3 test. The new Grade 6 Survey was administered for the first time in 1981-82. Each student completes one of 40 unique matrix forms consisting of 31 questions in 30-35 minutes. The new grade 6 school reports, like the grade 3 reports, provide program diagnostic information indicating relative strengths and weaknesses, as shown by scale scores for the three content areas of reading, written expression, and mathematics, as well as for numerous subskills within each content area.

1983-1986

California's new Superintendent of Public Instruction, Bill Honig, was elected in November 1982 on a reform platform calling for a return to a traditional academic curriculum and to instructional practices, including rigorous testing, that represent "what we know works in education" (Honig, 1985, p. 6). Excellence in education, as envisioned by Mr. Honig, involves preparing all students, both college-bound and non-college-bound, to compete successfully for jobs that require brains rather than brawn, and elevating them intellectually and morally through exposure to a common, irreducible core of knowledge in the arts and sciences. To initiate the long-term process of reform required to operationalize this vision of quality education, the Department of Education requested additional funding from the legislature and proposed a number of statutory changes.

The educational reform measure passed by the California legislature and signed by Governor Deukmejian in 1983 (the Hughes-Hart Educational Reform Act) provided $850 million of the $950 million in the Department of Education's original request, along with a package of 65 reforms, including mandated graduation requirements for all students, a longer school day and year, money for textbooks and summer school, tighter discipline and dismissal procedures, and definition of statewide curriculum standards. To provide for systemwide quality control, the reform measure mandated modification of the existing statewide assessment program to emphasize higher-order academic skills and to assess additional grade levels and content areas. It also established a new end-of-course examination program to measure and reward high-level achievement in critical high school courses.

The changes in statewide testing mandated by Hughes-Hart in 1983 reflect a general policy that standardized tests aligned with statewide curriculum objectives should be used to the greatest possible advantage to achieve the goals of curriculum reform with students of all types. More specific policy goals clarify several separate, but related, ways in which standardized tests are expected to promote curriculum reform.

1) Standardized tests are expected to focus the attention of educators and policy makers at all levels on the knowledge, skills, concepts, and processes which are essential for success in the more demanding high-tech job market of the future, for responsible citizenship, and for personal fulfillment. The core of content and skills to be spotlighted represents a rigorous curriculum in the humanities, natural sciences, and math, and emphasizes higher-order skills such as those required to analyze complex relationships, draw inferences, and reason deductively. Although it is assumed that in practice the scope and pace of the curriculum will reflect differences in aptitude and intelligence (Honig, p. 202), it is also assumed that the majority of students are not working up to their potential, and that it is the responsibility of the schools to challenge them to do so, both for their own good and for the good of the society.

2) Scores on standardized tests (along with indices of performance such as enrollment in selected academic courses, the amount of homework completed on a nightly basis, and the frequency of writing assignments) provide baselines against which schools are encouraged to set targets for improvement and to compete with themselves and with other schools serving similar populations, thus ratcheting the whole system upward over time toward the goal of academic excellence (Honig, 1985, p. 124).

3) By helping to clarify a sense of common purpose, by focusing attention on the challenging academic objectives of the reform movement, by raising expectations, and by providing feedback on improvements in achievement, standardized tests are expected to contribute, along with the curriculum they represent, more interesting and challenging textbooks, and other key components of the reform package, to rekindling a sense of excitement and enthusiasm for learning in teachers and students alike.

4) Standardized testing is expected to provide measures of accountability that are essential to gaining and maintaining cooperation and support for the educational reform movement from parents, educators, policy makers, the business community, and other important segments of the public. Evidence of continuing improvements in student performance is expected to sustain enthusiasm over the anticipated 5-10 year period needed to fully implement the goals of curriculum reform.

Unlike the testing reforms that have been instituted in other states in the past several years, the revisions, expansions, and additions to California's statewide testing program do not include an emphasis on minimum competency testing. On the contrary, the recent changes in statewide testing indicate a commitment to go beyond narrowly-focused tests of basic skills or minimum competencies to instruments that will truly embody the objectives of a challenging academic curriculum, measuring the full range of higher-order academic skills and using testing approaches other than the traditional multiple-choice format wherever possible. Consistent with the legislative mandate, statewide testing has been expanded to focus instruction on the most important objectives of the reform movement and to provide accountability to the public for a more rigorous instructional program. One major component of the expansion involves additions to the California Assessment Program.

CAP has added to its survey series since 1983 by developing the Survey of Academic Skills: Grade 8, first administered in 1983-84. A matrixed test of 36 70-item forms, the grade 8 test consists of reading questions based on passages from literature, science, and social science, emphasizing higher-level comprehension; questions on written expression based on student essays related to the reading passages; mathematics questions assessing computational abilities, problem solving, prealgebra, and pregeometry skills; history-social science questions emphasizing critical thinking skills as well as content knowledge; and science questions requiring knowledge of process as well as content. Tests of history-social science and science will also be developed to supplement the existing CAP surveys of reading, written expression, and mathematics at grade 6 and other grade levels as the legislature makes funds available. Other anticipated additions to the statewide testing program include a Grade 10 Survey with grade-appropriate content paralleling that of the new grade 8 test (not yet funded by the legislature), and a direct (essay) assessment of writing skills, now in its second year of development and scheduled to be added to the Grade 8 Survey in 1987 and to the grade 12 and grade 6 tests in subsequent years.

Current efforts to upgrade the California Assessment Program's survey series also focus on the development of a completely new, expanded, and more demanding grade 12 test to replace the instrument that has been in use since 1974. The new Survey of Academic Skills: Grade 12 will be a multiple-matrix test with content in reading, written expression, mathematics, history-social science, and science. The items will assess important higher-level thinking skills and competencies identified in each of these subject areas by the Model Curriculum Standards: Grades Nine Through Twelve, adopted by the State Board of Education in 1985. The new grade 12 test is scheduled for partial implementation (three content areas) in 1987-88 and full implementation (including tests of history-social science and science and a written essay) in 1988-89. The CAP surveys for grades 3, 6, and 8 will be reviewed for consistency with statewide curriculum objectives and revised as needed after the Model Curriculum Guides for kindergarten through grade 8 are completed in 1986-87.

The Golden State Examination Program (GSEP) is a second major component of the plan for expanding statewide testing to focus instruction on the curriculum objectives of the educational reform movement. Golden State Exams will be developed to measure achievement in 17 academic subjects under statewide standards of competency and to identify students qualifying for a special honor designation on their high school diplomas. Students will be tested on a voluntary basis upon completion of courses in mathematics, laboratory sciences, United States history, English literature and composition, foreign languages, and health sciences. The first two GSEP exams, in beginning algebra and geometry, will be field tested in 1985-86 and fully implemented in 1986-87. GSEP exams in United States history and biology are now in the initial stages of development. The full series of tests will be developed and operationalized as funds are available.

A third component of the plan for modifying statewide testing to better meet California's educational objectives involves development of a comprehensive assessment system that will provide student-level scores to meet proficiency requirements and specialized local needs, as well as provide the school-, district-, and state-level results needed for program evaluation by CAP. The proposed system would consolidate CAP's statewide testing program with district testing programs in order to reduce the overall costs of testing, reduce the amount of instructional time devoted to testing, and ensure that testing is focused on the priorities of California's curriculum. Preliminary work has been completed, but full development of the system will require further legislative initiative.

Use and Impact of Statewide Testing

The statewide testing program, as required by the legislation that established CAP in its present form in 1973, provides group-level information to school districts, to the legislature, and to the public, to be used in each of three major ways:

1) to evaluate the effectiveness of school programs, 2) to allocate resources to schools with the greatest educational needs, and 3) to identify successful practices. This is done annually through a series of reports, including school-level and composite district-level reports, a four-year school and district summary, and an annual report of statewide results.

In practice, CAP data are used by school personnel, the legislature and State Department of Education staff, and the public in a great variety of ways. The following are examples of some of the most common uses by each of these audiences:

1) Educators in districts and schools typically use CAP data to evaluate strengths and weaknesses in particular content and skill areas, at specific grade levels, in particular subgroups of students, and in particular schools. Trends across years, trends across grades, and comparisons with statewide performance and with the performance of other schools serving similar student populations are also frequently emphasized.

2) Results of a survey of more than 4,600 elementary principals in 1979 indicate that most of them were using CAP results to examine curricula more closely, to develop instructional strategies to correct problem areas, to call attention to problem areas not previously noted, and/or to develop or focus teacher in-service activities. The changes principals most frequently related to CAP results include modifications in the goals and objectives of instructional programs, articulation of curriculum and teaching activities within and across grade levels, modifications in the amount of time devoted to teaching various skills, and development of new instructional materials (California, 1980). Local educators also frequently use CAP data to document the need for special funds or for participation in special projects. Recent comments by local and district administrators, both in the press and in conversation with CAP staff, indicate that they continue to use CAP data in all of the ways documented by the 1979 survey.

Legislators and State Department of Education staff typically use CAP data to evaluate instructional programs and practices by examining yearly achievement in major content areas and by making comparisons of trends across content areas, across grades, across years, and across subgroups of students (classified by gender, mobility level, English language fluency, socioeconomic level, and ethnicity, as well as by supplementary information on reading outside of school, homework assignments, writing assignments, TV exposure, etc.). Statewide results are also compared with national performance, based on studies equating CAP tests to various nationally standardized tests as well as to NAEP.
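Score equating of this kind can be done in several ways, and the text does not say which method the California equating studies used. The sketch below shows one of the simplest, linear (mean-sigma) equating, with invented score samples: it matches the mean and standard deviation of the two score distributions and maps one scale onto the other.

    import statistics

    # A generic linear-equating sketch (mean-sigma method). The score
    # samples below are invented for illustration only.

    def linear_equate(scores_x, scores_y):
        """Return a function mapping form-X scores onto the form-Y scale by
        matching the means and standard deviations of the two distributions."""
        mu_x, sd_x = statistics.mean(scores_x), statistics.stdev(scores_x)
        mu_y, sd_y = statistics.mean(scores_y), statistics.stdev(scores_y)
        return lambda x: sd_y / sd_x * (x - mu_x) + mu_y

    cap_sample = [52, 61, 58, 70, 66, 49, 75, 63]       # invented CAP scores
    national_sample = [48, 55, 53, 64, 60, 44, 69, 57]  # invented norm scores

    to_national = linear_equate(cap_sample, national_sample)
    print(f"A CAP score of 65 maps to about {to_national(65):.1f} "
          f"on the national scale")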

Statewide CAP scores indicating curriculum weaknesses have prompted intervention at the state level. For example, the relative weakness in computational skills apparent in statewide CAP results in the late 1970s led to revisions of state Curriculum Frameworks and to the adoption of new, more balanced textbooks. More recently, a decline in eighth grade CAP scores in 1985 (as well as the students' below-average standing relative to national norms) has led to the formation of a Middle Grade Task Force composed of students, parents, educators, and representatives of business and industry. The Task Force, formed in January 1986, will hold hearings throughout the state to address issues including students' maturation patterns, teacher credentialing, grade level configurations, and effective teaching strategies, in order to develop a plan for improving the quality of middle grade education in California.

3) Legislators and staff of the State Department of Education also typically use CAP data to evaluate the impact of special state and federal programs, to document need and allocate resources, to study funding models and effective schools, and to identify promising practices. Recent examples include: CAP scores in reading and mathematics (1979-1984) used as indicators of program effectiveness in comparing elementary and secondary school participants and nonparticipants in the School Improvement Program (California, 1985); CAP achievement scores used to identify exemplary schools (California, 1977; Fetler & Carlson, 1985); CAP twelfth grade data used to identify low-performing high schools and their characteristics as a basis for proposing further legislation to assist such schools (California, 1984); and year-to-year improvements in CAP twelfth grade scores used to determine cash rewards to schools under the Education Improvement Incentive Program begun in 1984.

4) Since CAP data at the school, district, and statewide levels, and comparisons of state results with national norms, are widely reported in the press, they are major contributors to the general public's evaluation of California's schools. Parents typically use such data to make comparisons between schools and districts, and realtors typically use them to argue the merits of investment in areas with high assessment scores (Powell, 1981).

Consistent with the policies of California's educational reform movement and the accountability plan instituted in early 1984, standardized test data have been given greater influence in the past several years. In addition to the detailed information on achievement scores in CAP's annual school, district, and statewide reports, CAP scores in reading and mathematics are now also reported at all levels of the school performance report first issued by the Department of Education in fall 1984. The high school performance report includes CAP scores as well as information on students' SAT scores, College Board Advanced Placement examination scores of 3 or above, and College Board achievement test scores on selected examinations.

These test data, along with other statewide performance indicators, are now being used to recommend California schools for the Federal School Recognition Program. They will also serve as the primary basis for selecting schools for the new California School Recognition Program, the next phase of the accountability program to be implemented.

California's Education Improvement Incentive Program (EIIP) has also increased the emphasis on standardized test data in the past several years by offering a cash incentive for improvement in achievement on the CAP twelfth grade test. Enacted as a part of the Hughes-Hart educational reform bill in 1983, EIIP is not a part of the Department of Education's accountability program. Nonetheless, by distributing awards of over $14.6 million to more than half of the high schools in California, EIIP has focused a great deal of attention on statewide testing at grade 12. New legislation has recently been introduced to extend the incentive program to the sixth grade level.

Summary

It would be premature to attempt to assess the impact of the changes in statewide testing mandated by California's 1983 educational reform legislation at this point. Major test development efforts are underway on the new grade 12 test, the direct assessment of writing skills, and the Golden State end-of-course examination program (see above), but the first of these new assessment instruments will not be implemented until 1986-87, and the full set of Golden State Examinations may not be finalized for a number of years. Parts of the grade 8 test, the first of the new tests to be completed, have been in place since 1984, but the science component will be added for the first time in spring 1986. In California, as in the other states that are now beginning to implement educational reform, the appropriate time to look for improvements in achievement attributable to expanded testing programs and to the variety of other reform measures instituted concomitantly is still a year or two down the road (Kirst, 1985).

In the meantime, California's state testing program is contributing to the goals of the educational reform movement by focusing attention on statewide curriculum objectives, by providing a basis for schools to set targets and better their performance from year to year, and by providing accountability to the public. The California Assessment Program is, by design, well suited to perform these roles and has been doing so for a number of years by reporting broad and comprehensive program diagnostic information to educators at all levels, to the legislature, and to the public. Publicity surrounding the educational reform movement in general, the new statewide curriculum standards, the accountability program with its performance reports, the new tests being developed, and the Educational Improvement Incentive Program have all heightened awareness of the existing testing program. Evidence provided by newspaper reports throughout the state, orders for rationale and content documents for the CAP tests, and attendance at workshops held to introduce the new grade 8 tests and to assist teachers in using program diagnostic data to evaluate strengths and weaknesses in their instructional programs indicates that educators are seriously concerned about their performance on the CAP tests. One consequence of this concern is that districts are taking steps to incorporate higher-level thinking skills and other competencies identified by the statewide curriculum standards in their local programs.

References

California Legislature. (1984). Overcoming the odds: Making high schools work. Sacramento: Assembly Office of Research.

California State Department of Education. (1977). California school effectiveness study, the first year: 1974-75. Sacramento: California State Department of Education.

California State Department of Education. (1980). Student achievement in California schools: 1979-80 annual report. Sacramento: California Assessment Program, California State Department of Education.

California State Department of Education. (1985). Report of consolidated application programs 1984-85. Sacramento: Program Evaluation and Research Division, California State Department of Education.

Carlson, D.C. (1979). Statewide assessment in California. Studies in Educational Evaluation, 5, 55-75.

Fetler, M.E., & Carlson, D.C. (1985). Identification of exemplary schools on a large scale. In Research on exemplary schools (pp. 83-96). Academic Press.

Honig, B. (1985). Last chance for our children: How you can help save our schools. Reading, Mass.: Addison-Wesley.

Joint Interim Committee. (1961). Report of the Joint Interim Committee on the public education system. Sacramento: Senate of the State of California.

Kirst, M.W. (1985). Sustaining state education reform momentum: The linkage between assessment and financial support (Policy Paper No. 85-C3). Stanford, CA: Institute for Research on Educational Finance and Governance.

Powell, M. (1981). Uses of state assessment information. In D.C. Carlson (Ed.), Testing in the states: Beyond accountability (pp. 13-29). San Francisco: Jossey-Bass.

A BRIEF HISTORY OF STATE TESTING POLICIES IN COLORADO

Wayne Martin

Prepared Under Contract For The Office of Technology Assessment
U.S. Congress

A Brief History of State Testing Policies in Colorado

BACKGROUND

To better understand Colorado's policies toward state testing, some general background information about Colorado's public education system is needed. Colorado is a strong local control state. This is especially true in the area of education. For example, Colorado has no state curriculum or curricular objectives; the 176 local school boards each determine the curriculum to be used in their individual school districts. The concept of local control has generally had support from the public, local district staff and school board members, the Colorado General Assembly, the Colorado State Board of Education, the Commissioner of Education, and the Department of Education. The need for local control is also supported by the diversity that exists within the state: the majority of Colorado school districts are located in rural mountainous or agricultural settings, while the majority of students (78%) attend urban or suburban school districts. The imposition of strong state control in the area of education appears to be neither practical nor desirable in Colorado.

The State of Colorado guarantees that each school district will receive a certain amount of funds to educate its students. This is accomplished through the annual establishment of an Authorized Revenue Base (ARB) by the state legislature. The ARB is the dollar amount per pupil that represents the district's level of support for equalization purposes. The minimum ARB for 1985 was $2,550, triple the ARB for 1975. The revenue for the allowed ARB is generated through a shared formula using local school district property taxes and the state general fund. The shared formula includes a guaranteed tax base method (i.e., every mill of tax is guaranteed to raise a set amount of revenue per pupil) to ensure equalization. Between 1975 and 1985, the guaranteed tax base increased from $27 to $63.41 per pupil. The state share of the ARB has changed relatively little between 1975 and 1985; the state general fund provides approximately half of the ARB each year.
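A small worked example may make the guaranteed tax base arithmetic concrete. The $63.41-per-mill guarantee and the $2,550 minimum ARB are the 1985 figures given above; the two districts, their assessed valuations, and the 40-mill levy are invented for illustration, and the sketch ignores the other details of Colorado's actual formula.

    # Hypothetical illustration of the guaranteed tax base method described
    # above. Only the per-mill guarantee and the minimum ARB come from the
    # text; the districts below are invented.

    GUARANTEE_PER_MILL = 63.41  # revenue per pupil guaranteed for each mill levied
    MIN_ARB = 2550.00           # minimum authorized revenue base per pupil, 1985

    def state_share(assessed_value_per_pupil, mills):
        """Local yield is what the district's own tax base raises; the state
        general fund makes up the difference up to the guaranteed yield."""
        local_yield = assessed_value_per_pupil * mills / 1000.0  # 1 mill = $1 per $1,000
        guaranteed_yield = GUARANTEE_PER_MILL * mills
        return max(guaranteed_yield - local_yield, 0.0), local_yield

    # A property-poor and a property-rich district levying the same 40 mills:
    for name, wealth in [("poor district", 30000), ("rich district", 70000)]:
        state, local = state_share(wealth, 40)
        print(f"{name}: local ${local:,.2f} + state ${state:,.2f} "
              f"= ${local + state:,.2f} per pupil (minimum ARB ${MIN_ARB:,.2f})")

In this sketch, the property-poor district's 40 mills are topped up by the state general fund to the guaranteed yield, while the richer district's own tax base already exceeds the guarantee; that gap-filling is the equalization the formula is designed to produce.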


Governor Richard D. Lamm was a strong proponent of educational reform, serving on several different national task forces dealing with public education. Governor Lamm also worked with a legislature controlled by the opposition party from his initial election in 1974. Beginning with the 1985 legislative session, the Governor was faced with a veto-proof Colorado General Assembly. In November 1986, State Treasurer Roy Romer was elected to succeed Governor Lamm. During the campaign and since taking office, Governor Romer has stressed the importance of education (elementary, secondary, and postsecondary) in building for Colorado's economic future. Like Lamm, he must work with a veto-proof legislature controlled by the opposition party. It is against this background that the past and current state testing policies must be considered.

COLORADO POLICIES, 1970-1985

During this time period, there were no mandatory state testing programs. Given the general support for local control of schools, other alternatives were pursued by the Colorado General Assembly. The first alternative was the Educational Accountability Act of 1971. This represented Colorado's response to the assessment/testing programs being set up by other states during the early 1970s to institute accountability measures. The Educational Accountability Act of 1971 established the State Accountability Committee, which is an advisory body for the State Board of Education, and mandated the creation of local accountability committees within each school district. The purposes of the legislation are as follows:

22-7-102. Legislative declaration. (1) The general assembly declares that the purpose of this article is to institute an accountability program to define and measure quality in education and thus to help the public schools of Colorado to achieve such quality and to expand the life opportunities and options of the students of this state; further, the purpose is to provide to local school boards assistance in helping their school patrons to determine the relative value of their school program as compared to its cost.


(2) The general assembly further declares that the educational accountability program developed under this article should be designed to measure objectively the adequacy and efficiency of the educational programs offered by the public schools. The program should begin by developing broad goals and specific performance objectives for the educational process and by identifying the activities of schools which can advance students toward these goals and objectives. The program should then develop a means for evaluating the achievements and performance of students. (Colorado Revised Statutes, 1985)

The Educational Accountability Act of 1971 is still in effect within Colorado. The Colorado State Board of Education has adopted rules and regulations to implement the law, and Colorado Department of Education staff verify that local districts are in compliance with the rules and regulations. Approximately one-third of the districts are reviewed each year for accountability and accreditation purposes.

During the mid-1970s, states across the country began to mandate minimum competency or proficiency testing programs through either legislative or state board of education action. The general purpose of such programs was to verify that all students possessed a certain core of skills and abilities before leaving the public education system. Because Colorado does not have a state curriculum or state curricular objectives, the Colorado General Assembly passed the following legislation, revising the duties of local boards of education, in 1975 to address the question of competency or proficiency testing.

22-32-109.5. Board of education - specific duties - testing requirements. (1) In carrying out its duties under section 22-32-109 (1) (t) in determining educational programs, if a board of education imposes any special proficiency test for graduation from the twelfth grade beyond the regular requirements for satisfactory completion of the courses and hours prescribed for graduation, the results of such tests shall be used by school districts to design regular or special classes to meet the needs of all children as indicated by overall test results. If a board determines to impose such a proficiency test, such test shall be given at least twice during each school year, and initial testing shall take place in the ninth grade.


(2) Any child who does not satisfactorily fulfill the requirements of a special proficiency test imposed under the provisions of subsection (1) of this section shall be provided with remedial or tutorial services during the school day in the subject area in which the test indicates deficiencies for graduation purposes. Such child shall be provided with these services from the time of initial testing until such time as the results of the special proficiency test are satisfactory. Parents of children not satisfactorily fulfilling the requirements of a special proficiency test shall be provided with all special proficiency test scores for their child, a minimum of once each semester. (Colorado Revised Statutes, 1985)

This provision for proficiency testing is still in effect within the State of Colorado. The Denver Public School System has been the principal user of this legislation, though the school system has announced publicly its intention to move away from the use of proficiency testing for graduation purposes.

Nearly all Colorado school districts test students with a standardized achievement test battery during any given school year. Because of the requirement for the Commissioner of Education to report annually on the status of K-12 public education, the Colorado Department of Education has required school districts to report reading and mathematics scores from their standardized achievement testing programs. The purpose of collecting the information was to be able to report on the achievement of Colorado students. Unfortunately, the information has had limited utility because of the problems associated with aggregating the data. Because the districts use different test batteries, different forms of the same test battery, test different grades at different times of the year, and use different reporting metrics, the Department of Education has not been able to report more than the percentage of districts at, above, or below the expected test norm in reading and mathematics for elementary and secondary students.


LEGISLATIVE ACTIVITIES, 1985

Between the 1984 and 1985 legislative sessions, the Interim Committee on School Finance met to deal with a variety of issues facing public education as it entered the mid-1980s. Though the state's share of the ARB had remained relatively stable (approximately 50%) over the past ten years, the dollar amount continued to increase and accounted for more and more of the state general fund. Members of the Interim Committee began to raise questions about the quality of the public education offered in Colorado as they struggled with the issues of financing elementary, secondary, and higher education. Also, the recent national reports on public education and the need for reform, such as A Nation At Risk, had raised a healthy skepticism among the public and the legislature about the current status of education. There was general agreement among the members of the Interim Committee that some statewide testing was needed.

During the 1985 legislative session, two major testing bills were introduced by House members. The first bill called for testing all public school students in grades 3, 6, and 9 using a standardized achievement test battery to be selected by the State Board of Education. In effect, the bill would have established an ongoing Colorado testing program, with the Colorado State Board of Education having the option of annually selecting the standardized achievement test battery to be used to carry out the testing. The second bill called for all 12th grade students to pass a proficiency test covering, but not limited to, reading, language arts, and mathematics as a graduation requirement. This bill would have established a Colorado minimum competency testing program. Both bills generated a great deal of debate statewide and at the statehouse.

The testing program bill was generally opposed by the local education community. The principal arguments offered against the bill were as follows. Districts already test students using standardized achievement test batteries to gauge accomplishment of curricular goals and to improve instruction. The test batteries selected at the district level are considered to be the best measures of the curriculum taught.


The addition of a state program would result in a loss of instructional time for students. The state program might or might not measure what is being taught by the district, and would probably have limited utility at the district or teacher levels. The cost of a state program would be large and would represent a waste of limited resources. The ultimate arguments were that the imposition of a state testing program would result in a loss of local control, that the content of the achievement test battery would begin to dictate curriculum at the local level, and that a state testing program would lead to the establishment of a state curriculum.

Although concerned about the potential loss of local control and the specter of a state curriculum, the Colorado PTA was further concerned about whether a state testing program could be made meaningful for students and parents. An amendment was passed requiring that the results be reported to the student and his/her parents. Its main concern addressed, the Colorado PTA assumed a position of limited support for the testing program.

The main questions asked by local educators included what was the purpose of such a program and how the results would be used by the legislature. There was great concern that the results would be used to compare individual districts, buildings, or classrooms. There was also concern that the test results would somehow be used to adjust state support of individual school districts. The responses from the House Education Committee were that a statewide profile of student achievement was very desirable and that the results could possibly be used to support special funding of categorical education programs.

The 12th grade proficiency testing bill produced a great deal of emotion. There was general agreement by all segments of the education community, with support from business and industry spokespersons, that no student should leave school without a minimal core of skills. Strong supporters of the bill gave impassioned pleas that schools not be allowed to graduate students who lack the skills needed to become a productive member of society.


This appeal was based on both the subsequent effects upon the individual and the cost to society of supporting such individuals. Supporters also demanded that remediation be provided to all students who did not pass the test. The education community argued that attempting to provide remediation in 12th grade might be too late, while expressing the fear that a testing program based on minimums might have the effect of lowering standards and expectations for all students. Concern about how such a program might establish a state curriculum also arose. The most effective argument offered against the bill was that it might end up penalizing the very students it was attempting to help and could result in encouraging such students to become dropouts.

After public testimony was accepted on the proficiency testing bill, the bill was amended by the House Education Committee. The amended bill required that all 11th grade students take a proficiency test. The results of the test were to become part of the student's permanent record; the results were not to be used as a graduation requirement.

The Colorado State Board of Education expressed its support for the establishment of a statewide testing program, though the Board wished to see the testing program bill expanded to include students in grade 11. The Board generally felt that the information gained from statewide testing would be useful as it established its priorities for the work of the department. The State Board did not support the proficiency testing bill. After that bill was amended, the Board expressed its desire to see the bill broadened to test achievement rather than proficiency for students in grade 11. The State Board of Education also was very concerned that a proficiency test would allow minimums to become the goal for high school students. The Commissioner of Education presented the Board's position to the House Education Committee. Department staff provided technical information to the Committee on the bills, possible amendments and/or alternatives, and the potential costs of implementing proposed programs.


The Commissioner also supported a third testing bill, which was introduced in the Senate by the Chairman of the Senate Education Committee. This bill would have allowed the department to establish a statewide testing program without having the exact design mandated. The design of the program would have been based upon input from the education community, with final approval of the design resting with the State Board of Education. Unfortunately, this was part of a larger bill which was aimed at reform of Colorado school finance. The General Assembly chose not to deal with the issue of financing education during its 1985 session.

Both of the House testing bills were passed by the House Education Committee and were forwarded to the Appropriations Committee after brief hearings by the Senate Education Committee. Colorado state law prohibits deficit spending by the state, and the General Assembly did not want to undertake any revenue-raising programs during the 1985 session. As a result, the testing program bill did not leave the Appropriations Committee because of the large amount of new funding it would require. The proficiency test bill did leave the Appropriations Committee with a provision to conduct a feasibility study of the program for $20,000; it was later defeated on the floor of the legislature. Although there was general grumbling and skepticism about the status of Colorado education, the General Assembly chose not to fund the testing bills or other education bills during the 1985 session.

At this point, the Colorado education community proposed to the legislature that it fund pilot programs in student testing and other education areas of expressed concern by transferring $2 million of the state's support of local school districts to the Department of Education for the next two years. The intent of the coalition group, which included the Colorado Association of School Boards, the Colorado Association of School Executives, the Colorado Education Association, the Colorado Federation of Teachers, the Colorado Council of Deans of Education, the Colorado Parent and Teacher Association, the Colorado State Board of Education, and the Colorado Department of Education, was to demonstrate that it could address a number of important education issues in this manner.


The "2 + 2" concept, as it quickly became known, was endorsed by the Colorado Association for Commerce and Industry and the Office of the Governor. The Chairman of the House Education Committee accepted the challenge of the education community and introduced House Bill 1383. Co-sponsored by the Speaker of the House, the President of the Senate, the Chairman of the Senate Education Committee, and other key legislators in the General Assembly, the bill transferred $2 million to the Department of Education for the next two years and required the department to conduct pilot programs in the following areas: student testing, dropout reduction, education of gifted and talented students, training of education staff evaluators, and teacher and administrator quality and training. Percentages of the $2 million were allocated to the areas in the bill, with student testing being allocated $500,000 per year. House Bill 1383 was passed by the Colorado General Assembly in May. It has since become known as the Educational Quality Act of 1985.

COLORADO POLICIES, 1986 and 1987

The Educational Quality Act of 1985 specified that during the first year of student testing (1986) all public school students in grades 3, 6, 9, and 11 be tested with a standardized achievement test battery. This design reflects the two major testing bills introduced in the House and the State Board of Education's preferred testing program. At its December meeting, the Colorado State Board of Education selected the Iowa Tests of Basic Skills/Tests of Achievement and Proficiency, Form G, as the test battery to be used. The State Board also required that a complete test battery (including social studies and science) be administered to students. Because it is a pilot program, the Board decided to lease rather than purchase the test booklets. All students in the specified four grades were tested in April 1986.


Student and classroom results were returned to local school districts before the end of the school year. To allow for further analysis, the state and individual district results were not released until mid-July.

At the state level, results were reported in terms of national percentile ranks for pupils at each grade for the state as a whole and by sex, race/ethnicity, district size, and district setting. The goal was to profile the achievement of the average Colorado student or groups of students for the different learning areas measured by the test battery. Composite scores, based on student achievement across the various learning areas, were not used. Though the reporting was based on the national percentile ranks for the average scores of students, emphasis also was placed on the percent of students with achievement in the upper and lower quartiles and the top and bottom deciles.

Because of Colorado's Open Records Law, the achievement scores for individual school districts had to be made available to anyone requesting them. To provide a better context for understanding the individual district scores, district profiles also were prepared. The profile identified the district's size and setting categorization and presented current district information, plus the state average, for variables such as fall membership (in terms of racial/ethnic groups) for the four grades tested, dropout rate, number of graduates, pupil-teacher ratio, average teacher salary, average years of teaching experience for teachers, total district revenue per pupil, and total district expenditure per pupil. The profile also included information from the 1980 census pertaining to the district, such as per capita income, median income, family income, household and educational attainment characteristics, and poverty status.

The design of the second year of student testing (1987) was left open in the legislation. The goal for the second year of the program was to look at a number of alternative testing models based upon input from the education community. It was reflective of the testing bill introduced in the Senate. To maximize the number of alternative measures examined, it was decided that samples, rather than every student, would be tested.


In November 1986, the readiness skills of nearly 11,000 Colorado grade 1 students (approximately 25%) were tested with the Early Primary Battery of the Iowa Tests of Basic Skills, Form G. The purpose of this effort was to describe the skills and abilities of students as they begin Colorado's public school system. Kindergarten is not mandatory in Colorado, though every school district offers a free kindergarten program. When the results were released in February, the national percentile rank of the average student score for the different learning areas tested was reported, as well as the percent of students in the upper and lower quartiles and the top and bottom deciles. In addition to the standard reporting variables (state as a whole, sex, race/ethnicity, district size, and district setting), prior school experience (no prior schooling, kindergarten only, or preschool and kindergarten) was also used as a reporting variable.

In March 1987, a five percent sample of Colorado public school students in grades 3, 6, 9, and 11 (approximately 2,000-2,500 students per grade) participated in a writing assessment based on the National Assessment of Educational Progress (NAEP) model. Students in grades 3 and 6 were asked to respond to a narrative writing topic; students in grades 9 and 11 were asked to respond to an expository writing task. Because grade 6 is considered to be a pivotal point in writing instruction, the expository writing task was also administered to the grade 6 student sample. Following the NAEP model, student papers are being professionally scored in terms of the primary trait; secondary traits were also developed for use with the Colorado papers. Results will be reported in summer 1987.

During April 1987, a five percent sample of Colorado public school students in grades 3, 6, 9, and 11 (approximately 2,000-2,500 students per grade) participated in an ability-and-achievement testing program. To provide continuous data from the previous year, the Iowa Tests of Basic Skills/Tests of Achievement and Proficiency, Form G, and its companion ability test, the Cognitive Abilities Test, Form 4, were administered to all students participating in the sample.


In addition to demonstrating a different testing model by adding the ability test, this program is designed to show the type of data that would result from a yearly statewide administration of a standardized achievement test battery and to compare results from testing a sample of students (by applying the 1987 sample of schools to the 1986 data) to testing every student (the 1986 data). Results will be reported in summer 1987.

The health-related physical fitness of a five percent sample of students in grades 1, 3, 6, and 8 will be surveyed in October 1987 as a part of the pilot testing program. Originally scheduled for May 1987, revisions in the planned measures and the late point in the school year necessitated delaying this survey until fall.

The purposes for both years of student testing have been to provide a number of state portraits of student achievement and to provide results that are as useful as possible to local school districts. At this point, exactly how the test results are used by the local school districts and the Colorado General Assembly is only partially known. A number of school districts have used the 1986 achievement results to re-examine their curricular approaches. The Colorado General Assembly found some assurance from the first statewide achievement test results as it struggles with the budget and school finance issues during its 1987 session. The readiness test results were used in consideration and support of a bill dealing with funding for early childhood education. The legislature also has indicated support for continuing student testing on a pilot basis for a third year if the state's budget problems can be resolved. The State Board of Education has used the results in preparing its priorities. The achievement results were also used for a special study of school district efficiency and effectiveness conducted by a State Board-appointed committee. Indeed, the Efficiency and Effectiveness Committee recommended to the State Board that the every-student, every-district achievement testing program be conducted at least every other year. The Department of Education has used the results to identify areas where it can best provide technical assistance to local school districts.


The Colorado education community, as reflected by the coalition group responsible for the 2 + 2 concept, will also use the results to recommend to the Colorado General Assembly what type of ongoing student testing program (if any) will best serve the State of Colorado.


ACHIEVEMENT TESTING IN FLORIDA

Thomas E. Fisher
Florida Department of Education
Tallahassee, Florida

January 1986

Prepared Under Contract For
The Office of Technology Assessment
Congress of the United States


Introduction

In 1973, the Governor's Citizens Committee on Education issued a report on needed improvements in Florida's public education system. The report, entitled Improving Education in Florida (1973), contained several recommendations addressing the need for accurate information on students' achievement. The Committee believed that a quality educational system could be implemented only if student achievement was closely monitored. In the Committee's words, "Florida educational policy decisions should be based on research, not merely on tradition."

Since then, the Florida Legislature has moved with considerable speed to create an educational accountability program which uses student achievement tests as one of its cornerstones. The Florida testing program has been documented previously by Fisher (1978), Turlington (1979), and Pinkney and Fisher (1978). Briefly, the Florida approach to student achievement testing as authorized by the 1976 Educational Accountability Act (Chapter 76-223, Laws of Florida) depends upon measuring student mastery of certain high priority learner objectives at grades three, five, eight, and ten. School, district, and state summary reports reveal how many students have attained the objectives. For high school graduation purposes, students must pass a state minimum competency test. Unless the test is passed, the student cannot be given a regular diploma from a public high school. The acceptability of this policy has been demonstrated repeatedly in both the public and legal arenas. The Debra P. v. Turlington case challenged the use of the graduation test, but, when the last appeal was decided, the State was permitted to continue the requirement.

The Florida Legislature has been the most visible force behind the testing program in Florida. Individual legislators can be identified who were enthusiastic supporters of the concept and who worked diligently to convince their fellow legislators to vote for the proposed laws.


Implementation of the program was the responsibility of the Commissioner of Education, who was unswerving in his commitment despite legal challenges and attempts to delay it. The State Board of Education also was supportive of the testing program and worked with the Department of Education and the Commissioner to adopt rules which were necessary for implementation. In 1981, the Board exerted its own initiative in passing a resolution calling for Florida's educational system to be of no less quality than that of the upper one-fourth of the states. This "upper quartile goal," as it became known, led to the creation of a set of indicators to be used in determining the progress being made toward the upper quartile. The indicators, of course, included test scores.

Generally, the testing and accountability laws in Florida have been enacted because the citizens demanded them. Citizens believed students needed clear statements of expectations and believed the schools were promoting students who lacked even the most rudimentary educational skills. Educators did not initiate the movement toward increased accountability; however, since the laws have been enacted, they have become supportive of the requirements and have cooperated in successfully implementing them.

Florida continues to expand its testing and accountability programs, with improvements and additional requirements being enacted by almost each session of the legislature. The requirements have the effect of strengthening the state database and providing greater consistency in academic requirements.


The 1983 Educational Reform Act

In 1983, the Florida legislature enacted a series of laws collectively known as the Educational Reform Act (Chapter 83-327, Laws of Florida). The Act requires the State Board of Education to adopt minimum student performance standards in science and computer literacy in addition to those previously authorized in reading, mathematics, and writing. Further, the Board is authorized to adopt student standards of excellence. These standards are intended to set goals for the very capable students.

In regard to the first of these two new requirements, the Department of Education convened working panels of district educators to draft the proposed minimum student performance standards in science and computer literacy. The draft standards were reviewed by all of the school districts. After revisions were made, the State Board of Education considered the standards and adopted them. The Department recently issued a Request for Proposals for the development of the test specifications which will guide the work of future test development contractors. School districts and universities were encouraged to submit proposals for the specification development project, as the Department believes that the tests should be developed with the close involvement of local district educators. After the specifications have been developed and reviewed by all school districts, the test items will be constructed. The Department anticipates that the assessment of student skills in these subject areas will begin in about two years.

In regard to the standards of excellence, the Department proceeded in a similar manner. Panels were convened, the standards were reviewed, and revisions were made prior to consideration by the Board. The Department engaged the Dade County School Board to develop the test item specifications and test item pools. The assessment of standards of excellence will probably be done on a sampling basis, with the data used for instructional planning rather than for determining individual student progress in school.


The 1984 FACET Act

In 1984, the Florida legislature passed the Omnibus Education Act (Chapter 84-336, Laws of Florida), which again strengthened and broadened the testing programs. These provisions collectively are known as the Florida Accountability in Curriculum, Educational Instructional Materials, and Testing Act (FACET) of 1984. The stated purpose of the law is to "enhance quality education and upgrade student achievement [through] a coordinated effort... to ensure that the diverse needs of our public school students are met with the best available instructional materials and assessment instruments and procedures." It is clear that the legislature intends for testing and instruction to be closely linked.

The FACET Act strengthens previous language in the 1976 Accountability Act specifying that the testing programs will include comparisons between Florida and the nation. Interest in these comparisons dates back to the work of the Governor's Citizens Committee report, previously cited, which mentioned the need to include elements of the National Assessment of Educational Progress in the statewide assessment. Legislators believed that state learner objectives should be pursued but that, at the same time, it is worthwhile to monitor the achievement of Florida students compared to that of students across the nation. FACET requires the Department to determine and report norm-referenced test results no later than the 1989-90 school year. Comparisons between schools, districts, regions, and states are to be made public through a series of reports. In implementing this requirement, it will be necessary for the Department to consider the movement toward a national indicators project currently being advocated by the Council of Chief State School Officers (Council, 1985). Obviously, state-by-state comparisons will be available only to the extent that states cooperate in the design and collection of the same data.


At this time, the Department is working on the design of its norm-referenced testing procedures. A set of general criteria and characteristics of the norm-referenced program has been endorsed by the Board of Education. These criteria require the Department to use the testing procedures which will produce the most accurate data from which the comparisons required by the law are to be made.

A second major provision of FACET is the requirement that curriculum frameworks be established for selected curricular areas. These frameworks are to consist of broad guidelines for individual course content. They will ensure consistency across the curricular offerings in the public schools. The Board of Education is required to adopt student performance standards derived from the curriculum frameworks. The Department then is to develop assessment instruments and procedures to permit the determination of student proficiency in the selected courses no later than 1988-89. The Department is currently working toward implementation of these requirements.

FACET contains specific requirements for public reporting of the test results. The state level data is to be included in the annual report on public education issued by the Commissioner of Education. Comparative test scores are to be included, with rankings of the districts and analyses revealing how Florida compares to other states. Each school district is to report annually on the status of education in the district. These reports are to include the results of the FACET tests. Likewise, each school is to issue annual reports of a similar nature. The reports are to include consideration of student socioeconomic status, aptitude, and prior achievement.

Lastly, FACET recognizes that educators need more training in the selection and administration of tests and in the use of test results. The Department is required to develop standards and procedures for these activities as well as model training procedures. Further, the Department is to develop criteria and procedures for determining those school programs which are the most deficient in student performance.


These procedures are to take into account the results of the various tests specified in the Accountability Act and the provisions of FACET.

In summary, FACET represents a comprehensive addition to the statewide assessment program established originally by the 1976 legislature. Prior to FACET, the assessment program concentrated on certain minimum skills in reading, writing, and mathematics. Testing now has been extended to specific high school courses. The curricular offerings in the state's schools are being made more consistent. The public reporting of test results has been strengthened. Clearly, this is a significant legislative action affecting the public schools.

Uses of Test Data in Florida

Test data are used in a variety of ways in Florida. This is possible because of the different aggregations of test results which are made available. Generally, test results are used (1) for allocation of certain resources, (2) as performance goals for students, (3) for public accountability, and (4) as an incentive for improvement.

When the 1976 Educational Accountability Act was initiated with its requirement for a high school graduation test, it became evident that the State had an obligation to assist those students who were not adequately prepared to pass the test. Thus, the State Compensatory Education Program, funded at about $35 million annually, was initiated. Funds are distributed according to need: those districts which have the most students performing inadequately on the statewide assessment program receive the most money. The program is widely accepted and is very important in providing remedial instruction to students with academic needs.

The statewide assessment tests measure required performance standards and, in that sense, are important elements in decisions about promotion from grade to grade.


However, the state tests at grades three, five, and eight do not determine by themselves whether a student will be promoted. The information is advisory only, and the teachers have the final decision. In contrast, high school students must pass the state test if they are to qualify for a high school diploma. The schools must incorporate the state standards into the local curriculum, and teachers are obligated to provide instruction in these skills. Since graduation is ultimately tied to student performance, the standards serve as a powerful incentive for individual students to perform well.

As has been mentioned, the statewide assessment test results are public information. The data consistently have been made public in various reports and news releases. Schools with low test scores are identified and are expected to improve their students' performance. The Department of Education has implemented a sophisticated system for auditing all school districts in a cyclical fashion. Particular attention is paid to the educational programs in the schools which have low test scores.

The test scores also serve to create a climate of academic competition among the schools and school districts. The State has been divided into regions based upon the circulation areas of the major metropolitan news media. Test results are aggregated and released by region, thus making it possible for the citizens and parents to see how their area schools are performing. Furthermore, each district is required to submit an annual plan and evaluation report which shows its progress toward improvement in student performance. This requirement is part of the State Board of Education's goal of moving Florida to a higher quality educational system. The general feeling is that educational competition is perfectly acceptable and can be used as a vehicle for motivating students, teachers, and administrators to strive toward higher achievement.


Summary and Conclusions

In summary, it is clear that Floridians believe in the collection and use of student achievement test data. Programs already implemented provide information about students' fundamental skills. Programs authorized but not yet implemented will provide information about student skills in individual school courses. The data are used by educators, administrators, legislators, parents, and citizens. The data are used for making individual student instructional planning decisions as well as for broader policy decisions by the legislature.

Clearly, the new programs are having an impact in the K-12 grades. But the use of tests extends beyond high school to new testing requirements for college sophomores and the use of tests for determining teacher and administrator academic expertise. Certainly, no one in Florida believes tests can measure everything, and they are not a perfect solution for all of education's difficulties. But tests do provide incentives and do permit public accountability. These factors are so strong in Florida that the use of tests is likely to continue.


References

Council of Chief State School Officers Center on Assessment and Evaluation, Draft Report of the Committee on Coordinating Educational Information and Research. Washington, DC: Council, October 17, 1985.

Department of Education, Laws Relating to Florida Public Education Enacted by the 1976 Legislature. Tallahassee, FL: Department of Education, 1976.

Department of Education, Laws Relating to Florida Public Education Enacted by the 1983 Legislature. Tallahassee, FL: Department of Education, 1983.

Department of Education, Laws Relating to Florida Public Education Enacted by the 1984 Legislature. Tallahassee, FL: Department of Education, 1984.

Fisher, Thomas H. "Florida's Approach to Competency Testing." Phi Delta Kappan, vol. 59, May 1978, pp. 599-602.

Governor's Citizens Committee on Education, Improving Education in Florida. Tallahassee, FL: 1973.

Pinkney, H.B. and Thomas H. Fisher. "Validating the High School Diploma, Florida Style." National Association of Secondary School Principals Bulletin, vol. 62, October 1978, pp. 51-56.

Turlington, Ralph D. "Good News from Florida: Our Minimum Competency Program Is Working." Phi Delta Kappan, vol. 60, May 1979, pp. 649-651.


MICHIGAN EDUCATIONAL ASSESSMENT PROGRAM: HISTORY AND DEVELOPMENT

Edward D. Roeber
Michigan Department of Education

May 1987

Prepared Under Contract For
The Office of Technology Assessment
Congress of the United States


Michigan Educational Assessment Program: History and Development

Introduction

During the early and mid-1960s, growing concern about the educational attainments of the nation's children and youth and rising costs of education combined to create a new concept in education: accountability. Rather than being solely concerned whether our children could read or whether the best college or university would admit our sons and daughters, we began to ask ourselves more fundamental questions about our public schools. While people looked to public schools to further social advancement and stressed the importance of a good education in finding a rewarding job and attaining the good life, serious questions about the quality of our schools were being raised.

Increasing concern over the products of schooling was natural. We asked ourselves: what can students do? Surprisingly, little information was available. Although local testing programs had been around for years, little data was available about students across Michigan. This lack of information led to the development of a state assessment program in Michigan.

The Creation of the Michigan Model

By State Board action and request, funds were provided in fiscal year 1969 to begin a statewide program (for implementation by the end of January 1970) to conduct an annual testing of all fourth and seventh graders. Without adequate time to create the measures to be used, and hardly time to decide what measures could be used, the Michigan Department of Education (MDE) contracted with Educational Testing Service to develop the first tests. Measures in mathematics, reading, mechanics of written expression, word relationships (a hybrid aptitude measure), a socioeconomic status (SES) scale, and an attitude scale were prepared.


All of these measures were norm-referenced. Data on school buildings, districts, and the state as a whole would be released to school district personnel only; public release of data would not occur, by promise of MDE. While district and school norms were prepared and percentile ranks released, none of the data was made public.

Obviously, such a large-scale program could not be implemented without controversy, and if the state assessment program was strong on anything, it was strong on generating controversy! Teachers disliked the achievement measures. Low scoring districts disliked the percentile ranks. Parents and students were offended by the questions in the SES measure and turned off by the attitude scales. Administrators were defensive about potentially unfair comparisons, while teachers were worried about evaluation based on these test results. Despite (or perhaps because of) this controversy, the program was continued through legislative mandate and funding (Public Act 38 of 1970).

The second year of the program was even more controversial. Several large cities threatened to withhold their answer sheets from scoring if they were required to administer the SES and attitude scales. The clincher came on Valentine's Day, 1971, when the State Superintendent, at a news conference well attended by the press, released a report of achievement results for every school district in Michigan. Although this seemed contradictory to the earlier promise of not releasing the results, the Department had been required by a state Attorney General's opinion not only to make the data public, but also to publish the data and disseminate it. Several newspapers in the state published the assessment scores; one paper (with statewide circulation) did so for all Michigan districts. That infamous day became known within MDE as the "St. Valentine's Day Massacre": educator outrage and concern about the program reached its peak.


Efforts were begun in 1971 to work with mathematics and communication skills educators to refine the tests. For the first time, Michigan educators were writing test items. Items written by teachers appeared to be better measures of achievement of Michigan students and were better accepted. At the same time, two other fundamental changes occurred: 1) a model was developed that tied the state assessment program to statewide curriculum improvement, and 2) the seeds of a new program were sowed.

The six-step accountability model was proposed in 1971 and adopted by the State Board of Education in 1972. The model called for 1) the development of Common Goals, 2) the statement of explicit student expectations in the form of student performance objectives, 3) a needs assessment to determine specific student needs, 4) an analysis and modification of the instructional system where student needs are shown to exist, 5) an evaluation of the effectiveness of these changes in meeting students' needs, and 6) recommendations for future action.

As the efforts to develop the Accountability Model and its components were under way, the Assessment Program continued the annual administration of the norm-referenced tests in 1972 and 1973. Due to the continued controversy surrounding their use, the attitude scale and SES inventory were withdrawn. Substantial item tryouts were held in 1971-72 to validate the teacher-written items for the achievement tests. New items were substituted into the achievement tests in 1972-73, marking the introduction of the first nonprofessional-item-writer items in Michigan. The final year of normative testing drew to a close in January 1973 with barely a whimper, for a far more exciting and innovative program lay ahead: the first use of objective-referenced tests on a statewide basis. 1972-73 was overshadowed by the new program.


Michigan's New Assessment Program

During 1971 and 1972, as the controversy surrounding the Assessment Program continued and as the misuses of the norm-referenced data mounted, a basic shift in the Assessment Program occurred. A decision was made by the State Superintendent and the State Board of Education to shift the Assessment Program to the measurement of objectives developed in Michigan. Tests would be developed for the minimum performance objectives in mathematics and reading. Based on the previous successful experience of using classroom teachers to write and try out test items, a test development program was begun in 1972 with five school districts representative of the state, as well as a testing company to edit the items. Teachers, after receiving training in item writing, worked for several months to produce the needed items. The testing company then was responsible for editing a selection of the items and putting them together in tryout packages. The items were tried out. After tryouts, extensive reviews of the objectives and test items were conducted, and the final fourth and seventh grade tests were assembled.

In the fall of 1973, the first objective-referenced assessment of students was conducted in Michigan. This was the first use of an objective-referenced test on such a wide-scale basis. Results were reported back for each student (and the student's parents), classroom teachers, building principals, and central office staff. Considerable emphasis was placed on using the results to provide remedial instruction to the students tested, using the results to review and improve the school curricula, and reporting results to the parents, school board, and the public via the news media. The results were not used in promotion/retention decisions about students, nor were they tied in any way to high school graduation. The data have been used, though, as the basis for allocating state-level compensatory education funds (around $30 million per year) to local districts.

The switch from norm-referenced to objective-referenced tests was not without problems, however. First, the objective-referenced tests were longer, with students needing up to four or five hours to finish the test.


Second, because the tests were untimed, some educators did not know what to do with students who finished early. Third, the concept of a minimal objective was new: could all students attain all of the minimal performance objectives? Finally, there was concern over proper use of the results. Because of the number of performance objectives tested, and because of the decision to return results in a form useful to classroom teachers, assistance had to be provided in person and in writing to help teachers and administrators throughout the state to understand what the test data could (and could not) be used for.

Expansion of the Michigan Educational Assessment Program (MEAP)

When the mathematics and reading performance objectives were first written, they were divided into three sets: grades 1-3 (tested at grade 4), grades 4-6 (tested at grade 7), and grades 7-9. Tenth grade assessment was seen as a logical extension of the fourth and seventh grade program. Test development began in 1974, and the tests were piloted in 1975 and 1976 on a voluntary basis. Even though the State Board of Education acted in 1977 to expand the assessment program to include a tenth grade assessment, it was not until 1979 that the Legislature funded the program. While the Legislature was originally not convinced of the value of the expanded MEAP, the large percentage of districts volunteering to participate in 1977 and 1978 convinced them to mandate the program in 1979.

Assessment of Other Subject Areas

While mathematics and reading are important basic skills (some would argue the most important skills), schools should and do teach students other subjects. MDE, recognizing this, developed objectives in other areas. Test development has occurred in most of these areas, and by now, statewide samples of students have been tested in these areas. The original plans called for the assessment of two subject areas each year (in addition to mathematics and reading) at grades four, seven, and ten through statewide sampling to produce an overall picture of the state.


Assessment in each area then would follow a four-year cycle, continuing to assess all subject areas.

Forces For Change

The MEAP has continued from 1979 to 1985 to assess all fourth, seventh, and tenth graders annually in mathematics and reading. In addition, one or two subject areas were selected for sample testing each year. While achievement has risen in mathematics and reading, there have not been appreciable changes in student performance in the areas where only samples of students were tested. Considerable support was evident for MEAP and for changing the program to support instructional improvement in all subject areas tested.

A major force for change of MEAP, of course, has been the spate of reports on the condition of education nationally and in Michigan. A number of these have proposed using testing not only as a vehicle to monitor student achievement but also as a stimulus for educational reform. In Michigan, for example, a special report written by State Senator Sederburg and Michigan State University Professor Rudman examined changes in performance for various subgroups of students, particularly at the high school level, where comparative data on students in Michigan and the nation is available using college-entrance tests such as the SAT. This report was written in response to A Nation At Risk and the Michigan State Board of Education plan for the future (A Blueprint for Action, 1984), which included recommendations made by the Michigan High School Commission. The following is taken from the summary of the Sederburg and Rudman report:


Over the past few years, state and federal educational policy has targeted the lower achieving student. This targeting of funds and effort has yielded results. However, it is apparent that, at the same time, we may have neglected the better achieving student. In contrast to the prevailing belief, the brightest students have not succeeded regardless of the educational system. Consequently, we are calling for a shift in educational policy. We must create an educational system that challenges all young people and develops students to the best of their abilities.

Emphasis on testing for basic skills for high school graduation and grade promotion reinforces the attitude that teachers and administrators should be most concerned with the lower achieving student. While it is worthwhile to insure that all students possess essential skills before graduation, we must not overlook the student who is not challenged by such minimal objectives. The recent proposals made by the State Board of Education go a long way toward accomplishing the goals outlined here. However, the entire focus must be shifted away from minimal skills, which tend to bring high achievers down while trying to bring everyone up to the highest level possible. The State Board and the legislature will need to clarify their philosophical direction as well as set specific goals for whatever educational reform they wish to achieve in the 1980s.

Proposals for Change in MEAP

The Sederburg and Rudman paper contained the first proposals for developing a higher-level test. Although the State Board of Education's report included changes for the assessment program, such changes dealt only with broadening the scope of MEAP to include periodic, every-pupil testing of other subject areas including health, science, career development, and social studies. The State Board of Education has approved the voluntary testing of Health in 1985 and the every-pupil testing of science for 1986. The Sederburg-Rudman article, however, dealt specifically with higher-level assessment by suggesting, among other things, that:

1. The testing program of the State Board of Education should be changed to adequately measure all Michigan students, not just those below the achievement level determined by the State.

2. The State Board of Education set achievement goals to be attained by all achievement classifications by a specific date. In their Blueprint for Action, the State Board calls on local boards to initiate a 3-5 year plan to improve achievement. Similarly, the Board should set State goals to improve all categories of Michigan youngsters.

3. State policy should reflect an effort to pressure local school districts to provide programming for the entire spectrum of students. The State testing program should be used to validate or accredit local school diplomas for all students.


a. Achievement tests administered as early as the tenth grade should point to areas for potential remediation. The 10th grade test should emphasize reading, language, and basic math skills.

b. An 11th grade exam should include physical science, biological science, and social science.

c. The 12th grade year would be used to assist students who did not meet essential skills in the 10th and 11th grade exams. The State Board of Education should use these tests as the basis for accrediting high school diplomas.

A response to the Sederburg and Rudman paper by the MDE suggested other possible directions for the MEAP, including expanding the program to periodically assess a third subject area at grades four, seven, and ten. In addition, the MDE proposed:

The other way in which MEAP may change in coming years is to assess students beyond the basic skill level. This discussion presumes that (1) testing basic skills is valid and will still be carried out, (2) testing higher-level skills should emphasize the same purposes as the regular MEAP program (i.e., individual student assistance, curricula review and revision, reporting to various audiences), (3) students should be identified based on their basic skill achievement, (4) such higher-level skills are either more difficult subject matter content, critical reasoning skills, or higher-level thinking skills (e.g., analysis, synthesis, and evaluation from Bloom's Taxonomy), and (5) the students identified can be offered a school program which meets their educational needs, even as schools are helping students who have not as yet achieved the minimums. The presumption is that schools (and the State) can emphasize both basic skills and advanced skills and not have to choose one over the other (Roeber, 1984).

MEAP staff proposed a plan that included a two-tier approach, with all fourth, seventh, and tenth grade students taking the basic skill test and those that passed it taking the higher-level examination. It was proposed that advanced tests be developed at three levels (grades 4-6, given in seventh grade; grades 7-9, given in tenth grade; and grades 10-12, given in grades 10, 11, and 12). Staff also developed a list of technical and policy issues for testing beyond the basic skills. The Department plan was presented to the State Board of Education in early 1985. After considerable discussion, the State Board approved the MEAP staff plan that a study group be convened to examine issues and to develop a tentative assessment plan.


Developing the Plan for the New Assessment Program

Since late 1984, Department staff have been meeting with a planning group consisting of local and intermediate district educators, college and university specialists and others. Represented on the group are gifted educators, assessment and curriculum specialists, content area specialists (e.g., science, reading), and administrators. The group has spent a considerable amount of time discussing methods to address student needs, particularly those of students who already pass the current basic skills tests.

Very early in these discussions it was apparent that there were sharp differences of opinion regarding the direction MEAP should take. Some members of the advisory group, for example, proposed toughening the current content standards tested in MEAP. Others suggested that tests of critical thinking, critical reasoning, or thinking skills be used. The group pursued both options. Discussions have focused on what tougher standards really mean, how higher-order thinking could be tested and how this program could mesh with the current basic skills program. Others have been examining various approaches to teaching thinking skills, looking particularly at how thinking skills are defined and the implications for testing. While viewed originally as an alternative to the current basic skill program (or, at least, a more difficult extension of it), thinking skills are now viewed as a logical complement to the current program, plus any new program which might be developed.

Recommendation for Change

The planning group agreed that there is a need to assess subject content from a conceptual point of view and to include a broader range of subject matter content. In order to encourage the development of students' thinking skills, the committee also felt that thinking skills should be assessed within each subject content area. Also, the group felt that MEAP should be broadened to include an every-pupil writing assessment, and


subjects other than mathematics and reading should be assessed each year rather than on the current cyclic program. Taken as a whole, the group recommended:

1. Basic skills assessment: continuation and revision of the every-pupil essential skills assessments at grades 4, 7, and 10 in reading and mathematics. The revisions should include the assessment of thinking skills, a broader range of content (i.e., algebra in the ninth grade mathematics test) and a focus on understanding the concept as opposed to a right answer.

2. An every-pupil writing assessment be given;

3. Health, science, social studies, and career development be assessed on an every-pupil matrix-sampling basis. It is recommended that (2) and (3) be implemented in grades 5, 8, and 11.

4. Thinking skills should be assessed in all content areas.

The planning group's recommendations will be presented to the State Board of Education in early 1986. If action were favorable, it would take years to develop the needed testing materials. It would also take time to prepare local districts to test several subject areas at grade levels not previously assessed. Most importantly, staff would need to define higher order thinking skills, both in general terms and also for each subject area in which they will be tested.

Counterforces Against Change

Following the completion of the planning group's work, the recommendations were presented to the State Board of Education in March, 1986. The Board received the planning group's report and referred it to the State Board of Education-appointed advisory council for the service area of the Department in which MEAP is located. This advisory council, the Office of Technical Assistance and Evaluation (OTAE) Advisory Council, is composed of official representatives of major professional groups such as teachers, principals, administrators, school boards, and curriculum groups, as well as technical specialists. The purpose of the OTAE Advisory Council is to advise staff and the State Board of Education on the major issues facing the Office.


The OTAE Advisory Council reviewed the planning group's recommendations and, in May, 1986, voted to oppose the plan and, instead, support a plan that would call for MEAP to develop item banks which local districts could use, in addition to available tests and MEAP tests, in the five areas covered by the plan to test one or more of them on a voluntary basis. MEAP would develop, with the assistance of technical groups, standards for equivalence among the various measures used in any subject area. However, testing would not be mandatory.

During the summer, MEAP staff convened an ad hoc group comprised of a subset of the planning group and the OTAE Advisory Council to attempt to develop a compromise which all groups could support. The planning group's recommendations were particularly opposed by four groups: the Michigan Education Association and the Michigan Association of School Boards, both of which feared loss of control of schools; the Michigan Association for Supervision and Curriculum Development, which felt testing was not the proper vehicle for curriculum change; and the Middle Cities Association, which felt that state testing duplicated local testing and that the latter was preferable. These groups and others were asked to serve on the ad hoc group. The group met four times during the summer of 1986 and held several stormy sessions to arrive at the compromise. This compromise was that local districts would be required to give the expanded testing at grades 5, 8 and 11 in writing, health, science, social studies and career development once every four years (but could volunteer in off-years), and financial incentives would be sought for participating schools to use for school improvement activities.

During the fall, 1986, the compromise plan was re-submitted to the OTAE Advisory Council, with the intent of sending it to the State Board of Education. Each Advisory Council member was asked to discuss the compromise plan with the organization they represented. In October, 1986, the Advisory Council took formal action on the compromise plan and rejected it. Most major organizations continued to oppose it, even


though the representatives that had served on the ad hoc group had (personally) agreed to the compromise. Mandatory testing was the key to the rejection of the compromise.

Final Plan for the Future

Following the vote of the Advisory Council, MEAP staff were informed by the State Superintendent that, with the opposition of almost all groups to mandated expansion, he would not put any plan mandating expansion before the State Board of Education. MEAP staff then rewrote the plan for the future to delete any mandated expansion. Instead, the development plan calls for the development of tests in health, science, career development and social studies, at grades 4, 7 and 10, which are to be offered annually on a voluntary, state-paid basis to local districts. In addition, a writing test will be developed for grades 5, 8 and 11 and offered on the same basis. Staff will continue to develop a program of financial incentives to encourage schools to give the tests and to use the information to review curricula and improve instruction.

Approval

This plan was presented to the State Board of Education in March, 1987, and approved unanimously. Tests in the areas of health, science and career development will be offered to districts in the fall, 1987 MEAP; tests in social studies and writing are in development and will be added when ready.

Summary

The MEAP has been in operation since 1969. During that time, it has shifted from a norm-referenced to an objective-referenced program. While the program was controversial in its early years, the emphasis on providing data helpful to improving student learning has helped to improve the support for the program. Grade 10 assessment was added in 1979 to the original grade 4 and 7 programs. In more recent years, periodic, every-pupil tests in other areas, such as science, were proposed. The first area of such testing, science, is scheduled for 1986.


Recent reports on education have led to a number of suggestions for changing the MEAP. These include toughening the basic skills tests, adding measures of critical thinking, and increasing the number of subject areas tested. Staff plans to implement these ideas were presented to the State Board of Education in 1986 and referred to the State Board of Education-appointed Advisory Council. The plans were rejected by the Advisory Council. A compromise plan, which contained an element of mandatory testing, was also rejected by the Advisory Council. Consequently, a plan to expand MEAP on a voluntary, state-paid basis was proposed by staff and approved by the State Board of Education. The plan will be implemented beginning in the fall of 1987.


REFERENCES

Roeber, Edward D., Michigan Educational Assessment Program: Proposed Plans for the Future (Michigan Department of Education, 1984).

Sederburg, William A., and Herbert C. Rudman, Educational Reform and Declining Test Scores, Michigan School Board Journal, vol. 30, No. 24, April 1984, pp. 8-10.

State Board of Education, Better Education for Michigan Citizens: A Blueprint for Action (Lansing, MI, 1984).


STATEWIDE TESTING IN NEW JERSEY

Steven Koffler

Prepared Under Contract With the
Office of Technology Assessment
Congress of the United States


Statewide Testing in New Jersey

The focus of statewide testing in New Jersey has changed three times since 1972 to meet the changing demands of society. During the past fourteen years, the program has changed from statewide assessment (1972-1977) to minimum competency testing (1978-1985) to the current, more rigorous competency testing (1984-present). The purpose of this paper is to explain the changes in statewide testing in New Jersey, with particular emphasis on the rationale for the different programs, the components of each program and the curricular and policy implications of each.

Educational Assessment Program

Statewide testing in New Jersey began with the first administration of the Educational Assessment Program (EAP) tests in 1972. The EAP measured reading and mathematics skills which had been identified as being taught in a majority of the public school classrooms in New Jersey. Students in grades four, seven and ten were tested annually; students in grade twelve were tested every three years.

The impetus for the EAP came from New Jersey Governor William Cahill who, in his 1972 State of the State address, lamented that there was no reliable scientific test on a statewide basis to determine reading ability and reading growth of our youth. A bill to create a statewide assessment program died in the legislature; however, because New Jersey statutes provide the Commissioner of Education with the power to create such programs, Commissioner Carl Marburger ordered that a statewide assessment program be developed.

The primary purpose of the EAP was to assist districts to identify programmatic needs and provide direction for program design, improvement and evaluation. Results were returned to the districts in the form of item-by-item summary reports. Those


reports identified the percent of students correctly responding to each item for every class, building and district. Districts were required to analyze and make public the test results. However, the districts only had to do so for the subset of items which in their judgment measured the skills which had been taught prior to the test's administration. No total or other aggregated scores were reported at any level. As a result, the EAP results had little effect on policy.

The test results also did not affect students or schools. The EAP was intended for statewide and district assessment, not for measuring individuals' or groups' competency. The EAP monitored the education system and measured the status quo. It served a limited, but important, role: focusing on the districts' curricular needs and monitoring the changes in those needs.

Minimum Basic Skills Program

By the mid-1970s, the continuing trend of declining test scores and increasing costs for education led to the loss of public confidence in the professional educators' ability to resolve the problems of education. This loss of confidence led to the public's decision that external forces had to impose and raise standards in the schools. And testing was to play a prominent role in that decision.

Statewide assessment programs, like the EAP, were considered insufficient to satisfy the public's new demand. Instead of tests which provided information about the status of the education system, the public wanted a program which would serve as a catalyst to cause the system to change. As a result, minimum competency testing programs were initiated in state after state.

A 1976 New Jersey law resulted in the end of the EAP and the creation of the Minimum Basic Skills (MBS) test, a statewide minimum competency program designed to measure pupils' proficiency in minimum reading and mathematics skills at grades 3, 6, 9 and 11. The skills to be measured by the MBS were identified based on input from educators, students and the general public and were those which students needed to


master at a minimum by spring of the tested grades. The tests were criterion-referenced tests developed by the Department.

In spring 1978, the MBS tests were administered for the first time. Approximately 21% of the students failed at least one of those tests that year. In one urban area approximately 84% of the students failed the sixth grade mathematics test and 81% failed the ninth grade mathematics test. In 1978 many students, especially in the urban areas, did not have a mastery of those skills considered to be minimum and basic.

By 1982 there were dramatic improvements in student performance. By that spring, only 9% of the students were failing; there was substantial improvement, especially in the urban areas. The improvement was both expected and logical. After five years, school curriculums had been modified to reflect the tested skills, the teaching staff was teaching the skills, and, as the results indicate, students were learning the skills.

While the EAP program assumed a passive, monitoring role, the MBS served an active role in changing the education system. This difference in roles is exemplified by the manner in which the results were reported to the public. The EAP reporting was left to the districts and was on an item-by-item basis for selected items. The MBS reporting took on new and more important meaning because district-by-district aggregated results (i.e., percent passing) based on all of the items were reported to the public by the Department. Districts could be compared, and the public sought answers as to why their districts' students were not performing at the same level as students elsewhere. The public's demand provided the pressure that contributed to the teaching of the MBS skills.

While the EAP's effect upon the districts' curriculums was negligible, the MBS's effect was far-reaching. The EAP skills were included in the districts' curriculums; however, MBS skills were not necessarily part of them. Total scores and public reporting were based on all of the items. Thus, teaching had to reflect all of the skills. Certainly, districts did not have to alter their programs so that sufficient instruction in the tested


skills occurred prior to the testing dates. Yet, if they did not, their students' performance might be lower than those of neighboring districts. In this manner, the tests dictated a portion of each district's curriculum, and the impetus for curricular change shifted to the Department of Education.

The MBS also became a critical factor in shaping many areas of educational policy. Unlike the EAP, sanctions were now imposed as a result of the test. The MBS results influenced high school graduation policies and became a method of identifying students who needed remediation and a mechanism for distributing funds, certifying and evaluating teachers. As a result, there was even greater pressure to improve districts' performance.

In summary, because its results affected and effected policy and were publicly reported each year, the MBS became a catalyst that changed education in New Jersey. The MBS was a successful program; students in New Jersey mastered the minimum skills. Yet, the program's success caused its demise, and properly so.

High School Proficiency Test

The MBS was a key issue in the 1981 New Jersey gubernatorial election. The Republican candidate, former state Assemblyman Thomas Kean, was the author of the 1976 MBS law. However, by 1981 he believed that the state's focus on minimum skills was too narrow. Kean was elected and appointed Saul Cooperman, a New Jersey district superintendent, as his Commissioner of Education.

Cooperman agreed that the MBS had to be eliminated. He concluded that the education system had moved beyond the minimums because students had mastered the minimums. Most students were not only passing the test, but most were correctly answering almost all of the items. Further, because the MBS focused on minimum skills, it could not identify deficiencies in higher level cognitive skills, and the need to measure the higher level skills was becoming increasingly evident.


A 1979 law mandated statewide graduation requirements, including passing the ninth grade statewide test, beginning with the ninth grade class of 1981-82. Cooperman believed that a cruel hoax was being perpetrated on the students because although they could be awarded a diploma by passing the MBS, many of them did not have the skills which would prepare them for the work force or college. Cooperman was convinced that higher standards were necessary and that the state's graduation test had to reflect the level of skills and difficulty that was needed by ninth graders in order to become productive members of society. He believed that since students had mastered the minimum basic skills, it was the proper time to take the next step and require a mastery of a set of higher level skills.

In August 1982, Cooperman recommended to the State Board of Education that the MBS program be eliminated and that it be replaced by a new statewide testing system which would better reflect the current needs of students in the state. Cooperman indicated that he would recommend the components of the new program in January 1983.

There were eight principles which Cooperman decided must be satisfied by the new statewide testing system.

1. The new tests had to provide a measure of accountability which would restore public confidence in education.

2. The new testing system had to be fiscally economical and relatively independent of funding fluctuations.

3. The new tests had to be more rigorous than the MBS and emphasize more than just minimum basic skills.


4. Tests were needed in the elementary grades as an Early Warning System to insure that students were mastering the prerequisite skills they needed to pass the graduation test.

5. The new system had to avoid or minimize duplicative or overtesting. Thus, the tests used had to be as efficient as possible and serve state and local purposes, where appropriate.

6. The tests had to satisfy rigorous professional standards.

7. The new system had to satisfy New Jersey law which required that the Department of Education establish uniform proficiency standards in the basic skills. It also required a test for high school graduation to be initially administered to students in the ninth grade.

8. The new system had to satisfy the Debra P. v. Turlington judicial decisions which required that:

a. graduation tests had to reflect the material taught;

b. students had to be provided fair warning and opportunity to prepare for a graduation test.

In January 1983, Cooperman recommended to the State Board of Education the components of the new statewide testing system. Many alternatives had been considered including the use of commercially-developed norm-referenced tests, state-developed criterion-referenced tests, and combinations of the two. The recommended program


included a state-developed ninth grade graduation test, called the High School Proficiency Test (HSPT). The HSPT would consist of reading, mathematics and writing criterion-referenced tests and would be designed to measure a higher level set of skills than did the MBS.

There would be no state-developed tests in other grades. Rather, districts would continue to be required to select and use in grades 3-11 the test which was most appropriate for their curriculum and satisfied technical criteria established by the Department. The Department would identify specific passing scores for each commercial test and would annually collect and make public each district's test results (percent passing) in grades three and six.

The use of both a state-developed test in grade nine and commercially-developed tests at all other grades had many persuasive advantages and best met the established principles. The advantages of the commercial tests were as follows:

1. The tests districts chose would best match their curricula.

2. Commercial tests measure higher level skills than the MBS test and can be administered at every grade level, providing for a continuous assessment of student progress.

3. Commercial tests allow districts to compare their students' performance with that of students at the national level.

4. The use of commercial tests avoids overtesting or duplicative testing. It also reduces costs to the state without increasing costs to the districts.


5. In 1978 when the MBS program began, state-developed tests were needed at multiple grade levels because many districts did not have sophisticated testing programs which could be relied upon to provide valid and reliable data. Today, however, local programs do provide such information.

While the arguments for using commercial tests in the elementary grades were persuasive, there were equally compelling arguments for using a state-developed test for grade nine. The major factor was the high school graduation law. It would be unfair to permit students to take different graduation tests because they attended different schools.

Many wanted the HSPT to immediately replace the MBS as the graduation requirement. However, the due notice decision from the Debra P. v. Turlington case required that before a test was used to deny students a diploma, there had to be sufficient time for the students to be taught the skills. Because of this, Commissioner Cooperman and the State Board of Education agreed that although the HSPT would be administered beginning in 1983-84, it would not count for graduation until the 1985-86 administration. Thus, during school years 1983-84 and 1984-85, the MBS and HSPT were administered to all ninth grade students.

The major distinction between the MBS and the HSPT was in the skills measured by each. While the MBS measured rote learning, the HSPT measures skills students need to interpret what they read, solve practical math problems and write coherently. By contrast, the MBS reading test stressed literal comprehension while the HSPT measures inferential comprehension. The MBS math test required simple computation and one-step word problems while the HSPT math test requires students to respond to three- and four-step word problems, prealgebra and geometry. While there was no writing component to the MBS, there is one for the HSPT. The writing component of the HSPT consists of both a multiple choice section and, more importantly, an essay.


At the December 1985 State Board of Education meeting, Commissioner Cooperman recommended to the Board passing scores for the HSPT. More important than the actual passing scores are the anticipated implications of the scores. In 1986, approximately 86,500 students will take the HSPT. It is estimated that about 42,000 students (48.5%) will fail at least one part of the test. However, as with the MBS test, students have four opportunities to pass the HSPT (in grades 9-12). It is expected that each year, as the districts' curricula become more aligned with the HSPT-tested skills, the percent of students passing the tests will dramatically increase.

Considerable effort is now being directed to prepare students for the HSPT both at the state and district levels. As part of its HSPT initiative, the Department did not stop with developing a new, more rigorous statewide testing system. Rather, the Department went beyond its traditional regulating role and is now working with districts to develop and offer new programs to help prepare students for the HSPT. The Department has developed a variety of programs, training institutes, resource guides, pilot programs, demonstration projects, model programs and instructional materials for districts directed toward helping students improve their basic skills measured by the HSPT. Further, it has developed programs to improve student attendance, strengthen job training programs, discourage students from dropping out and offer alternatives to those who do drop out, and reduce disruption in the classroom. Approximately $13 million has been committed for this effort, one of the largest of its kind in the country.

Although virtually no organization opposes the movement toward higher standards, certain groups are opposed to various aspects or implications of the program. The statewide organizations representing the principals and supervisors, school boards and teachers have expressed concern about the effect the program will have on dropouts, the need for increased funds for compensatory education programs, and the length of the due notice period. The following points are pertinent to those concerns:


1. That the test will lead to an increased high school dropout rate is speculative and not supported by the MBS experience. The state's dropout rate remained stable during the MBS years.

2. Students who fail tests at all grade levels (MBS, HSPT, commercial test) are to be provided with compensatory education programs. In 1985-86, the Department is providing districts $106 million in state compensatory education aid for remedial programs. In 1986-87, the total is expected to exceed $110 million. The Commissioner has requested an additional $49 million, for a total of $159 million, to address the increased needs anticipated during the transition from MBS to HSPT.

3. The organizations did not favor postponing the HSPT; rather, they wanted to gradually increase the passing scores, arguing that there has not been sufficient time for the students to have been taught the skills. However, districts and students have now had a two-and-a-half-year preparation time before the first meaningful administration of the HSPT, and a six year delay before the test would affect the first graduating class (1988-89). Further, to lower the passing score from the recommended levels would serve to graduate students who were not as prepared as they should be.

It is clear that the HSPT will parallel the MBS as a catalyst to reform education in New Jersey. It will be used for essentially the same policy and curricular purposes as was the MBS. However, the impact of the HSPT may be even greater than the MBS because of its increased rigor.


Conclusion

The concept of statewide testing changed significantly in New Jersey as the demands of the public changed. It is clear that the public is convinced that statewide competency programs are a legitimate means of effecting reform. Their confidence is apparent in the support for the movement in New Jersey toward a more rigorous form of program rather than an abandoning of statewide testing.

Finally, even though the HSPT is still in its initial stages of implementation, plans are already being developed to someday replace the HSPT with a new graduation test at the eleventh rather than the ninth grade level. Thus, it is likely, at least in New Jersey, that statewide competency testing will continue to be an important component of the education system for many years.


REFERENCES

Cahill, W.T., Governor's Message to New Jersey State Legislature, January 1972 (Trenton, NJ, 1972).

Cooperman, S., High School Proficiency Test Information Packet (New Jersey Department of Education, Trenton, NJ, 1985).

Debra P. et al. v. Turlington, 644 F.2d 397, 400-02 (5th Cir. 1981).

Koffler, S.L., Statewide Testing Programs: From Monitors of Change to Tools of Reform (Paper presented at the 1984 annual meeting of the National Council on Measurement in Education, New Orleans, LA, 1984).

New Jersey State Department of Education, Guidelines for the Interpretation of the New Jersey Educational Assessment Program Results 1972-73 (Trenton, NJ, 1973).

New Jersey State Department of Education, Statewide Testing System (Trenton, NJ, 1983).


NEW YORK STATE TESTING POLICIES

Winsor A. Lott
New York State Education Department

January 12, 1986

Prepared Under Contract For The
Office of Technology Assessment
Congress of the United States


New York State Testing Policies

In 1985 New York celebrated the bicentennial of the University of the State of New York, which is the name given to the totality of the State's schools, colleges, libraries, and museums, all regulated by the Board of Regents. Perhaps in no other State does the State's board of education have such sweeping and enduring power over the State's educational and cultural institutions. The Rules of the Board of Regents and the Regulations of the Commissioner of Education have the force and effect of law, and they are so extensive that there are few aspects of education, particularly elementary and secondary education, that go unregulated.

Thus, it was not surprising when, in 1865, the Regents created a system of State examinations in English grammar, spelling, arithmetic, and geography to determine which scholars in each academy are entitled, under the provisions of law, to be counted in the annual apportionment of the literature fund (Murray, 1881, p. 462). It appears that the academies had been claiming enrollments that included large numbers of pupils who were unprepared for academic study, and these numbers were reduced sharply by the imposition of the Regents examinations.

The adjective preliminary had to be added to the name of the Regents examinations in 1877 when a series of advanced examinations made its debut. The advanced examinations were designed, in the language of Chapter 425 of the Law of 1877, to furnish a suitable standard of graduation from said academies and academic departments of union schools, and of admission to the several colleges of the State (Bradley, 1883, p. 36). The advanced Regents examination program still continues with examinations in more than twenty high school subjects, but the preliminary examinations were discontinued in 1959 because the literature fund had disappeared and the examinations, administered at the end of grade eight, no longer served any useful purpose. Had they been retained, they could possibly have made the introduction of competency tests unnecessary a scant fifteen years later.


It is interesting to note that the State Legislature was involved in the creation of the advanced or high school Regents examination program. Perhaps the 1877 legislation was introduced at the request of the Board of Regents because, as a general rule, the Legislature does not interfere with the Regents, who are appointed by the Legislature, in matters pertaining to educational programs such as the recommended curriculum or the State testing program. Exceptions are made when the Regents take actions that are clearly unpopular.

Many testing programs have been introduced by the Board of Regents or by the Board's administrative agency, the State Education Department, since 1877. Some of these programs have disappeared and some continue. Among those that have disappeared are a variety of norm-referenced tests, first in reading and then in mathematics, science, and social studies. The tests were administered in elementary and junior high schools on an optional basis. Another test that has disappeared is the Regents Scholarship Examination, which was used to select the winners of undergraduate scholarships. Now the SAT and ACT are used for this purpose. The Regents Scholarship Examination was eliminated by the Legislature as a result of lobbying by the guidance counselors association. The association argued correctly that the same individuals would be identified as winners by the SAT and ACT, which all college-bound students take, so the State's examination is not needed.

Among the programs that continue is the Pupil Evaluation Program, which consists of reading and mathematics tests in grades three and six and a writing test in grade five. The tests are administered annually to every pupil in every public and nonpublic elementary school. Introduced in 1965 as a general assessment program, it now serves to identify pupils who are in need of remediation, which is mandated by the Regulations of the Commissioner. In the 1970s, a competency testing program was introduced, consisting of reading, writing, and mathematics tests that are administered in the high schools and preliminary competency tests in reading and writing that are administered in


grade eight or grade nine. Every student who receives a high school diploma must demonstrate competency in reading, writing, and mathematics. About one-half of each graduating class demonstrates competency by passing the competency tests, and the other half (the college-bound) do so by passing Regents examinations in English and mathematics or by attaining designated scores on the SAT or ACT.

This paper deals with elementary and secondary school testing programs, but it should be noted that other testing programs have been introduced by the Regents or the State Education Department and continue to function. These include a series of college-level examinations that allow individuals to earn college credits and eventually, if they choose, to be awarded a college degree by the Board of Regents. Also included are professional licensing examinations, graduate scholarship and fellowship examinations, and a high school equivalency testing program. All this is by way of saying that the Regents and the State Education Department have a long and elaborate history of introducing examination programs to meet specific needs or to accomplish specific purposes. The tests that have disappeared have been, for the most part, tests that have been provided as a service to schools. Those that remain serve a regulatory function. With a few exceptions, the State tests are developed by the State Education Department with the aid of consultants. Two separate testing offices (one in the elementary and secondary branch and the other in the postsecondary branch), the offices of subject-matter specialists, and professional licensing boards are involved in test development activities.

Tests are clearly an important priority for the Board of Regents. The current importance of testing was made apparent in the 1970s when the Regents competency testing program was introduced, and this importance has been dramatically highlighted during the past few years. In 1984, the Board of Regents adopted the New York State Board of Regents Action Plan to Improve Elementary and Secondary Education Results in New York, on which work had begun well in advance of


the flurry of reports criticizing the nation's schools. The Action Plan increased high school diploma requirements, added to the elementary and middle school curriculum, and took other steps to reform the State's elementary and secondary schools. Not surprisingly, these other steps include a significant increase in the number of tests to be taken by New York State students. In a few years, students will be required to demonstrate competency in science and social studies as well as in reading, writing, and mathematics to receive a high school diploma. Three new competency tests will be added, one in science and two in social studies. In addition, a new science test will be administered in grade six, and new social studies tests will be administered in grades six and eight. Foreign language proficiency examinations will be administered in the middle grades. Tests in as many as 40 occupational education courses will be added, and there will be two high school Regents examinations in social studies where there is now only one.

From the beginning of the high school Regents examination program in 1877, the State has issued a Regents high school diploma to students who pass certain of the Regents examinations and earn several more units of credit than are required for a local diploma. The Regents diploma has always been seen as more prestigious than a local diploma, although there is no practical difference between the two types of credentials. No college requires a Regents diploma for admission. Under the Action Plan regulations, the number of Regents examinations that a student must pass to receive a Regents diploma has been greatly increased.

Perhaps the most unique feature of the Action Plan is the Comprehensive Assessment Report. Each fall the State Education Department will provide public school districts and nonpublic schools with a compilation of its State test results for the past three years, coupled with other statistics such as dropout and attendance rates, average class size, enrollment by race or ethnic origin, socioeconomic indicators, pupil mobility rate, and similar items. All of the data are reported routinely to the State Education Department during the course of the school year, but the Comprehensive Assessment


Report organizes the data together with explanatory text. Under the Action Plan regulations, the superintendent of each public school district must present the district's Comprehensive Assessment Report to the board of education at a public meeting. The reports serve as a public record of accountability, and the Regents believe that the debate and discussion stemming from the school board's review of the report is the best means of bringing about programmatic changes.

In the past, many newspapers have obtained test results, particularly for the Pupil Evaluation Program, in order to publish stories comparing school districts. Now, however, a tremendous amount of data is readily available. (The first Comprehensive Assessment Reports were prepared in October 1985 and had to be presented to school boards prior to December 15.) Many more newspapers are publishing comparative data, and the articles are far more extensive than they have ever been before. This is clearly what the Regents intended.

The Comprehensive Assessment Report by itself would have been an effective means of stimulating local school improvement efforts. Linked to the report, however, is a requirement that the Commissioner of Education identify 600-900 low-performing schools that will be required to develop and submit comprehensive school improvement plans. It is the intent of the State Education Department to work with these schools in the development of their plans and in their improvement efforts. The names of these schools were widely publicized by the media, as anticipated.

It is apparent from the Action Plan that the Board of Regents and the State Education Department view the State testing program as a powerful tool for insuring compliance with the Commissioner's Regulations, for bringing about change, and for improving the quality of education in New York's schools. There are, after all, few other tools available and none so effective.
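The Comprehensive Assessment Report described above is, at bottom, a per-district compilation of three years of test results alongside contextual statistics. As a rough illustration only, the sketch below shows one hypothetical way such a district record might be organized; the field names and all figures are assumptions made for illustration, not the State Education Department's actual format.

```python
# A hypothetical sketch of a district record for a report of this kind.
# Every field name and number here is illustrative, not the Department's format.

from dataclasses import dataclass, field

@dataclass
class DistrictRecord:
    district: str
    # test name -> results for the past three years (e.g., percent passing)
    test_results: dict = field(default_factory=dict)
    dropout_rate: float = 0.0          # percent of students
    attendance_rate: float = 0.0       # percent
    average_class_size: float = 0.0
    # race or ethnic origin -> percent of enrollment
    enrollment_by_origin: dict = field(default_factory=dict)
    pupil_mobility_rate: float = 0.0   # percent

record = DistrictRecord(
    district="Example Central School District",           # hypothetical
    test_results={"Grade 3 reading": [81.0, 84.5, 86.2]},
    dropout_rate=3.8,
    attendance_rate=93.5,
    average_class_size=23.4,
    enrollment_by_origin={"White": 71.0, "Black": 17.5, "Hispanic": 11.5},
    pupil_mobility_rate=10.9,
)
print(record.district, record.test_results["Grade 3 reading"])
```

Gathering the statistics into one record per district is what makes the side-by-side newspaper comparisons described above straightforward to produce.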


REFERENCES

Bradley, J.E., The Regents Examination, in Proceedings of the Twentieth Meeting of the University Convocation of the State of New York, July 11-13, 1882 (Albany, NY: Weed, Parsons and Company, 1883).

Murray, D., Academic Examinations, in Annual Report of the Regents of the University of the State of New York (Albany, NY: Weed, Parsons and Company, 1881).


OREGON STATE TESTING POLICIES PAST AND PRESENT

Wayne Neuburger, Director
Assessment and Evaluation
Oregon Department of Education

January 6, 1986

Prepared Under Contract For The
Office of Technology Assessment
Congress of the United States


Oregon State Testing Policies Past and Present

Over the past twelve years educational policy in the State of Oregon has had a strong emphasis on the use of testing information. In the early 1970s Oregon was the first state to require students to demonstrate minimum competence in basic skills in order to graduate from high school. A state-administered testing program has also been in place since 1974. This program has conducted an assessment of reading, writing and mathematics at grades 4, 7 and 11. The assessment has been conducted with about a 15 percent sample on a 2-4 year cycle. Finally, since the mid-1970s the state has required local districts to assess individual students in the basic skills to determine their instructional needs and to evaluate instructional programs. Appendix A contains the standards that describe the requirements for minimum competence compliance, individual student assessment, and instructional program assessment, and the state policy for the state testing program.

The emphasis of these policies was on a strong local determination of the outcomes to be assessed and the particular assessment tools to be used. The state's assessment program was more focused on looking at state performance trends on consensus educational goals.

The policy orientation outlined above was the state's official stance until the fall of 1983 when Verne Duncan, the State Superintendent of Public Instruction, proposed a series of new policies. They included:

• Establishing a state-required curriculum in all basic academic programs, kindergarten through grade 12.

• Assessing all students in grades 3, 6 and 10 in basic skills.

• Establishing a state 8th grade examination for all students as they complete their grade school program, with an individual program designed for students not passing the test.


These proposals were presented to the State Board of Education, which is responsible for setting educational policy and requirements or standards for local districts. The State Board and Superintendent commissioned a series of task forces to review the Superintendent's proposals. These task forces consisted of teachers, administrators, university professors, business leaders, and school board members. From the recommendations of the task forces, the State Board generated the Oregon Action Plan for Excellence, which was adopted on June 28, 1984. A copy of the plan is included in Appendix B. This plan parallels the State Superintendent's initial proposal on testing but changed the grade levels to 3, 5, 8 and 11, and did not require an individual plan for students not passing the grade 8 test.

The initial challenge to this plan came when funds were requested for its implementation from the 1985 state legislature. Although the Governor supported the plan and its funding, the legislature was less impressed. There appeared to be a number of groups influencing the decision. The first key influence came when the Senate Education Committee recommended to the Ways and Means Committee that no funding be allocated for the testing portion of the plan. They listed as their reasons that the plan was not thought out well enough and that they opposed the potential use of state testing information to compare local schools and districts. The groups that gave input to the Senate Education Committee included representatives from local school districts, the Oregon School Boards Association, the Confederation of Oregon School Administrators and the Oregon Education Association.

The hearings before the Ways and Means Committee indicated that the attitudes of the members of this committee were similar to those of the Senate Education Committee. The Ways and Means Committee also seemed to be committed to providing additional funding to higher education, and there did not appear to be any funds left for additional elementary and secondary programs.


The inability of the Oregon Department of Education to obtain funds for its state testing program postponed the implementation of the Oregon Action Plan for Excellence. However, the Department was able to reallocate funds to support the development of the common curriculum goals proposed by the State Superintendent. In addition, the testing requirements for local districts are under review, with changes to reflect local testing programs addressing individual students and programs related to the state's common curriculum goals. These changes could impact local testing programs even if a state testing program was not implemented. These proposed new requirements are included in Appendix C.

Oregon has long had a reputation of strong local option in education. The state has played the role of providing broad general direction, with local districts having many options for implementation of these requirements. This orientation has led to a wide variation in the programs that have been implemented by local districts. The larger districts have more consistently developed extensive testing programs. For example, the two largest districts, Portland and Salem, have developed their own tests to meet the requirements of the state. One of the big concerns of these districts is that the state's testing program will replace their own programs, taking away their control. On the other hand, small districts, which make up the vast majority of districts in the state, have testing programs that are limited to publishers' tests. (There are six or seven publishers' tests used in the state, with no one test having a majority of use.)

In a survey taken by the Department in the spring of 1985, 85 percent of the larger districts opposed a state testing program that required the testing of all students at selected grade levels. However, 76 percent of the smaller districts supported the establishment of such a state testing program. There is an obvious split between smaller and larger districts in their support for a change in the state's testing proposals. However, the larger districts have more influence with the legislature.


The State Superintendent and State Board of Education have continued to work on furthering their intention to implement a state testing program. Since the legislature refused to fund the testing program, they have been active in preparation for the next session. The two major activities have been to develop a new policy for the state testing program and to revise their long range plan (see Appendix D). One change in their plan has been to include in their program a state minimum competency testing program for graduation from high school. Many local districts questioned the relationship between the state test at the high school level on the state's common curriculum goals and the requirements that local districts must assess student competence for graduation. The Superintendent and State Board have resolved the problem by recommending that the state's common curriculum goals should be the basis for determining if students have the necessary skills for graduation.

Another change in the plan was to allow local districts to administer a test from a list of approved tests at grades 3 and 5. The tests on the approved lists would represent major tests available to school districts that match reasonably well the state's common curriculum. This would allow local districts to continue to use the major tests being used by districts now. This approach was recommended by representatives from local districts and received support from some of the educational political organizations such as the Oregon School Boards Association and Confederation of Oregon School Administrators. The tests on this list would be scaled to a common scale, allowing the results from these different tests to be combined (one simple way such scaling can work is sketched at the end of this narrative). This approach was recently recommended by the Center for the Study of Evaluation as a means to compare test results among states.

Another development since the last legislative session has been the formation of an interim legislative committee to study educational reform in the state. This committee will be meeting during the spring of 1986. One of the topics possibly under consideration is the state testing program. The leaders of the House of Representatives and the Senate have expressed a concern over the Oregon legislature's lack of action on educational


reform issues. This committee will make recommendations to the next legislative session, which meets again in the spring of 1987.

The course of the future of state testing in Oregon is yet to be determined. There are obviously a lot of political groups that can influence the future direction. However, the state legislature, with its control over funds, has the biggest impact on the State Department of Education's proposed testing program. Until all the pieces fall into place, it will be impossible to predict what will happen.
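The plan's proposal to scale results from different approved publishers' tests to a common scale, mentioned above, can be illustrated with a simple linear (mean-sigma) transformation, one standard equating technique. The sketch below is an illustration only: the plan does not specify the Department's actual scaling method, and the test names, means, and standard deviations shown are hypothetical.

```python
# A minimal sketch of linear (mean-sigma) scaling to a common scale.
# The method choice and all norms below are illustrative assumptions.

def to_common_scale(raw_score, test_mean, test_sd,
                    common_mean=50.0, common_sd=10.0):
    """Map a raw score from one publisher's test onto a common scale
    by matching means and standard deviations."""
    z = (raw_score - test_mean) / test_sd      # standardize within the test
    return common_mean + common_sd * z         # re-express on the common scale

# Hypothetical grade 3 reading norms for two different publishers' tests:
test_a = {"mean": 34.2, "sd": 8.1}    # e.g., a 60-item test
test_b = {"mean": 51.7, "sd": 11.4}   # e.g., an 80-item test

# The same raw score of 40 means different things on the two tests,
# and the transformation makes that difference explicit:
print(round(to_common_scale(40, test_a["mean"], test_a["sd"]), 1))  # 57.2
print(round(to_common_scale(40, test_b["mean"], test_b["sd"]), 1))  # 39.7
```

Once every approved test reports on the same scale, results from districts using different publishers' tests can be aggregated or compared, which is what the plan's combined reporting would require.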


Oregon Appendix A

Standard 316(2) Competence Requirements
Standard 602 Individual Student
Standard 606 Instructional Program
Board Policy 3125 Assessment and Program Improvement (Old Policy)


Appendix A

Standard 316(2)

(2) Competence Requirements
(a) Each student shall demonstrate competence in:
(A) Reading
(B) Writing
(C) Mathematics
(D) Speaking
(E) Listening
(F) Reasoning
(b) Student Competence:
(A) Shall be verified by measurement of student knowledge and skills or measurement of student ability to apply that knowledge and skill;
(B) May be verified through alternative means to meet individualized needs; however, the school district's standard of performance must not be reduced; and
(C) When verified in courses, shall be described in planned course statements; challenge tests and/or other appropriate procedures for verification of competencies assigned to courses must also be available.
(c) In developing curriculum and criteria for verification, school districts should be guided by levels of performance required in life roles.
(d) Competence in reading, writing, mathematics, speaking, listening and reasoning shall be recorded on students' high school transcripts. Competence, when verified prior to grade 9, shall be recorded on high school transcripts.

Standard 602 Individual Student

The school district shall assure that educational programs and services support all students as they progress through school. It shall:
(1) Identify each student's educational progress, needs, and interests related to:
(a) Basic skills attainment of the knowledge and skills expected of students at each grade, K/1 through 8,
(b) Completion of graduation requirements, and
(c) General educational development;
(2) Provide instruction consistent with the desired achievement considering the needs and interests of each student;
(3) Maintain student progress records; and
(4) Report educational progress to parents and students at least annually and as appropriate in
(a) Basic skills attainment,
(b) Achievement toward the fulfillment of graduation requirements, and
(c) General educational development.


Standard 606 Instructional Program

The school district shall maintain a process for evaluating and improving instructional programs. It shall:
(1) Assess student performance annually in reading, writing and mathematics in at least two elementary grades and one secondary grade;
(2) Assess student performance on selected program goals in at least language arts, mathematics, science and social studies in two elementary grades and one secondary grade, prior to the selection of district textbooks and other instructional materials under rule 581-22-520 of these standards;
(3) Utilize appropriate measurement procedures in making such assessments and report results to the community;
(4) Identify needs based on assessment results and establish priorities for program improvement; and
(5) Make needed program improvement as identified in the needs identification process.

Board Policy 3125 Assessment and Program Improvement

To determine the status of student achievement in areas related to State Board goals, student performance shall be assessed statewide and other types of data shall be reviewed. These data are to be analyzed for discrepancies between actual and expected levels of performance. If significant discrepancies exist, they will become a basis for Board priorities. Statewide assessment also is designed to provide information useful to school districts in making needed program improvements.


Oregon Appendix B

Oregon Action Plan for Excellence


Oregon Action Plan for Excellence

Introduction. Americans live in a world characterized by accelerating social change which carries profound implications for education. While we Oregonians are justifiably proud of our public school system, we cannot afford to rest on what has been achieved to date. If we do, we can no longer assert that we are doing the job of preparing our children to cope with the demands they will encounter as adults in the 21st century. The schools of Oregon must equip students to be adaptable and self-motivated learners, able to acquire new knowledge and skills long after formal schooling is completed. The Oregon Action Plan has been developed in response to these concerns.

Why Make Changes Now? Recent studies have shown that students in Oregon perform better than students nationally on basic skills tests, have higher levels of achievement as they leave high school, and those entering higher education are better prepared than students nationally. Students in Oregon who have prepared to enter the labor market directly also get good grades on their performance as new workers. The general level of education in the state is greater than the average across the country. Students in the schools tend to feel good about the education they are receiving and find schools to be an enjoyable and safe place to be.

Although schools in the state should be proud of such accomplishments, there is room for improvement. The future will demand that students be lifelong learners, adapting to new job requirements, technological developments, and societal changes. A recent national study indicates that high school graduates who enter the work force directly need virtually the same skills and abilities as those going on to college. The fundamental skills of oral and written communication, problem solving and comprehension of written and mathematical information are needed for success in adulthood. In Oregon, evidence points to similar conclusions. Employers have indicated that employees will need to be retrained as many as five times while working in one company. Furthermore, Oregon employers feel that schools must help all students in applying their school experience to real life situations and in developing skills and knowledge which enable them to solve problems on the job.

Another indicator of the need for school improvement is the concern that schools are losing too many students before they graduate. Also of concern is the percentage of Oregon students entering college who must take remedial courses in math and English. Adapting instruction to the learning needs and characteristics of individual students must be educators' highest priority if such problems are to be alleviated.

Excellence for Every Student. The goal of the Oregon Action Plan for Excellence is to bring about the highest levels of performance and satisfaction of all students. Excellence is possible when learners are challenged to go beyond assumed limits and develop their talents and abilities to the utmost. Educators and parents must set high expectations for learning and, in turn, provide learning opportunities and support necessary for each student to meet those expectations. Our student population has changed dramatically over the past 30 years. Family mobility, cultural diversity, and the need to serve the handicapped have increased the complexity of the schools' responsibility. However, when education is truly excellent, it does not vary in quality because of such variables.
The State Board and Superintendent believe the goal of excellence for every student represents the highest form of commitment to equity in education.

Empowering the Schools. Actions to bring about excellence in education must focus on empowering schools to adapt instruction to the needs, learning styles and learning rates of individual students. Furthermore, such instruction should be directed toward mastery of understood and agreed-upon goals for learning. The energy and efforts of both teachers and students must be primarily oriented around achieving the fundamental learning skills and knowledge which establish a foundation for academic, occupational, and life success. Skillful, competent teachers are the key persons in the schooling process. Actions on the part of school principals and others must support and enhance the capabilities of teachers to develop the


talents and abilities of all learners. The principal's role is to provide school leadership, to coordinate the instructional program, and to create the climate and capacity for the self-direction and self-renewal necessary to achieve excellence. The school board, administrators, other district personnel (certificated and classified), and community groups all play important roles in supporting the key partners in the learning process: the student, the parent and the teacher. State, regional and local agencies need to assist local schools in doing their job by providing guidelines, models, research information, technical assistance, support networks and financial resources.

Underlying Commitment

The Oregon Action Plan for Excellence establishes a framework for responding to the problems and challenges described above, building upon the existing strengths of the school system. The State Board of Education, the State Superintendent of Public Instruction and the Department of Education are committed to support educational excellence and effective stewardship of public funds in partnership with local efforts. Incentives, assistance, encouragement, resources and flexibility will be provided to the maximum extent possible. Meanwhile, a stable and adequate system of school finance is essential. The commitment of the State Superintendent and the State Board to work with the Governor, Legislature and others toward this end is set forth as a primary strategy in this plan.

From the 1970s to the 1990s: More Than a Decade

Since 1972, the State Board and Department of Education have been moving toward a system which focuses on student learning, as opposed to the earlier emphasis on methods and means. The Oregon Action Plan for Excellence fits into a logical progression toward a student-based educational system that evolves through cycles of self-correction and improvement. Simply stated, the system will specify the results to be expected, periodically measure performance, take corrective action and begin the process again.

Setting Goals for 1990

While excellence is a worthwhile goal in the abstract, the Action Plan has been developed with the expectation of specific results which can be seen by our citizens and through which the performance of the state's educational system can be judged. These goals will specify, for example, that by 1990 there will be significant improvement in:

• school productivity
• student achievement in the basic skills
• employer and community satisfaction with students and schools
• student and parent satisfaction with schools
• school climate, as evidenced by less vandalism, fewer class interruptions and less absenteeism
• a reduced student dropout rate

The success of the Action Plan will be measured by how well these and other results are achieved.

Agreeing on Policies which Support the Goals

To guide Oregon schools in achieving the goals specified above, the State Board of Education has established the following policies for the Action Plan. It is the policy of the State Board of Education and the Department of Education to:

• Establish standards for public schools designed to enable all students to successfully prepare for adult life after high school.
• Establish clear and high learning expectations for all students, allowing flexible means for students to achieve these expectations.
• Increase the capacity, incentives, and support for school and program improvement to ensure the best possible learning situation for students.
• Assure Oregonians of the quality of their public schools.
The Department will assemble a task force to develop these goals, and acquire baseline data to ascertain progress toward the goals.


A Framework for Action

Initial efforts to implement the Board's broad policies have been recommended by eight task forces which represent all major "stakeholder groups" in Oregon education. The work of these task forces was grounded in research on school effectiveness and organizational behavior, tested by the practical experiences of teachers, administrators and community representatives. The action statements, which describe the work to be done, are set forth in the following pages.

The Oregon Action Plan for Excellence establishes basic expectations for all Oregon schools. Where excellent programs already exist, they will be encouraged to continue and grow. At the same time, the plan establishes a framework for action to encourage local school districts to move far beyond basic requirements to provide excellence in education for all students.

Actions for Excellence

1.0 Defining What Oregon Students Should Learn

1.1 Define the State Common Curriculum

The Oregon Department of Education, working with local school districts and higher education institutions, shall define the required common curriculum goals for elementary and secondary schools in terms of the learning skills and knowledge students are expected to possess as a result of their schooling experience. Goals will be specified at selected checkpoints. Curriculum goals for all students shall be specified in:

(a) Learning skills: reading, writing, speaking, listening, mathematics, critical thinking, scientific method, and study skills.
(b) Knowledge and skills in: art, health education, language arts, mathematics, science, music, physical education, social studies, career development, personal finance, economics, and computer literacy.

Local school districts, with assistance from the Oregon Department of Education, shall be responsible for organizing the curriculum and delivering instruction to achieve the common curriculum goals.

1.2 Provide a Comprehensive Curriculum

Local school districts, with assistance from the Oregon Department of Education, shall provide a comprehensive instructional program beyond the common curriculum to advance each student's personal, educational and career goals. The program will include opportunities for experiences in the visual and performing arts, foreign languages, vocational education and other applied arts, and advanced courses in the areas covered by the common curriculum.

Rationale

The statutory responsibilities of the State Board of Education are clear with respect to its role in establishing "a sound comprehensive curriculum . . . with particular emphasis on the highest practical scholarship standards . . ." (ORS 326.051). The guarantee of a high quality educational program for all students forms the cornerstone of the state's role in public education. By taking a stronger role in defining expectations for student learning, the State Board and Department intend to: (1) provide leadership in establishing educational standards commensurate with the challenges today's students will encounter in the future; (2) focus public attention on the essential outcomes of schooling that are expected of all students; and (3) mobilize the energies of Oregon educators to provide learning experiences that motivate and engage all students. It is recognized that an overly prescriptive approach to curriculum policy would deny schools the flexibility and capacity to capitalize on the inventiveness of teachers, principals and other instructional leaders.
As research on effective schooling practices indicates, a strong commitment to school improvement depends in large part on the degree of local ownership of curriculum decisions and instructional practices. Thus, the intent is to define learner expectations in ways that allow for a variety of instructional approaches and options for local curriculum design. Nevertheless, the state will test students' attainment of the skills and knowledge expected at the major transition points in schooling to assure that learning expectations are being met.

Suggested Timeline

1984-85: Develop common learning skills
1985-87: Develop common curriculum in language arts, math, science, health, and foreign language
1987 & beyond: Continue to develop comprehensive curriculum guidelines in advance of state textbook selections

2.0 Increasing Expectations and Incentives for Student Achievement

2.1 Increase Graduation Requirements

The State Board of Education shall raise the standards for graduation from high school by increasing the units of credit required of all students from 21 to 23 in the following areas of study:

4 units of English
2 units in mathematics
2 units in science
1 unit in United States history


1 unit in government and economics
1 unit in world history, geography and culture
1 unit in health
1 unit in physical education
1/2 unit in career development
1/2 unit in personal finance
2 units of required electives in: vocational education/applied arts, visual and performing arts or foreign language
7 additional electives

With expectations of increased performance levels, schools must be increasingly prepared to meet individual learning needs and abilities. Alternative methods for meeting graduation requirements may be planned for the individual student. Methods to be considered by local school districts include:

(a) Challenge tests for specific courses
(b) Demonstrating achievement of specific goals through other educational and life experiences.

2.2 Establish an Honors Diploma

In order to challenge students to strive for educational excellence, the state shall award an "honors diploma" to high school graduates meeting the following criteria:

(a) A grade point average which indicates superior achievement
(b) Demonstrated excellence in achievement in one or more of the following: (1) academic areas; (2) vocational/applied arts; (3) visual or performing arts.

Rationale

Raising the number of units required for high school graduation signifies that more effort is expected of high school students, particularly in the subject areas of English, math and science. The complaints of employers and college officials that high school graduates lack skills in writing, mathematics and logical thinking add legitimacy to increasing course requirements in these areas. Also, findings of the National Assessment of Educational Progress for 17-year-olds indicate that many high school students are poorly prepared in the fundamentals of literacy and numeracy, as well as in higher-order reasoning skills. Strengthened graduation standards must not lead to accelerated dropout rates, however. The challenge to Oregon's secondary schools is to employ instructional practices and use new technologies to help all students succeed in meeting the revised graduation requirements. In establishing a state honors diploma, the intent is to motivate students to strive for higher levels of educational achievement, recognizing not only superior performance in traditional academic subjects, but also in vocational and artistic areas.

Suggested Timeline

1984-85: Consider and adopt changes in high school graduation requirements
1985-87: Establish state honors diploma; provide assistance with optional ways to meet requirements
1987 & beyond: Evaluate impact of changes in graduation requirements

3.0 Measuring and Assessing Student Performance

3.1 Establish Standards and Measure Performance

The State Board of Education, with the help of local districts, shall establish standards and measure student performance at grades 3, 6, 8 and 11 on selected goals in the learning skills and knowledge specified in the common curriculum. Most school districts currently have a local testing program in place. Every effort will be made to build the statewide testing program on existing excellent programs.

3.2 Require Local Testing Programs

Local school districts shall develop and implement programs for continuous monitoring of student progress toward the learning skills and knowledge specified in the common curriculum so that students can be assisted in making steady progress toward meeting the curriculum goals.
Models will be developed by the Oregon Department of Education for districts needing assistance in establishing the local testing program.

3.3 Assess Performance of Eighth Grade Students

The test to be administered to all 8th graders will assess students' success in mastering the skills and knowledge necessary to be successful in high school. All tests used by the state in assessing student performance will be developed or selected cooperatively with representatives from local districts.

3.4 Monitor Academic Performance of Oregon Students

The Oregon Department of Education will monitor the academic performance of Oregon students by gathering assessment data from local school districts and reporting statewide results to the public.


Rationale

Accurate information on student achievement of the learning goals defined by the state serves a number of purposes: (1) such test results reinforce the common curriculum, particularly when publicly reported; (2) information on the general pattern of student strengths and weaknesses provides guidance for improving curriculum and instruction; (3) data on individual student performance informs decisions on meeting learning needs, such as placement in programs designed to alleviate skill deficiencies; and (4) test results provide the public with an accurate accounting of how well students are achieving.

The proposed approach to statewide assessment will have a direct impact on education in Oregon because it will send a clear message to local boards and educators about expectations for learning, while allowing districts the freedom to determine how students progress toward them. Districts should begin to align curriculum and instruction with these standards, continuous monitoring of student performance should occur (beginning in the primary grades), and students should be assured of learning necessary skills as they progress toward the standards.

Suggested Timeline

1984-85: Field test basic skills test for all 8th grade students, Spring 1985
1985-87: Annually test 8th grade students and field test assessment instruments at other grade levels
1987 & beyond: Conduct annual testing in areas of common learning and provide tests for program evaluation matching the curriculum revision and textbook selection schedules

4.0 Improving the Effectiveness of Teachers and Administrators

4.1 Develop Performance Evaluation Systems

Local school districts shall improve the effectiveness of performance evaluation systems for all teachers and administrators.

4.2 Establish Staff Development Programs

Local school districts shall develop and implement effective staff development programs related to district evaluation systems and school improvement plans.

4.3 Provide Assistance

The Oregon Department of Education shall provide assistance in efforts to improve the effectiveness of teachers and administrators by:

(a) developing models for staff compensation which recognize contributions to improved program and school performance, or assumption of increased responsibilities (e.g., career ladder plans)
(b) developing models for staff evaluation and staff development
(c) providing workshops, training and other staff development efforts
(d) developing a plan for seeking funding for scholarships and subsidies to encourage outstanding graduates to enter the teaching profession
(e) working with higher education to strengthen teacher and administrator training programs
(f) supporting research, development and dissemination activities focused on effective instruction.

Rationale

The quality of teachers is a concern that surfaces frequently in surveys of public perceptions of the schools. For example, in the 1983 annual Gallup poll on education, "difficulty getting good teachers" and "teachers' lack of interest" ranked fifth and sixth among the major problems confronting public schools. Quality of teaching was given a grade of C or below by 45 percent of the national sample. The survey also indicates public dissatisfaction with the level of teachers' salaries and the predominant compensation system. By nearly a two-to-one margin, the public favored basing a teacher's pay on the quality of his or her work, compared with paying all teachers on a standard-scale basis.
Clearly, public regard for education hinges in large part on the perceived effectiveness of school personnel. Many effective teachers and administrators in Oregon are committed to increasing their professional knowledge and skills. While these persons should be saluted, the State Board and Superintendent also believe the quality of instruction and school administration throughout the state can be enhanced by providing greater direction and opportunities for improvement. The actions listed above address the following issues and concerns:

• Nonsystematic or incomplete planning of evaluation and staff development.
• Cursory or formalistic evaluation rituals which result in no improvement in personnel performance.
• Unclear definitions of quality teaching or effective administration.
• All personnel not being evaluated, with many having little or no expectation of being helped by the process.
• Requests for help from teachers seen as admissions of weakness by some colleagues and administrators.


• The general feeling, supported by an inadequate compensation system and lack of growth opportunities for individuals, that an educational career will not be rewarding or worthwhile.

Solutions to these problems are not sought through formal mandates; they're more likely found in strong local evaluation systems, continued staff development and adequate compensation systems. The state's role is to provide leadership to promote high standards of quality in teaching and to assist districts in developing and implementing systematic evaluation and staff development programs.

Suggested Timeline

1984-85: Begin to develop and field test model evaluation, compensation and staff development programs
1985-87: Develop, evaluate and provide models, guidelines and assistance
1987 & beyond: Continue to provide technical assistance and update models and guidelines

5.0 Improving School Effectiveness

5.1 Establish Educational Standards

The State Board of Education shall redefine the educational standards used to evaluate schools and districts, with an emphasis on student performance.

5.2 Monitor State Standards

The Department of Education shall monitor the performance of Oregon school districts in meeting state standards and provide technical assistance to those districts needing help in meeting standards.

5.3 Develop School Profiles

In addition to the standardization program, the Department of Education shall furnish each school district with a periodic school profile to assist the district in its efforts for improvement. Profile information shall describe the school and its performance. The state will describe the basic format and content, with opportunities given to districts to add information of local interest.

5.4 Give Recognition for Excellence

The Department of Education shall develop a plan for recognition and awards to schools and districts for outstanding and exemplary educational programs which contribute to excellence for Oregon students. The Department of Education shall develop a plan for recognition and awards to individuals throughout Oregon who have made outstanding contributions to student achievement and educational excellence.

5.5 Encourage Local District Initiative

In order to encourage local district initiative in striving for excellence, the Department of Education shall:

• Develop plans for freeing districts from the constraints of standards which may inhibit creativity and initiative in developing innovative plans of action.
• Provide incentive, assistance and encouragement to a few districts willing to probe the frontiers of knowledge and practice.

Rationale

A commitment to continuing self-renewal and improvement is the hallmark of effective schools. A major ingredient in school improvement is the systematic monitoring of information on key performance variables, using such data to detect potential problems and taking corrective action. The intent of the actions listed above is to increase the capacity for local improvement by providing tools (e.g., school profiles), technical assistance and incentives. The State Board and Superintendent recognize that meaningful efforts to improve school effectiveness originate from within the local system, as opposed to being externally directed or mandated. Thus, the Department will focus its efforts on assisting districts to make effective use of school performance data and providing incentives and support for innovative practices.
Suggested Timeline

1984-85: Revise state standards to be consistent with Action Plan
1985-87: Develop and test profile, evaluation and school improvement models
1987 & beyond: Recognize and reward excellence and improve profile, evaluation and school improvement models; substitute school and program performance evaluation for much of the traditional evaluation of the means of schooling as the basis for standards compliance and school improvement

6.0 Increasing the Use of Educational and Communications Technology

6.1 The Department of Education shall plan and direct statewide activities to:

(a) Provide technology-based instructional materials by locating and distributing existing materials through a clearinghouse on educational technology and through the development of high priority new materials. Also, provide guidelines to assist schools in evaluating software designed for instructional delivery and management.


(b) Establish a comprehensive, readily accessible, statewide communications network for education.
(c) Encourage the establishment of partnerships among individuals, industries, school districts, and community college districts to pursue appropriate uses of technology in education.
(d) Develop guidelines for evaluating new technologies and providing models and training for educators to better understand the appropriateness of technology, and how it may be incorporated with other media in the instructional program.
(e) Assure that all students in Oregon have equal access to all available technology-based instruction, including instructional television and computer-based instruction.

6.2 The State Board of Education, working with all other appropriate state agencies, shall establish a council on educational technology to coordinate efforts to apply educational technology in Oregon schools.

Rationale

The use of technology in education can substantially contribute to educational excellence in Oregon by assuring the development of human potential; by providing equitable access to educational resources across the state; by providing equal opportunity for all races, ethnic groups, economic groups, and both sexes; and by freeing staff and administrative personnel to attend to what is educationally essential. However, introduction of technology into education requires the simultaneous development of three interdependent aspects: compatible hardware; effective, relevant software; and skillful staff. Any one of these alone is useless without the others. It is important to ensure that harmony, balance, human values, and equity are included as we develop these new tools for education.

However, in the fall of 1983, there was approximately one computer for every 75 students in grades kindergarten through 12 in the state. Although this ratio may be sufficient to provide students with an experience using the computer, it does not allow students and teachers to incorporate technology into the classroom. In addition, only about 30 percent of the teachers in the state feel literate in the use of technology. These factors, along with the need for more and better software, make the use of technology for delivery and management of instruction a long-term goal that will require continued cooperative efforts in supplying the technology and developing skills to use it.

Suggested Timeline

1984-85: Take initial steps to establish an electronic network and clearinghouse for technology
1985-87: Provide models, guidelines and training and increase courseware available for use with technology
1987 & beyond: Assure all students use and understand the impact of technology in their personal, social and work lives; assure teachers are able to use technology to manage and deliver instruction

7.0 Improving the Use of Instructional Time

7.1 Use Existing Time More Effectively

The Department of Education shall provide leadership, incentives, assistance and regulatory flexibility to encourage school districts to use existing instructional time more efficiently and effectively. It is the mutual responsibility of local and state agencies to free classrooms of interruptions and find creative approaches to more productive daily, weekly and annual school schedules and calendars.

7.2 Establish Minimum Instructional Days

The school district shall provide a minimum of 175 days of instruction annually. Time lost for temporary closures must be rescheduled by the school district.
Guidelines for length of the instructional day will be developed by the state.

Rationale

Several research studies in education have shown that the amount of time spent instructing students has a direct effect on how much students learn. This simple relationship has very significant implications for schools. Unnecessary interruptions rob students of the opportunity to increase their knowledge or skill. In addition, studies have indicated that the relevance of what is learned and the quality of presentation contribute to learning. Consequently, it is imperative that schools protect the time available for instruction and ensure it is quality time as well. Through an analysis of current practices, schools should be able to identify where potential problems exist. Also, the sharing of effective practices can assist schools to find better ways of allocating and utilizing instructional time.

Suggested Timeline

1984-85: Develop awareness guidelines and assessment tools for increasing productive use of time
1985-87: Consider rule changes for the 175-day school year and continue to work for reduction of classroom interruptions
1987 & beyond: Find, share and promote creative ways to use time


Strategy: Next Steps

• Determine the feasibility of establishing an Oregon foundation to obtain private funds for research, development and evaluation related to the improvement of school effectiveness and productivity.
• Determine the feasibility of establishing, as a nonprofit public corporation, an Oregon center for educational technology.

A special bond of cooperation must be fostered between schools and their communities to serve the best interests of students, the schools and the community as a whole. Groups at the local, regional and state levels must join forces to achieve excellence. The state can recognize and support technical assistance centers in providing information and assistance directly to schools or through districts, ESDs or consortiums. Regional and state coalitions and networks can emerge or be organized to provide fiscal, moral or technical support to meet school problems in the most effective and efficient way.

• Involve existing advisory committees in the development and implementation of the action plan and create new advisory committees as appropriate.
• Establish state and local education and work Councils for Action.


Oregon Appendix C

Proposed Standard 602
Proposed Standard 606


Proposed Standard 602
Individual Student

To ensure each student's educational success in school, school districts shall pay constant attention to individual student progress. Each district shall:

(1) Use test results, classroom work, grades, attendance, behavior and other evaluative information for identifying each student's educational progress, related to:
(a) Attainment of the Essential Learning Skills adopted by the State Board of Education,
(b) Attainment of the common knowledge and skills in instructional programs adopted by the State Board of Education,
(c) General educational progress in personal, social and career development, and
(d) Completion of graduation requirements;

(2) Record and maintain student records which allow for the review of test information, classroom information and other evaluative information to determine the instructional needs of each student;

(3) Adapt instruction and curriculum when the needs, interests and learning styles of each student indicate an adaptation is needed; and

(4) Report educational progress to parents and students at least annually on:
(a) Attainment of the Essential Learning Skills, and the common knowledge and skills adopted by the State Board of Education,
(b) Achievement toward the fulfillment of graduation requirements if appropriate, and
(c) General educational progress in personal, social and career development.

(5) Identify students who are having extreme difficulties in school, as indicated by:
(a) Erratic attendance;
(b) Academic problems leading to grade or credit deficits;
(c) Conduct or behavioral problems in school or out;
(d) Poor relationships with school personnel;
(e) Lack of good peer relationships; or
(f) Lack of self-esteem.

(6) Design educational programs or propose placement in alternative education programs to meet the needs of students identified as having extreme difficulties in school.

(7) Report at least annually to the local school board on the status and progress of students identified under section (5) of this rule.

(8) Report to the Department of Education in the annual School Level Fall Report (Form No. 581-3174) the number of students who are identified as dropouts under the following definition: A pupil who leaves a school, for any reason except death, before graduation or completion of a program of studies and without transferring to another school or educational program leading to a high school diploma or alternative certificate.


Proposed Standard 606
Instructional Program

To ensure continual improvement of instructional programs, school districts shall review test results and other evaluative information to identify levels of performance, to recognize deficiencies, and to plan needed improvement. Each district shall:

(1) Identify district, school and program needs by:
(a) Annually reviewing test results and other evaluative information collected for purposes of OAR 581-22-602;
(b) Conducting program evaluations periodically in language arts, mathematics, science, health education, social studies and vocational education. These evaluations should be consistent with state curriculum development and textbook selection timelines, and include the measurement of student performance on the appropriate common curriculum goals adopted by the State Board of Education;

(2) Implement district, school and program improvements as identified;

(3) Provide appropriate related staff development activities;

(4) Annually report test results to the community; and

(5) Annually report test results and progress on improvement plans to the Department of Education.


POLICY FOR TESTING IN OREGON
3125 Assessment

The basic purpose of educational assessment is to provide information that will help individuals make informed choices regarding educational alternatives. Assessment information is relevant to decisions made by students, parents, teachers, school and district administrators, state level decision makers, and citizens. The following policy is put forth to guide state and local education agencies in their assessment activities.

I. Underlying Principles

The assessment policy of the State Board of Education is based on the following principles:

A. Educators at the classroom, school, district and state levels need adequate information to identify students' instructional needs and to guide instructional program efforts.
B. In order to inform decisionmakers, assessment information must be timely, relevant to the decision, and easy to understand.
C. The responsibility for interpreting and using assessment results belongs at the level at which decisions are made (i.e., individual student, classroom, school, district or state).
D. Citizens of the state should be informed about the performance of schools in order to be informed participants in resolving education issues.

II. Student Assessment

In the elementary grades the educational experience of most students is based on a fairly common and uniform curriculum. This experience begins to differ among students as they progress through school. At the high school level this differentiation begins to increase dramatically, when students pursue courses that relate to their personal and career goals and interests. Nevertheless, there is a core body of knowledge and skills that all students should learn through a K-12 schooling experience. Any student assessment program should recognize and accommodate both the common learning goals expected of students and their differing needs and interests.

In carrying out its role to insure that the state maintains a system of modern schools, the State Board of Education will establish the common learning goals that all students must achieve in order to graduate from high school. These outcomes will specify the knowledge, skills and abilities necessary to function as productive adults. The Board will also specify assessment procedures and the standards students must meet. In addition, students must meet unit of credit requirements for high school graduation, allowing for the differentiation in student needs and interests.

As students progress toward attaining the common knowledge, skills and abilities necessary for high school graduation, it is important that checkpoints be established to monitor students' progress. Teachers check on a student's progress


on a regular and frequent basis. Recognizing this ongoing monitoring system in schools, the state will establish several key points where a common system will be used to check students' progress. A critical checkpoint is at the transition from the elementary program to high school. At this point it is exceedingly important that students possess the requisite knowledge and skills to be successful in high school. The state will establish a performance standard at the eighth grade to identify students who may not be prepared for high school.

III. Program Assessment

To determine the effectiveness of instructional programs related to the Board's adopted common curriculum goals, student performance will be assessed statewide. These data will be used to identify curriculum strengths and weaknesses on a statewide basis and set targets for program improvement. Information from the assessment of the state's common curriculum goals will be reported to policy makers and the public to inform them of educational achievement in the state. In addition, local school districts will use assessment data in making needed program improvements and to convey to their public and the state the status of student achievement in their schools.

IV. State Standards

In order to insure that districts carry out their assessment responsibilities, the State Board of Education will adopt standards for public schools. These standards will be based on the most current research and knowledge of effective practices.

V. State Support

The Superintendent of Public Instruction will develop and maintain an ongoing program to assist local districts in implementing the assessment standards for elementary and secondary schools. This support will include sample assessment instruments, guidelines for their use and technical assistance in implementing a sound assessment program.


Oregon Appendix D

Revised Board Policy 3125
Long Range Testing Plan


SUMMARY OF PROPOSED STATE TESTING PLAN

STATE TESTING RESPONSIBILITIES

State-developed high school completion tests administered, beginning in grade 10. Purpose served:
- Assure that all students receiving a high school diploma possess required skills
- Improve instructional programs on a school and statewide basis
- Provide information to the public and state policymakers regarding the effectiveness of all public schools in the state

State tests administered to all 8th graders. Purpose served:
- Assure that all students who are not making satisfactory progress receive needed assistance
- Improve instructional programs in each school
- Improve instructional programs on a statewide basis
- Provide information to the public and state policymakers regarding the effectiveness of all public schools in the state

State tests administered to samples of students in grades 3, 5 and 11. Purpose served:
- Improve instructional programs on a statewide basis
- Provide information to the public and state policymakers regarding the effectiveness of all public schools in the state

DISTRICT TESTING RESPONSIBILITIES

Districts required to administer state-approved tests in grades 3 and 5; results reported to state. Purpose served:
- Assure that students who are not making satisfactory progress receive needed assistance
- Improve instructional programs in each school
- Provide information to the public and state policymakers regarding the effectiveness of all public schools in the state

District determines measures/methods for identifying students not making expected progress in grades K, 1, 2, 4, 6, 7 and 9. Purpose served:
- Assure that students who are not making satisfactory progress receive needed assistance
- Improve instructional programs in each school

District-determined measures for assessing program effectiveness.


TIMELINES FOR STATE TESTING

Activity and Timeline

1. Establish a state achievement scale at grades 3 and 5 for equating publishers' test information to state achievement scale. (Spring 1986)

2. Administer state-developed tests of Essential Learning Skills in reading, writing, mathematics and reasoning to a sample of 3rd, 5th and 8th graders. (Spring 1987)

3. Collect local test data from all schools at grades 3 and 5. (Spring 1987; annually thereafter)

4. Administer state-developed high school completion test in reading, writing, mathematics and reasoning to a sample of 12th graders to establish criteria for passing. (Spring 1987)

5. Administer state-developed test of Essential Learning Skills in reading, writing, mathematics and reasoning to a sample of 3rd and 5th and all 8th graders. (Spring 1988; annually thereafter)

6. Administer state-developed high school completion test to all 10th graders, to go into effect with the class of 1992. (Fall 1988; semi-annually thereafter)

7. Begin to add additional curriculum areas to state-developed tests to be given to samples of 3rd, 5th and 11th graders and all 8th graders according to the following schedule:
   English/Language Arts: Spring 1989
   Math/Science: Spring 1991
   Health: Spring 1991
   Social Studies: Spring 1993
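Activity 1 above calls for placing scores from different publishers' tests onto a single state achievement scale. The plan does not say which equating method Oregon intended to use; purely as an illustration of the general idea, the sketch below applies linear (mean-sigma) equating, in which scores on two tests are treated as equivalent when their standardized values match. All of the test names, means, and standard deviations here are hypothetical, not taken from the plan.

```python
def linear_equate(x, mean_pub, sd_pub, mean_state, sd_state):
    """Map raw score x from a publisher's test to the state scale by
    matching z-scores: (x - mean_pub)/sd_pub = (y - mean_state)/sd_state."""
    z = (x - mean_pub) / sd_pub
    return mean_state + z * sd_state

# Hypothetical grade 3 reading test: publisher scale mean 42, SD 8;
# hypothetical state achievement scale mean 250, SD 50.
print(linear_equate(50, mean_pub=42, sd_pub=8, mean_state=250, sd_state=50))
# 300.0 -- one SD above the publisher mean lands one SD above the state mean
```

Under a scheme like this, a district could report scores from any approved publisher's test in common state-scale units, which is what would make the grade 3 and 5 local data collected under Activity 3 comparable across districts.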


POLICY FOR TESTING IN OREGON
3125 Assessment

The basic purpose of educational assessment is to provide information that will help individuals make informed choices regarding educational alternatives. Assessment information is relevant to decisions made by students, parents, teachers, school and district administrators, state level decisionmakers, and citizens. The following policy is put forth to guide state and local education agencies in their assessment activities.

I. Underlying Principles

The assessment policy of the State Board of Education is based on the following principles:

A. Educators at the classroom, school, district and state levels need adequate information to identify students' instructional needs and to guide instructional program efforts.
B. In order to inform decisionmakers, assessment information must be timely, relevant to the decision, and easy to understand.
C. The responsibility for interpreting and using assessment results belongs at the level at which decisions are made (i.e., individual student, classroom, school, district or state).
D. Citizens of the state should be informed about the performance of schools in order to be informed participants in resolving education issues.

II. Student Assessment

In the elementary grades the educational experience of most students is based on a fairly common and uniform curriculum. This experience begins to differ among students as they progress through school. At the high school level this differentiation begins to increase dramatically, when students pursue courses that relate to their personal and career goals and interests. Nevertheless, there is a core body of knowledge and skills that all students should learn through a K-12 schooling experience. Any student assessment program should recognize and accommodate both the common learning goals expected of students and their differing needs and interests.

In carrying out its role to insure that the state maintains a system of modern schools, the State Board of Education will establish the common learning goals that all students must achieve in order to graduate from high school. These outcomes will specify the knowledge, skills and abilities necessary to function as productive adults. The Board will also specify assessment procedures and the standards students must meet. In addition, students must meet unit of credit requirements for high school graduation, allowing for the differentiation in student needs and interests.


As students progress toward attaining the common knowledge, skills and abilities necessary for high school graduation, it is important that checkpoints be established to monitor students' progress. Teachers check on a student's progress on a regular and frequent basis. Recognizing this ongoing monitoring system in schools, the state will establish several key points where a common system will be used to check students' progress. A critical checkpoint is at the transition from the elementary program to high school. At this point it is exceedingly important that students possess the requisite knowledge and skills to be successful in high school. The state will establish a performance standard at the eighth grade to identify students who may not be prepared for high school.

III. Program Assessment

To determine the effectiveness of instructional programs related to the Board's adopted common curriculum goals, student performance will be assessed statewide. These data will be used to identify curriculum strengths and weaknesses on a statewide basis and set targets for program improvement. Information from the assessment of the state's common curriculum goals will be reported to policymakers and the public to inform them of educational achievement in the state. In addition, local school districts will use assessment data in making needed program improvements and to convey to their public and the state the status of student achievement in their schools.

IV. State Standards

In order to insure that districts carry out their assessment responsibilities, the State Board of Education will adopt standards for public schools. These standards will be based on the most current research and knowledge of effective practices.

V. State Support

The Superintendent of Public Instruction will develop and maintain an ongoing program to assist local districts in implementing the assessment standards for elementary and secondary schools. This support will include sample assessment instruments, guidelines for their use and technical assistance in implementing a sound assessment program.


A BRIEF HISTORY OF TESTING POLICIES IN THE STATE OF TEXAS

Keith L. Cruse

December 31, 1985

Prepared Under Contract For
The Office of Technology Assessment
Congress of the United States


A Brief History of Testing Policies in the State of Texas

In the middle and late 1960s, the Texas Governor appointed a blue ribbon committee to study public education in the state and to develop policy statements which would provide a basis for improving the state system of public education. One aspect of the Texas Educational Development Study conducted by the Governor's Committee on Public School Education in Texas (1967) was a statewide assessment using the American College Testing (ACT) Program.

While Texas was reviewing the state system of public education, the Federal Government was in the midst of educational reform which was expressed in the Elementary and Secondary Education Act of 1965. This national legislation provided the impetus for states to install educational planning units in their state departments of education. Thus, the Texas Education Agency created the Office of Planning, which included the Division of Assessment and Evaluation. One predictable outcome of the interaction of the state and national educational efforts was that the new planning unit would conduct a study based on the Governor's Committee's previous work.

In May of 1972, the Texas Education Agency released a report on the 1971 Texas Achievement Appraisal Study. The Preface of that report summarizes the beginning status of a developing state testing policy:

The Texas Achievement Appraisal Study was conducted as a part of the continuing effort of the Texas Education Agency to assess the educational needs of Texas pupils. Although patterned after the 1967 study of the Governor's Committee on Public School Education, this activity was the first of its kind to be accomplished by the State agency. Based on a replication study of 69,000 Texas high school seniors, the report describes demographic information and test scores on the American College Test. The report was designed to assist educational leaders in improving the quality of Texas elementary and secondary public schools.


Immediately after reporting the ACT results, the state department of education began working cooperatively with a commercial testing company to explore potential benefits of standardized criterion referenced tests for use in large scale assessments. The primary motivation of the managers of the Texas Education Agency and the test company was to find an economical method of obtaining student performance data which was more useful for improving the quality of education. The traditional norm referenced tests in use were helpful in evaluating how a student, or a group of students, performed compared to others and to the nation, but seemed to lack the precision necessary to evaluate the achievement of specific learner objectives of priority concern to teachers, administrators, and policymakers, and thereby define the needed improvements in educational programs.

In 1973 and 1974, the state department conducted statewide assessments in reading and mathematics using criterion referenced tests. Multiple outcomes were achieved:

1. Statewide student performance data were available on specific learner objectives which were judged important by Texas educators.

2. Information was obtained on the usefulness of criterion referenced tests.

3. Discrepancies in student achievement between various subpopulations were quantified in specific learning areas.

4. Educators in Texas began to communicate about how (and where) specific learner objectives were taught, at both the local and state levels.
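The contrast drawn above between norm referenced and criterion referenced score interpretations can be made concrete. The sketch below is only an illustration of the two kinds of report, not the Agency's actual procedure; the norm-group scores, the objective names, and the 75 percent mastery cut-off are all hypothetical.

```python
from bisect import bisect_right

# Norm-referenced view: rank a student's total score against a norm group.
def percentile_rank(score, norm_scores):
    """Percent of the norm group scoring at or below this score."""
    ranked = sorted(norm_scores)
    return 100 * bisect_right(ranked, score) / len(ranked)

# Criterion-referenced view: a mastery decision for each learner objective.
def objective_mastery(items_correct, items_total, cutoff=0.75):
    """Per-objective mastery decision at a hypothetical 75 percent cut."""
    return {obj: items_correct[obj] / items_total[obj] >= cutoff
            for obj in items_total}

norms = [30, 35, 38, 40, 42, 45, 47, 50, 52, 55]   # hypothetical norm group totals
print(percentile_rank(45, norms))                   # 60.0 -- "how does this student compare?"
print(objective_mastery({"place value": 4, "fractions": 2},
                        {"place value": 5, "fractions": 5}))
# {'place value': True, 'fractions': False} -- "which objectives need work?"
```

The first report answers the comparative question that norm referenced tests were built for; the second answers the instructional question, objective by objective, which is the precision the Agency and the test company were seeking.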


The remaining years in the 1970s offered more opportunities for the Texas Education Agency to explore assessment strategies for a state testing policy. In 1975, the Agency conducted a statewide assessment of the status of career education. This study was largely a result of the combination of national concerns in career education and the state level interests in the area of testing. The unique features of this program provide some insights on the emerging state policies on testing:

1. A funding plan was designed by Texas Education Agency managers which used both state and federal resources.

2. A commercial contractor developed unique tests to measure career education outcomes (objectives) which were developed for Texas students through an extensive grass roots program conducted across the state.

3. The work of selecting learner outcomes and building criterion referenced tests was accomplished cooperatively by the state department of education, selected regional education service centers, several urban school districts, and a paid contractor.

4. The primary objectives sought through these assessment activities related to diagnosing student learning deficiencies, identifying educational program weaknesses, and evaluating statewide student performance. A sampling approach was used which provided no district or campus information.

As a result of the first decade of student testing activities (initiated and conducted by the Texas Education Agency) and an increasing awareness on the part of the state legislature that there was little empirical evidence of the effectiveness of public


education in Texas, the legislature appropriated $3,000,000 to the state department for the development of a better management information base. Some of the funds were used to plan and develop a computerized database for education. The remaining resources were used to conduct statewide student performance assessments.

In 1978 and 1979, the Texas Education Agency requested that school districts cooperate in seven separate statewide student testing programs. Participation was consistently close to 100 percent in the Texas Assessment Project. Custom built criterion referenced tests were administered in mathematics and reading. Released test items from the National Assessment of Educational Progress program were used to develop tests in writing, economics education, and citizenship. Commercial norm referenced tests in reading and mathematics were also administered. By 1979, the Texas Education Agency had a separate division with full-time responsibility for providing student performance data. More information on student achievement was available to educators and the public than ever before in the history of public education in Texas.

As one reviews the history of student testing in Texas, the benefits of an early start and a wide variety of assessment experiences become evident. Throughout all the previous assessment activities, the state department was making comprehensive reports to all school districts, the press, the public, and the state legislature. In 1979, an informed Texas legislature passed a law to establish the first state mandated testing program. Although no specific line item in the budget provided funding for the program, the State Board of Education and the managers of the department developed a funding plan. The law was implemented in a manner to comply with the full intent of the legislature. Criterion referenced tests in the basic skills of mathematics, reading, and writing were administered to all students in grades 3, 5, and 9. Students in grades 10, 11, and 12 who did not master the tests were offered the opportunity to retake the tests each time they were administered.


From 1980 to 1985, the state mandated testing program, the Texas Assessment of Basic Skills (TABS), used criterion referenced tests to provide information on student achievement. The TABS program offered the first opportunity for students across the state to take the same test. Individual students, parents, and teachers received mastery information on each basic skill (8 to 12 per test). The program avoided classroom summaries but provided data on campuses and districts which, by law, were made public. Comparisons between districts were made. Attention of the public was focused on student learning to an unprecedented degree.

The results were dramatic. Local school officials identified successful instructional strategies and employed them in such a manner that they increased student achievement statewide. Not only did overall student performance increase, but the differences in student performance between minority and majority subpopulations decreased. During the six year period, the state legislature amended the law to make it mandatory for students in grades 10, 11, and 12 to retake the tests if they had not demonstrated mastery in grade 9. In 1980, only 70 per cent of the grade 9 students mastered the mathematics test, while in 1985 the mathematics test was mastered by 84 per cent of grade 9 students. Mastery on the reading test improved from 70 to 78 per cent over the same time period.

The TABS program did not begin without the usual resistance to change associated with such large scale educational efforts. Some teacher groups resisted the idea of a state program meeting the needs of different types of students. Supporters of the program responded by pointing out that these were basic skills, necessary for all students in the opinion of a cross section of Texas educators. Some school administrators resisted the idea of comparing schools because of diverse student populations in terms of ethnic composition, family wealth, and limited English proficiency. The reporting strategies used for TABS always included demographic information as a part of reporting student performance. Standard reports for each school district included three separate aggregations: (1) all students, (2) limited English proficient students, and (3) non-limited


English proficient students. Minority organizations monitored the program carefully. Every effort was made to ensure that the TABS tests were free from bias, and the results of those efforts were made public. As the results of minority groups improving at a faster rate than majority students became apparent, little opposition was left.

If the TABS program is to be judged successful, why was it so widely accepted? There is no simple answer, but it is important to understand that the entire program was tied to state compensatory efforts. State compensatory funds were given to school districts on the basis of eligibility for free or reduced-price lunches, but the law required those districts to use the funds to develop and implement appropriate remedial programs for students who did not master the basic skills measured by the TABS program. Thus, the testing program was put in the perspective of a needs assessment strategy for state compensatory efforts. The supporters of the program were those educators and public policy makers who wanted documentation of educational needs and empirical evidence of educational improvement if it occurred. At the end of the program, there was no organized group which offered public opposition to the program.

The true evaluation of the program should probably be based on what happened to it. In 1984, the Texas Legislature, in special session, passed one of the most comprehensive educational reform laws in the history of public education. House Bill 72 changed the construction of the State Board of Education, altered the way that education was financed, required students to make 70 to pass a course, implemented a no pass, no play rule in Texas schools, required teachers to pass competency tests, and revised the TABS program. The TABS language was moved from the compensatory education section of the Texas Education Code to a separate section of its own. The law changed the student assessment program from the largest of its kind to twice that size. The new program, the Texas Educational Assessment of Minimum Skills (TEAMS), tests every student in grades 1, 3, 5, 7, 9, and 11, approximately 1.6 million students annually.


If there is a central theme to this history of testing policies, it is the concept of a policy evolution. In fact, a proper title would be "The Evolution of Student Testing Policies in Texas." Obviously, the complexity of any government/society function such as that of a state educational system for public education makes it impossible to identify simple cause-effect relationships. However, several factors should be listed for their contribution to the present testing policy in Texas:

1. A national report card for education repeatedly ranks Texas low.

2. The current Texas Governor based much of his campaign on improved quality of education in the state.

3. A blue ribbon committee appointed by the Governor recommended sweeping reforms for the state system of public education.

4. The chairman of the Governor's Committee was a very influential citizen who was committed to higher standards for education in Texas.

5. State policy makers had over a decade of experience to inform their state policy decisions in the area of student testing.

In October of 1985, the first TEAMS tests were administered to over 191,000 high school juniors. A review of the new state testing program reveals some significant changes from the TABS program:

1. The State Board of Education is required to set passing standards for the total test at all grades.


2. High school students must pass an Exit Level test (first administered in grade 11) in order to receive a high school diploma. The opportunity for retesting is provided for students failing the test.

3. Students are now tested at each odd numbered grade: 1, 3, 5, 7, 9, and 11.

4. The Texas Education Agency is directed to provide national comparative data on the TEAMS tests in order to monitor the state's rank in the nation.

5. Texas school districts must provide remedial instruction to those students not passing the TEAMS tests.

The Chairman of the State Board of Education and the Texas Commissioner of Education have both repeatedly made public statements to the effect that the TEAMS program will be the primary basis for evaluating the education reforms called for in House Bill 72. A public policy has evolved, in the light of a concern for Texas to compete successfully in the world market place, which indicates a desire to provide adequate resources for a quality system of public education, along with an accountability component which includes a state testing program to monitor the progress of educational reform in Texas.

