Citation
- Permanent Link:
- https://ufdc.ufl.edu/IR00000047/00001
Material Information
- Title:
- Effects of two different assessment tools on secondary instrumental music students' achievement and motivation
- Creator:
- Olsen, David N. ( Dissertant )
Brophy, Timothy ( Thesis advisor )
dos Santos, Silvio ( Reviewer )
- Publisher:
- University of Florida
- Publication Date:
- 2009
- Language:
- English
Subjects
- Subjects / Keywords:
- Instrumental music ( jstor )
- Motivation ( jstor )
- Music education ( jstor )
- Music students ( jstor )
- Music teachers ( jstor )
- Musical motives ( jstor )
- Musical performance ( jstor )
- Orchestras ( jstor )
- Student motivation ( jstor )
- Student surveys ( jstor )
- Dissertations, Academic -- UF -- Music Education
- Music Education
- Thesis, M.M.
- Alachua County ( local )
Notes
- Abstract:
- The purpose of this study was to explore the effect of two contrasting assessment tools on secondary instrumental music students' achievement and motivation. The study was guided by the following questions: What is the effect on secondary instrumental music students' achievement using a rubric-based assessment tool versus a "pass-off"-based assessment tool? What is the effect of the assessment tool used on a student's motivation to practice? Using two different assessment tools, the study examined 65 students in Grades 6, 7, and 8, ranging in ability from beginning through advanced. These students from the researcher's class were randomly assigned through a systematic process to the researcher-created "pass-off" grading system (n = 31) or the researcher-created rubric-based "contract" grading system (n = 34). The study was conducted over the course of the first 9-week grading period of the school year. All students participated in a pre- and post-test to attain "achievement" score data and a survey to collect background information on the students' music education and experience and their thoughts and feelings about their assessment procedure. Results indicated there was no statistically significant (p < .05) effect of assessment tool on student achievement. Data analysis did indicate significance in the post-test achievement score as a function of assessment type and ensemble (p = .04).
- Publication Status:
- Published
- Thesis:
- MM in Music Education conferred Fall 2009.
- Issuing Body:
- Supervisory committee: Timothy S. Brophy [chair] and Silvio dos Santos [member]
- Acquisition:
- Music Education terminal project
- General Note:
- Includes bibliographical information.
- General Note:
- Document formatted into pages; contains 55 pages.
- General Note:
- Includes vita.
Record Information
- Source Institution:
- University of Florida Institutional Repository
- Holding Location:
- University of Florida
- Rights Management:
- Permissions granted to the University of Florida Institutional Repository and University of Florida Digital Collections to allow use by the submitter. All rights reserved by the author.
- Resource Identifier:
- 908766670 ( OCLC )
Aggregation Information
- UFIR:
- Institutional Repository at the University of Florida (IR@UF)
- UFETD:
- University of Florida Theses & Dissertations
- IUF:
- University of Florida
Full Text
THE EFFECT OF TWO DIFFERENT ASSESSMENT TOOLS ON
SECONDARY INSTRUMENTAL MUSIC STUDENTS'
ACHIEVEMENT AND MOTIVATION
By
DAVID N. OLSEN
SUPERVISORY COMMITTEE
TIMOTHY S. BROPHY, CHAIR
SILVIO DOS SANTOS, MEMBER
A PROJECT IN LIEU OF THESIS PRESENTED TO THE COLLEGE OF FINE ARTS
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF MUSIC
UNIVERSITY OF FLORIDA
2009
2009 David N. Olsen
To Kristin and my family.
ACKNOWLEDGMENTS
I thank the 2008-09 master of music education faculty for their hard work and
dedication to seeing the first class of summer masters in music through to their goal.
Drs. Timothy S. Brophy, Charles R. Hoffer, Russell L. Robinson, Paul Richards, Silvio
dos Santos, Kara Dawson, and Miriam Zach are educators who care and are truly
dedicated to the future of music education. I thank my parents for getting me involved
and supporting me in music. Finally, I thank Kristin for being patient and confident
through the long summers apart and the falls of on-line classes.
TABLE OF CONTENTS

page

ACKNOWLEDGMENTS ... 4
LIST OF TABLES ... 7
LIST OF ABBREVIATIONS ... 8
ABSTRACT ... 10

CHAPTER

1 INTRODUCTION ... 12
Significance of the Problem ... 12
Purpose of the Study ... 13
Delimitations ... 13

2 REVIEW OF LITERATURE ... 14
Introduction ... 14
Philosophical Rationale ... 14
Theoretical Background ... 15
Assessment Development ... 17
Motivation ... 18

3 METHODOLOGY AND PROCEDURES ... 20
Introduction ... 20
Study Procedures ... 20
Data Collection and Analysis ... 22
Pre- and Post-Test ... 22
Survey ... 23

4 RESULTS ... 25
Pre- and Post-Test Results ... 25
Survey Results ... 28
Contract ... 29
Pass-Offs ... 31

5 DISCUSSION ... 34
Introduction ... 34
Achievement ... 35
Motivation ... 37
Conclusion ... 39

APPENDIX

A PRE- AND POST-TEST SCORING SHEET ... 41
B SURVEY QUESTIONS ... 43
C EXAMPLE CONTRACT SHEET AND SCORING RUBRIC ... 46
D EXAMPLE PASS-OFF SHEET ... 48
E UF IRB PROTOCOL LETTER OF APPROVAL ... 49
F ORANGE COUNTY PUBLIC SCHOOLS' REQUEST FOR RESEARCH FORM ... 50
G PARENT CONSENT FORM ... 51
H STUDENT ASSENT FORM ... 52

LIST OF REFERENCES ... 53
BIOGRAPHICAL SKETCH ... 55
LIST OF TABLES
Table page
Table 1 Analysis of Variance for Post-Test Total Scores as a Function of Assessment Type and Ensemble ... 26
Table 2 Post-Hoc Comparison of Post-Test Total Scores of All Assessment Types as a Function of Ensemble ... 27
Table 3 Analysis of Variance for Average Minutes of Practice as a Function of Assessment Type and Ensemble ... 28
Table 4 Post-Hoc Comparison of Average Minutes of Practice Time per Week as a Function of All Assessment Types and Ensemble ... 28
Table 5 Quantitative Survey Results for Contract Students (n = 32) ... 30
Table 6 Qualitative Survey Results for Contract Students (n = 32) ... 31
Table 7 Quantitative Survey Results for Pass-Off Students (n = 28) ... 32
Table 8 Qualitative Survey Results for Pass-Off Students (n = 28) ... 33
LIST OF ABBREVIATIONS

Achievement: a specific musical accomplishment, often the result of specific instruction, such as reading notation or performing a specific piece (Radocy & Boyle, 2003, p. 385).

Contracts: a researcher-developed assessment instrument based on a four-level rating scale: Excellent (5), Good (4), Fair (3), and Poor (2). There are eight scored categories: Tone, Rhythm, Note Recognition, Musicianship, Right Hand, Left Hand, Bowing, and Posture. The contract itself is a list of items that the student must perform during the grading period, presented together with the grading rubric. When a student is prepared to perform one of the requirements on the contract, the teacher uses the scoring criteria to assign the grade. Students are allowed to replay any tests during the grading period to receive their desired final grade.

Pass-Offs: a researcher-developed assessment instrument based on a pass (A) or fail (F) rating scale. The pass-off sheet is a list of items that the student must perform during the grading period. "Passing off" refers to the fact that the student performer meets the high standard that would result in a perfect score: the student performs the selection "flawlessly." Flaws include rhythmic errors, note errors, bowing errors, and posture problems. This system requires the assessor to maintain a high musical standard when listening and assigning scores. Feedback is given verbally to students when they do not perform to the set standard. When students are prepared to perform one of the requirements on the pass-off sheet, they perform for the assessor. If students do not "pass off," they may try again at another time until they do. As presented to the participating students, pass-offs must be completed in a specific order to achieve the desired grade. For example, requirements #1 through #5 must be completed to receive a "C"; if a student completed #1 through #4 and then #8, the grade would in effect be a "D."

Generals: a category that exists on both the pass-off sheet and the contract sheet. It is open for student choice and possible development outside of the "performance-based" grading procedure. Included in this category are 18 different choices ranging from composing to journal writing and concert attendance. For some students this is a difficult category to complete, while others have no problem at all. Much of its completion has to do with personal initiative.

Motivation: 1. The act or an instance of motivating. 2. The state or condition of being motivated. 3. Something that motivates; inducement, incentive (Motivation, n.d.).
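Because the two grading rules above are procedural, a minimal sketch may make the contrast concrete. The Python below is illustrative only: the rubric aggregation (a simple average) and every letter-grade cutoff except the "requirements #1 through #5 earn a C" rule are assumptions, not the thesis's published scoring method.

```python
RUBRIC_LEVELS = {"Excellent": 5, "Good": 4, "Fair": 3, "Poor": 2}
RUBRIC_CATEGORIES = ["Tone", "Rhythm", "Note Recognition", "Musicianship",
                     "Right Hand", "Left Hand", "Bowing", "Posture"]

def contract_score(ratings):
    """Average the four-level ratings over the eight contract categories
    (averaging is an assumption; the thesis defines the scale, not the
    aggregation)."""
    return sum(RUBRIC_LEVELS[ratings[c]] for c in RUBRIC_CATEGORIES) / 8

def pass_off_grade(passed, cutoffs=((4, "D"), (5, "C"), (7, "B"), (9, "A"))):
    """Only the unbroken run of requirements starting at #1 counts toward
    the grade; cutoffs other than 5 -> "C" are hypothetical."""
    n = 0
    while (n + 1) in passed:  # length of the consecutive prefix #1..#n
        n += 1
    grade = "F"
    for needed, letter in cutoffs:
        if n >= needed:
            grade = letter
    return grade

# The text's example: completing #1-4 and then #8 is "in effect" a D.
print(pass_off_grade({1, 2, 3, 4, 8}))  # -> D
```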
Abstract of Project in Lieu of Thesis Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Master of Music
THE EFFECT OF TWO DIFFERENT ASSESSMENT TOOLS ON SECONDARY
INSTRUMENTAL MUSIC STUDENTS' ACHIEVEMENT AND MOTIVATION
By
David N. Olsen
December 2009
Chair: Timothy S. Brophy
Member: Silvio dos Santos
Major: Music Education
The purpose of this study was to explore the effect of two contrasting assessment
tools on secondary instrumental music students' achievement and motivation. The study
was guided by the following questions: What is the effect on secondary instrumental
music students' achievement using a rubric-based assessment tool versus a "pass-off"
based assessment tool? What is the effect of the assessment tool used on a student's
motivation to practice? Using two different assessment tools, the study examined 65
students in Grades 6, 7, and 8, ranging in ability from beginning through advanced. These students from the researcher's class were randomly assigned through a systematic process to the researcher-created "pass-off" grading system (n = 31) or the researcher-created rubric-based "contract" grading system (n = 34). The study was
conducted over the course of the first 9-week grading period of the school year. All
students participated in a pre- and post-test to attain "achievement" score data and a
survey to collect background information on the students' music education, experience,
and their thoughts and feelings about their assessment procedure. Results indicated
there was no statistically significant (p < .05) effect of assessment tool on student
achievement. Data analysis did indicate significance in the post-test achievement score
as a function of assessment type and ensemble (p = .04).
CHAPTER 1
INTRODUCTION
Finding the most appropriate way to assess music students, and music in general,
has been a quest for music teachers at every grade level. How does one grade a
subject, such as music, that can be so subjective and open to personal opinion and
preference? One person's opinion about a performance can be vastly different from
another's. In music education there are different opinions on performance quality. These
opinions are formed by teachers' experiences, their personal abilities, and education.
There are ways, though, to help even the playing field for all educators and make the
subjective assessment of a musical performance more objective. Using a more
objective assessment process, such as an assessment procedure delineated by
well-defined rubrics, it may be possible to achieve consistency among music educators'
assessment scores. The process may also inspire students to achieve advanced goals
by using the quantifiable guidelines in their grading procedure that are easy to read,
understand, and put into practice.
Significance of the Problem
An ongoing challenge for instrumental music teachers is finding a defined and
practical way of formally assessing students in the secondary instrumental music
ensemble. Bell & Bell (2003) found that many times music teachers grade students
based on non-musical attributes such as attendance, behavior, effort, and attitude.
Antmann (2007) found in his study of "successful" Florida middle school band programs
that many middle school band directors are assigning grades to their students based on
playing tests, concert attendance, conduct, and participation. Since student assessment
has become an increasingly major component of the public education system, it is even more imperative that music teachers create and use meaningful forms of assessment in this subject. Asmus (1999) and Bell & Bell (2003) noted why music teachers must develop meaningful forms of assessment: documenting students' learning of specific skills in a standards-based curriculum helps demonstrate music education's worth to the community at large.
Purpose of the Study
The purpose of this study is to explore the effect of two contrasting assessment
tools on secondary instrumental music students' achievement and motivation. This
study is guided by the following questions:
* What is the effect of a rubric-based assessment tool versus a "pass-off" based
assessment tool on secondary instrumental music students' achievement?
* What is the effect of the assessment tool used on a student's motivation to
practice?
Delimitations
The following were not accounted for in this study: gender, ethnicity, or socioeconomic
background of students participating in the study.
CHAPTER 2
REVIEW OF LITERATURE
Introduction
Recent trends, such as the No Child Left Behind Act (NCLB), have brought our
school systems into the national spotlight and focused attention on national standards
and assessment. With federal, state, and local governments focusing on assessment,
student achievement, and accountability, it is only appropriate to align our music
curriculum with these ideas to remain a justifiable subject in our society (Asmus, 1999;
Bell & Bell, 2003; Willoughby, Feifs, Baenen, & Grimes, 1995). NCLB has established
the arts as part of our education system's "core curriculum" (H.R. 1, 2002). Because of
the arts' inclusion as core curriculum, it is even more essential to develop meaningful
assessment procedures to demonstrate music's importance to a child's education. In
this review, the researcher explores philosophical and theoretical rationales for education and learning, assessment development, and student motivation.
Philosophical Rationale
Ornstein and Hunkins (2009) provide an introduction to the four major philosophies that have influenced education in the United States: idealism, realism, pragmatism, and existentialism. They go on to describe four "agreed-upon" educational philosophies: perennialism, essentialism, progressivism, and reconstructionism. Ornstein and Hunkins (2009) also indicate that these educational philosophies have their roots in one or more of the previously mentioned major philosophies. This researcher's philosophy of education aligns with a pragmatic essentialist view in that the purpose of education is to impart knowledge, skills, and values in students to make them self-sufficient, contributing members of society. The pragmatist believes in cultivating critical thinking in students and using the scientific process (Ornstein & Hunkins, 2009, p. 37). The essentialist believes in teaching the core set of basic skills and knowledge (Ornstein & Hunkins, 2009, p. 56). Knowing the foundations of these educational philosophies assists in the development of one's own philosophy of education, which in turn drives curriculum development and choices in assessment procedures.
Theoretical Background
Learning theories can be viewed in a traditional sense and categorized into two
camps: behavioral and cognitive (Abeles, Hoffer, & Klotman, 1995; Radocy & Boyle, 2003). Radocy and Boyle go on to write that learning is defined as "an observable change in behavior, due to experience, which is not attributable to anything else" (p. 396). Different forms of learning in the behaviorist camp include classical conditioning,
established by Russian physiologist I. P. Pavlov, and operant conditioning, associated
with works by B. F. Skinner.
In classical conditioning, also known as Pavlovian conditioning, an unconditioned
stimulus which elicits an unconditioned response is preceded by a conditioned stimulus
to the point that the unconditioned response becomes a conditioned response when the
unconditioned stimulus is removed and replaced with the conditioned stimulus. A classic
example of this type of conditioning is in the experiments Pavlov performed with dogs.
Pavlov exposed dogs to meat powder (unconditioned stimulus) and the dogs would
salivate (unconditioned response). Pavlov then rang a bell (conditioned stimulus)
preceding each exposure to the meat powder and the dogs would salivate. Over time,
the ringing of the bell made the dogs salivate (conditioned response) even when there
was no meat powder presented. In operant conditioning, a desired response is made stronger through selective reinforcement. Radocy and Boyle give the example: "if an encaged pigeon pecks at a particular spot and receives food as a consequence, the pigeon is more likely to peck at that spot again and can learn to do it when requiring food" (2003, p. 399).
Cognitive learning theories base learning on the organization and reorganization of learning structures (Radocy & Boyle, 2003). Leading names in this area of research include Bruner, Kohler, and Piaget. Theoretical views in this camp stem from and include Gestalt theory. Gestalt theorists are primarily interested in perception, or understanding the whole, and hold that information and concepts are learned through the organization of this perception (Radocy & Boyle, 2003). Swiss biologist Piaget has had a
great impact on child development theory with his well-known Stage Theory (Abeles et
al., 1995). His theory presents four theoretical stages of child development:
sensorimotor stage, preoperational stage, concrete operational thought, and finally
formal operational thought. J. Bruner, an American psychologist, also has developed a
theory on child development that is divided into three, less rigid, stages: enactive,
iconic, and symbolic. Studying these child development theories could assist teachers in
understanding the mental development and the readiness to learn of the particular
students they are teaching and in turn, can sequence their curriculum to best fit the
students' learning process (Abeles et al., 1995).
Applications of both learning theories find their way into the music classroom.
Behavioral theories help with maintaining general order in the classroom and proper
behavior. This theory can also be used in general rehearsal techniques as the teacher
is giving verbal feedback to the students about their performance to increase the
likelihood of the same response. This can be categorized as a form of operant
conditioning (Abeles et al., 1995). Cognitive theories help teachers organize information
and curriculum into segments that can be taught to students and then analyzed to see if
they have been learned and to what degree they have been learned. One such method
that organizes information and then uses behavioral techniques to assess skill
acquisition is Bloom's Taxonomy of the Cognitive Domain (Abeles et al., 1995).
Assessment Development
Developing quality assessment methods presents its own challenges (Baker,
Aschbacher, Niemi, & Sato, 1992). Some teachers believe intensely that developing
formal assessments will take away from the abstract nature of the art (Asmus, 1999a; Willoughby et al., 1995). Baker, Aschbacher, Niemi, and Sato (1992) conducted a five-year
research study on constructing an assessment method that would thoroughly test
cognition of material taught. Their study explored the construction of accurate
performance-based assessments. Assessments should measure students' knowledge
of skills and verify that they have acquired the concepts taught in class. Assessment
using rubrics helps guide the teacher in pacing instruction, presenting curriculum, and
setting performance standards (Asmus, 1999a). Asmus goes on to state that having
well-thought-out learning objectives helps guide the teacher and the students through
the curriculum and to the assessment method. Using rubrics can also help the students
being assessed by giving them clear guidelines as to what is being assessed and what
they can do to perform better in the future (Asmus, 1999b). Willoughby et al. (1995)
write that rubrics document student progress and supply teachers with useful
information when communicating with the principal, parents, and students. They also
state that assessing an arts course can be subjective and that having a method of formal assessment helps objectify the results. Many school districts, as well as state and national associations such as The National Association for Music Education (MENC), have curriculum resources and guides that provide a good starting point for targeting learning objectives and developing a sequence of instruction (Asmus, 1999a).
Motivation
Abeles et al. (1995, p. 212) define motivation as "the energy that a learner employs when changing a behavior." They go on to state that educators primarily focus on a student's secondary or psychological drives, which "include fear, love, frustration, curiosity, the need for stimulation, [and] the need for achievement" (p. 213).
Assessment should be a motivational teaching tool (Baker, Aschbacher, Niemi, & Sato,
1992). In his study, Asmus (1986) focused on the opinions of the students and their
ideas of success and failure pertaining to music class. Asmus used a 10-item
open-ended questionnaire to collect data from the students and then conceptualized
those data using Weiner's Attribution Theory. Weiner's Attribution Theory is a
framework that assumes that people try to determine why people do what they do
(Learning Theories Knowledgebase, 2009). When people succeed at something, they tend to attribute that success to their own skill, and when they fail, they tend to attribute that failure to some external cause.
Assessment using rubrics gives students the power to self-assess. The guidelines, or descriptors, a key element of the rubric chart (Asmus, 1999b), act as a guide in individuals' practice. Hewitt (1991) found the research literature relating to self-assessment of musical performance inconclusive because of the lack of study in this area, but self-assessment did have a positive effect on students' practice attitudes. Hewitt mentions that students who received training in how to self-assess were more positive about music, their classes, and their teacher (p. 309). When students use a rubric to guide their practice, their attitude toward practice may improve. Asmus (1999a) notes that having a mapped-out sequence of learning can lead to increased student motivation, in turn enhancing knowledge and skill.
CHAPTER 3
METHODOLOGY AND PROCEDURES
Introduction
This action research was based on a mixed-method research approach with a dominant-status sequential design, in which the researcher collected both quantitative and qualitative data (QUAN → qual). Permission for this study was acquired from the
University of Florida Institutional Review Board (Appendix E). Permission was also
acquired from the researcher's school district's Accountability, Research, and
Assessment Department (Appendix F). The researcher used students from the
researcher's class who volunteered to participate. Parents completed consent forms
(Appendix G) and students completed assent forms (Appendix H) in order to take part in
the study. The study was conducted during the first 9-week grading period (beginning
September 7th and concluding on October 27th) of the 2009-2010 school year. Sixty-five
students volunteered to participate: 28 beginners, 12 intermediate, and 25 advanced
students in Grades 6, 7, and 8.
Study Procedures
Students participating in this research were assigned to their assessment method through systematic sampling within each class. Each class roster was alphabetized by last name. The researcher then chose a random starting place in each list and placed every third student into an assessment procedure: either the contract system or the pass-off system. During the course of the 9-week grading period, students were given three
opportunities during their class period to test using their assessment procedure. Each of
the students tested by performing their musical selection for the entire class. Students
on the rubric-based contract received a numerical grade based on how they performed
and based on the standards outlined in the rubric. Students using the pass-off system
either played their excerpt "flawlessly" and "passed-off" or were stopped at the first flaw
and told that they did not pass-off. All students were given the opportunity to stay after
school or come to school early to perform their excerpts again for the grade they
desired.
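As a concrete illustration, here is a minimal Python sketch of one plausible reading of this assignment procedure (alphabetize, pick a random start, step through the roster by threes, alternating the two conditions). The function names and the alternation rule are assumptions; the thesis describes the procedure only in prose, and alternation is inferred from the near-equal group sizes (31 and 34).

```python
import random

def systematic_order(n, start, step):
    """Visit indices in steps of `step` from `start`; when a pass wraps
    onto an already-visited index, slide forward one position so every
    student is eventually reached."""
    seen, order, i = set(), [], start
    while len(order) < n:
        if i % n in seen:
            i += 1
            continue
        seen.add(i % n)
        order.append(i % n)
        i += step
    return order

def assign_conditions(roster, step=3, seed=None):
    """Alphabetize by last name, then alternate the two grading systems
    along the systematic visiting order."""
    rng = random.Random(seed)
    ordered = sorted(roster, key=lambda name: name.split()[-1].lower())
    visits = systematic_order(len(ordered), rng.randrange(len(ordered)), step)
    groups = ("contract", "pass-off")
    return {ordered[i]: groups[k % 2] for k, i in enumerate(visits)}
```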
Requirements for beginning orchestra students' tests were as follows:
* The D Major scale, one octave with arpeggio, pizzicato, memorized
* Essential Elements 2000, Book 1, exercises #9, 19, 22, and 34
* Winning Rhythms charts 1, 2, 3, and 4 (students were required to count a randomly selected line from each chart)
* Complete 3 of 18 "generals."

Requirements for intermediate orchestra students' tests were as follows:
* The D and G Major scales, two octaves (where appropriate for their instrument) with arpeggio, memorized
* Essential Elements 2000, Book 2, exercises #36 and 47, and Essential Technique 2000, Book 3, exercises 21 and 22 as one exercise and 23 and 24 as one exercise
* Winning Rhythms charts 10, 11, 12, and 14 (students were required to count a randomly selected line from each chart)
* Complete 3 of 18 "generals."

Requirements for advanced orchestra students' tests were as follows:
* The D, G, and C Major scales, two octaves (where appropriate for their instrument) with arpeggio, memorized
* Essential Technique 2000, Book 3, exercises 21 and 22 as one exercise, #35, and #47 and 48 as one exercise
* Winning Rhythms charts 14, 15, 16, and 17 (students were required to count a randomly selected line from each chart)
* Complete 3 of 18 "generals."
Data Collection and Analysis
Data were collected through a pre- and post-test procedure and a concluding
survey with which the researcher gathered background information regarding students'
musical experience and their experiences and feelings concerning their assessment
procedure.
Pre- and Post-Test
The pre- and post-test consisted of each student recording two of their required
materials, a "prepared piece" and a scale, and also a sight-reading exercise. The
beginning students' required materials were exercise # 34 and the memorized D Major
scale with arpeggio (pizzicato). The intermediate and advanced students' required
materials were exercises #21 and 22 as a continuous line and the memorized D Major
scale two octaves with arpeggio (bowed). For the pre-test, students were only told that
they were to play a prepared piece, a scale, and a sight-reading exercise. The students
were not informed what those items would actually consist of until they entered the
recording area. Once the students recorded, they were informed that the selections
constituting the pre-test would make up the post-test and the only element that would
change would be the sight-reading exercise. The pre-test occurred a few weeks into the
9-week grading period to give all students the opportunity to begin work on all exercises
required of them for their grading procedure. The post-test occurred after the end of the
grading period, when all grades had been finalized. Students again recorded the
prepared piece and scale mentioned previously and a slightly different sight-reading
than before. Recordings were made in the researcher's office using a laptop, an
external microphone, and the computer program Audacity. Results of the pre- and post-
test were analyzed by the researcher using an instrument based on the
"Woodwind/Brass Solo" evaluation form (Saunders & Holahan, 1997) (Appendix A).
This form, while used in almost its complete state, was modified to work with string
performers rather than wind performers. Elements such as breath control and
articulation were replaced with bowing and bow control. Other elements such as
embouchure formation and observable posture elements were eliminated from scoring.
Survey
The survey was constructed by the researcher using the on-line survey creator
Survey Monkey (Appendix B). The information collected was used to gain background
information about the students' education in music. The information collected in the
survey included their time spent in music class during school, the number of months
they took private lessons on their orchestra instrument and/or piano, the average
number of minutes per week they spent practicing their instrument during the grading
period, the number of months the student had been playing an orchestra instrument,
and their thoughts and opinions on their experience during the grading period as it
related to their assigned assessment procedure. The survey was administered during
each student's 46-minute class period in the school's computer lab. The students were
given written instructions on how to access the Survey Monkey survey. Students who
needed help raised their hands and were helped by the researcher. Results of the
survey yielded both quantitative and qualitative data.
Other data collected included the assigned assessment procedure (pass-offs or
contract), their grade level, their age in months, their instrument (violin, viola, cello, or
bass), and their ensemble level (beginning, intermediate, advanced).
CHAPTER 4
RESULTS
Pre- and Post-Test Results
Analyses of variance for pre- and post-test total scores as a function of assessment type were conducted. For the pre-test as a function of assessment method, the results indicated that minutes of music instruction was highly significant (p = .005), along with months of playing the instrument (p = .03), in predicting a student's score. This test also indicated that assessment type was not significant (p = .70). The post-test as a function of assessment type yielded slightly different results: assessment type was now significant (p = .04); months of playing an orchestra instrument was still significant (p = .04); and minutes of music instruction was no longer significant (p = .80). Observing the effects of assessment type and ensemble together yielded a statistically interesting result. An analysis of variance of the pre-test total scores using two independent variables (ensemble and assessment type) indicated that minutes of music instruction in school was significant in predicting the scores students received (p = .01). An analysis of variance of the post-test total scores with the same two independent variables indicated that ensemble, while not statistically significant (p = .22), was a stronger predictor than minutes of music instruction (p = .97), which was no longer significant.
Analysis of variance using assessment type combined with ensemble revealed interesting statistics (Table 1). The post-test total score as a function of assessment type alone was not statistically significant (p = .29). The post-test total score as a function of the ensemble in which the students were enrolled was significant (p = .02). The interaction of assessment type and ensemble was also statistically significant (p = .04).
Table 1 Analysis of Variance for Post-Test Total Scores as a Function of Assessment
Type and Ensemble
Source df F p
Assessment Type 1 1.14 .29
Ensemble 2 4.22 .02*
Assessment Type × Ensemble 2 3.29 .04*
Error 59
Total 64
Note: *p < .05
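For readers who want to run this kind of analysis themselves, here is a minimal sketch using Python's statsmodels; the thesis does not say which software it used, and the data file and column names (post_total, assessment, ensemble) are hypothetical.

```python
# Two-way ANOVA in the shape of Table 1. File and column names are
# hypothetical; the thesis does not publish its analysis code.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("scores.csv")  # one row per student

# C(...) marks a predictor as categorical; '*' expands to both main
# effects plus the Assessment Type x Ensemble interaction.
model = ols("post_total ~ C(assessment) * C(ensemble)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # df, F, and p for each source
```

Pairwise follow-ups like those in Table 2 could then be produced with a post-hoc procedure such as statsmodels' pairwise_tukeyhsd, though the thesis does not state which post-hoc test was used.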
The mean post-test total score in beginning orchestra was 73.96. When observing the beginners' post-test total scores delineated by assessment type, it was found that the students who used the contract had a mean score of 81.63, while those on the pass-offs received a mean score of 64.88. For the intermediate orchestra as a whole,
the mean score for the post-test total score was 77.54. Students on the contract had a
mean score of 78.21 and students on the pass-offs earned a mean score of 76.60. In
the advanced orchestra, the mean post-test total score as a group was 85.78. The
students who were using the contract earned a mean score of 83.21, while those on
pass-offs earned a mean score of 88.15. Table 2 shows the post-hoc comparison of the
post-test total scores of all assessment types as a function of the ensemble in which the
student is enrolled.
Table 2 Post-Hoc Comparison of Post-Test Total Scores of all Assessment Types as a
Function of Ensemble
Assessment Type Ensemble Difference of Means p
Pass-Offs
Beginning -16.75 .07
Intermediate -1.61 1.00
Advanced 4.95 .96
Note: All comparison types are based on post-test total score for that ensemble's
contract.
Observing the effects of assessment type on the average minutes of practice per
week also yielded some interesting results. The mean number of minutes per week for
students on the contract was 88.5 minutes while those on the pass-off assessment
practiced a mean of 149.5 minutes, a 51.3% difference. Table 3 shows data indicating
the significance of the assessment type and ensemble on average practice time in
minutes. Breaking it down further and analyzing those assessment type effects
pertaining to each ensemble member's practice time yields much the same result in that
the pass-off students practiced longer: Beginning students on contract = 69.5 minutes
vs. pass-offs = 87.7 minutes, a difference of 23.3%; Intermediate students on contract =
88.3 minutes vs. pass-offs = 116.0 minutes, a difference of 27.1%; Advanced students
on contract = 112.5 minutes vs. pass-offs = 224.2 minutes, a difference of 66.3%. Table 4 shows average practice time in minutes as a function of assessment type and ensemble. Assessment type had a significant effect on average weekly practice time only in the advanced orchestra.
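The percentages above behave like symmetric percent differences (the gap between the two group means relative to their average); the thesis does not state its formula, but that convention matches the quoted figures to within rounding of the reported means. A quick check:

```python
def pct_difference(a, b):
    """Symmetric percent difference: |a - b| relative to the mean of a and b."""
    return abs(a - b) / ((a + b) / 2) * 100

print(round(pct_difference(88.5, 149.5), 1))   # 51.3, contract vs. pass-off overall
print(round(pct_difference(112.5, 224.2), 1))  # 66.3, advanced students
```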
Table 3 Analysis of Variance for Average Minutes of Practice as a Function of
Assessment Type and Ensemble
Source df F p
Assessment Type 1 6.93 .01*
Ensemble 2 9.87 .00*
Assessment Type × Ensemble 2 2.85 .07
Error 59
Total 64
Note: *p < .05
Table 4 Post-Hoc Comparison of Average Minutes of Practice Time per Week as a
Function of All Assessment Types and Ensemble
Assessment Type Ensemble Difference of Means p
Pass-Offs
Beginning 18.23 .98
Intermediate 27.71 .99
Advanced 111.73 .01*
Note: All comparison types are based on student reported average practice time per
week for that ensemble's contract participants.
*p < .05
Survey Results
Much of the survey was used to collect information to assist with the pre- and
post-test results. Information was also collected regarding students' feelings and
thoughts about their assessment procedure. All 65 students participating in the research
completed the survey information that was necessary for analyzing the pre- and post-
test result data. That information included how many months they had been taking
private lessons on their orchestra instrument, how many months they had been taking
piano lessons, on average how many minutes per week they practiced, and how many
minutes they had a music class each day. Sixty-one of the 65 students completed the
remainder of the survey, which consisted of questions concerning their thoughts and
feelings about their assessment procedure. Some students skipped various questions
for reasons unknown to the researcher.
Contract
Table 5 shows the quantitative results gained through the survey as answered by
the students using the contract assessment procedure. Questions are listed with the
responses the students had to choose from. The first two questions allowed multiple
answers while the last five only allowed one possible answer.
Table 5 Quantitative Survey Results for Contract Students (n = 32)
What was your FIRST feeling(s) about the testing method for which you Responses
were chosen? Mark all answers that apply.
Happy 72.2% (21)
Sad 0.0% (0)
Encouraged 41.4% (12)
Discouraged 0.0% (0)
Did not care either way 34.5% (10)
Other 31.0% (9)
Skipped question (3)
How did you feel after you took your FIRST test using your testing
method? Mark all answers that apply.
Happy 59.4% (19)
Sad 0.0% (0)
Shocked 9.4% (3)
Encouraged 37.5% (12)
Discouraged 0.0% (0)
Did not care either way 31.3% (10)
Skipped question (0)
After you took a test, how did you feel?
Encouraged to continue 96.6% (28)
Discouraged to continue 3.4% (1)
Skipped the question (3)
My testing method made me practice more
I agree 93.8% (30)
I disagree 6.3% (2)
Skipped the question (0)
My testing method made me WANT to practice
I agree 75.0% (24)
I disagree 25.0% (8)
Skipped the question (0)
My testing method made me NOT want to practice
I agree 6.3% (2)
I disagree 93.8% (30)
Skipped the question (0)
My testing method helped me become a better player on my orchestra
instrument
I agree 83.9% (26)
I disagree 16.1% (5)
Skipped the question (1)
Analyzing the qualitative data collected in the survey concerning the contract
revealed themes. Data in Table 6 consists of patterns found in the student written
responses to the following questions about their assessment procedure: What were
three things that you liked about the way you were tested? What were three things that
you did not like about the way you were tested? In a few words, write any final thoughts,
feelings, and/or experiences about your testing experience.
Table 6 Qualitative Survey Results for Contract Students (n = 32)
Theme Data
"Like about the Contract" Could see what was done wrong and how
to fix it.
Improve grade by trying again.
Order of what you completed did not
matter.
Get a grade no matter what.
Test when you wanted to.
"Dislike about the Contract" Testing in front of the class
Inability to make a perfect score.
"Practice Attitude" None.
"Final thoughts" Helped make me a better player.
Could always do it again.
Could always get a higher grade.
Pass-Offs
Data in Table 7 are the quantitative results gained through the survey as answered
by the pass-off students. Questions are listed with the responses the students had to
choose from. The first two questions allowed multiple answers while the last five only
allowed one possible answer.
Table 7 Quantitative Survey Results for Pass-Off Students (n = 28)
What was your FIRST feeling(s) about the testing method for which you Responses
were chosen? Mark all answers that apply.
Happy 27.3% (6)
Sad 13.6% (3)
Encouraged 31.8% (7)
Discouraged 18.2% (4)
Did not care either way 31.8% (7)
Other 50.0% (11)
Skipped question (6)
How did you feel after you took your FIRST test using your testing
method? Mark all answers that apply.
Happy 48.1% (13)
Sad 18.5% (5)
Shocked 40.7% (11)
Encouraged 29.6% (8)
Discouraged 25.9% (7)
Did not care either way 3.7% (1)
Skipped question (1)
After you took a test, how did you feel?
Encouraged to continue 85.2% (23)
Discouraged to continue 22.2% (6)
Skipped the question (1)
My testing method made me practice more
I agree 92.6% (25)
I disagree 7.4% (2)
Skipped the question (1)
My testing method made me WANT to practice
I agree 77.8% (21)
I disagree 22.2% (6)
Skipped the question (1)
My testing method made me NOT want to practice
I agree 22.2% (6)
I disagree 77.8% (21)
Skipped the question (1)
My testing method helped me become a better player on my orchestra
instrument
I agree 88.9% (24)
I disagree 11.1% (3)
Skipped the question (1)
Analyzing the qualitative data collected in the survey concerning pass-offs also
revealed themes. Data in Table 8 consists of patterns found in the students' written
responses to the following questions: What were three things that you liked about the
way you were tested? What were three things that you did not like about the way you
were tested? In a few words, write any final thoughts, feelings, and/or experiences
about your testing experience.
Table 8 Qualitative Survey Results for Pass-Off Students (n = 28)
Theme Data
"Like about pass-offs" It was quick.
You did not really have to worry about your
grade
Made me want to practice more so I could
pass-off the first time.
Get a "perfect" score.
It was fun
Improve your grade by trying again.
Test when you wanted to.
"Dislike about pass-offs" Keep doing it until got it right for a good
grade.
It was hard to pass off- Required
Perfection.
More pressure (stress)
No passing = Zero
Order of grades.
One mistake = Fail
Testing in front of class
Took a longer time
"Practice Attitude" Encouraged to practice
"Final thoughts" It was too hard.
Made me practice more.
CHAPTER 5
DISCUSSION
Introduction
This study began with an invitation to the researcher's 140 middle school
orchestra students to participate. Of those 140 students, 75 students fully completed the
consent and assent forms appropriately and began the study. Over the course of the 9-
week grading period there was a 10-student reduction due to various reasons: extended
absences due to the flu, students moving away, and general absences that caused
those students not to complete all parts of the pre- and post-test treatments or the
survey. Therefore, the data collected on those 10 students were excluded from the
analysis.
Using their assessment method, all students were assessed during the school day three times during the 9-week grading period. For requirements the students had not completed, or wanted to score higher on, after those days, the students were responsible for testing before or after school with the researcher. Early on, the
researcher noticed that the beginners, who had no prior experience with either form of
grading, took to the pass-offs quickly. After the first testing day, many of the beginners
understood what the expectation was for something to be "passed-off." After the first
testing experience, most beginners passed-off on their first try. For the students who
had experience with the contract testing procedure in previous years, it took them
longer to understand what was expected of them to get "passed-off." Many of them
would attempt to pass-off a requirement up to five times before actually achieving their
goal. The researcher observed behavioral cues as to what the students were feeling in
their pass-off experience. Anger and frustration was apparent. Many times the
researcher feared those students using the pass-off assessment would quit the class
because of the high expectation and pressure to get a good grade (pressure applied by
the students and their parents/guardians), but reassured the students that this was only
a research project and that they would be taken care of at its conclusion. In the end,
and without informing the students so as not to effect their motivation to achieve their
chosen final grade, the researcher added some points to their final grade to help all
students who participated in the research project so that their participation in the
research was not to effect their grade in the course.
Achievement
What are the effects on secondary instrumental music students' achievement
using a rubric-based assessment tool versus a "pass-off" based assessment tool?
Statistically speaking there is no significant difference in the achievement of the
students regarding which assessment tool they used. However, when observing the
post-test total scores as they compare with what ensemble the students are involved in,
we see a slightly different result. In Table 1, the data indicate that the ensemble a student is in has a statistically significant effect on the post-test total score (p = .02). The data also indicate statistical significance in the post-test total score when assessment type and ensemble interact. This researcher observed that students in the beginning and intermediate orchestra classes who used the contract seemed to score higher on the post-test than those on pass-offs. Conversely, students in the advanced class who were participating on the pass-offs seemed to score higher on the post-test than those on the contract. It is the researcher's opinion
that the reason for this difference between the students in the two ensembles could be a
combination of things. One, students in the advanced class are placed there for a
particular reason. To be placed in that class the student had to audition and perform at
a higher level. This fact in itself would indicate a high level of personal achievement on
the student's part. Also, this would indicate that the student has a high motivation to
achieve, and possibly a certain "competitive spirit." The pass-off sheet would seem to
feed into these traits: high motivation, desire for a high level of personal achievement,
and the competitive spirit. For students to "pass-off," they must be motivated to try again
when they experience failure, and there is a competitive nature when they are "passing-
off" to see who can pass-off first. This researcher observed that many of his most
advanced players enjoyed the pass-off procedure not just because it was something
new, but because it was "challenging" and "fun." He also noted that across all classes, for students who were not as serious about performing, who took the class as something fun to do, or who were beginners just introduced to playing the instrument, pass-offs were a serious issue and very uninspiring.
It is essential to note the findings in Table 2: beginners using the contract assessment type received a higher total score on the post-test. Although this finding was not statistically significant (p = .07), the beginning orchestra students on the pass-offs scored lower on the post-test, with a difference of means of -16.75 points between the two groups. This difference has practical importance for educators in that we want the very best educational assessment tools to be used. Here the contract assessment does seem to work better than the pass-off assessment method. In the beginning and intermediate classes, the results indicate that
a rubric-based contract assessment method is beneficial. Students seemed to perform
better when given the opportunity to reflect and learn through the small incremental
achievements gained through the researcher's contract assessment process. Students
were given a score each time they performed, based on eight graded areas of their
performance. They then were able to reflect on their performance by referring to the
grading rubric. They could make adjustments to their performance and then take their
test again to gain a higher grade. This constant cycle of grading, reflection, guided
practice, and grading again lends itself well to the younger, less-experienced students
who need more guidance and care.
Motivation
Is a student's motivation to practice different based on the assessment tool used? The data gained through the survey and researcher observation were fascinating regarding motivation to practice. Considering all cases observed, students who were assigned the pass-off procedure practiced, on average, nearly twice as much as those using the contract (149.5 versus 88.5 minutes per week). This researcher believes the reason for this is the extrinsic motivation caused by the pressure of making a good grade in the class. They practiced, as one student put it, to "make the grade." This pressure was most likely placed on them by their parents/guardians and, in turn, themselves. Perhaps students who
were assigned the contract method had more goal-oriented practice that allowed them
to get more done in a shorter amount of time. It is the quality of practice versus the
quantity of practice. Students practicing with the contract can use the rubric to reflect on
concepts on which they need to work. For some of the more advanced students, it
seems to this researcher, participating in the pass-off assessment procedure was truly a
thrill, a game, and in fact intrinsic in nature. Many times, students who fall into this latter
group have crossed over in their musical experience from extrinsically motivating factors
to the sheer enjoyment of the subject and performing.
As far as motivation to practice is concerned, as music teachers we walk a fine
line. We must know our students, their lives, and their personalities. We must tailor our
curriculum to their individual needs. Most music students are motivated to make good
grades in our classes. Some are fine with making grades that are below an "A," while
others have to have "straight As" and will not rest until they achieve their goal. On a
pass-off system, it is easy for the teacher to set the expectation for passing off so high that students get frustrated and quit because they have no success. Then, over time, it is also easy for teachers to fall into the trap of lowering their expectations to allow the students to feel success. It is this researcher's belief that, for the average student, a system such as a rubric-based contract assessment is more
appropriate. The data in Table 8 show the thoughts and feelings of students on the pass-off system. They reported feeling more stressed about performing. The students also thought that it was hard compared to the contract system because one mistake meant a zero for a grade. With the rubric-based assessment, students can be motivated by receiving a numerical grade no matter how they perform. If students want to make a higher grade or increase their score, they have that option. For
other students who are fine being average, they will continue to play for any kind of
score they receive the first time. The difference between pass-offs and contracts here is
that all students will get some sort of grade on a contract rather than not being able to
play under pressure and just failing music class. Our goal as music educators is to
teach a life-long appreciation for our art and not push students away because they do
not perform something perfectly and then teach them to hate music.
Conclusion
It is not enough for teachers to take the route of pass-offs simply because it has the potential to create an outstanding music program: a program of students who can perform what they have been conditioned to play perfectly. All music teachers'
standards of performance and backgrounds are different. What qualifies as quality to
one educator may not be quality to another. Ultimately, in music, we are dealing with a
person's opinion of a performance rather than facts. Facts, on the other hand, cannot be
disputed. "You did not use the correct bowing." "You did not play with the correct
rhythm." These are concrete concepts that no one can dispute. Assessment that is
based in a rubric will be respected by other subject areas, and be understood and gain
the support of students, parents and administrators. With a tailored curriculum that is
rubric-based, students can learn many components of the performance. They learn to
properly develop all parts of their craft and technique, and through this development,
over an extended period of time, we as music educators will foster a life-long love for
music and perhaps continue the student's participation in and support of the arts as an
adult.
Teaching is an evolutionary process. Much of my career as a music teacher can
be described as such a process. My philosophies on assessment and curriculum have
changed dramatically since I began teaching ten years ago. What started out as
teaching to get through the day, teaching to the performance, and teaching to make
high scores at festivals and competitions with my groups has changed. It has developed
into teaching the child, teaching with an end goal in mind, and teaching with a
philosophical belief that in the end, music must be taught to enrich the child's life,
strengthen creativity and broaden appreciation of music. The assessment of students
should be based on the concepts and skills you feel are essential for them to learn.
Teachers should use their state's standards as a guide in developing such assessments. This will encourage and develop life-long learning skills in each student,
not just skills to "pass the test." Through this research, I have affirmed my belief in using
rubrics in assessment. I also have come to believe that when students are trained to
use a rubric-based assessment procedure, they will conceptualize and organize their practice time for more efficient learning. Rubrics can be essential guides in their practice and will instill good practice habits and routines for the future.
APPENDIX A
PRE- AND POST-TEST SCORING SHEET
[Appendix A reproduces the researcher's string instrument evaluation form, adapted from the Saunders and Holahan (1997) "Woodwind/Brass Solo" evaluation form; the scanned form did not survive text extraction. Its recoverable structure comprises three sections: Prepared Piece Evaluation, Scale Evaluation, and Sight Reading Evaluation, with descriptive point scales for criteria including Tone, Intonation, Tempo, Note Accuracy, Rhythm Accuracy, Interpretation, Technique/Articulation, and Musicianship.]
APPENDIX B
SURVEY QUESTIONS
Background: The following questions will gather background information about you,
your music knowledge, and music education. Please take your time and answer every
question honestly and to the best of your ability. If you have any questions about a
question, please ask Mr. Olsen.
* Please type your name: Last, First
* How many total months have you played your orchestra instrument? Please
include any elementary orchestra time even if you switched instruments when you
came to middle school.
* How many months have you taken private lessons on your orchestra instrument?
Please state your answer to the closest half month: Ex. 3 months or 3.5 months
(three months or three and a half months). Remember there are 12 months in a
year. Zero CAN be an answer.
* How many months have you taken piano lessons? Please state your answer to the
closest half month: Ex. 3 months or 3.5 months (three months or three and a half
months). Remember there are 12 months in a year. Zero CAN be an answer.
* On average, how many MINUTES per week do you practice your orchestra
instrument?
* How many minutes in the school day do you have music class?
o 46 minutes (one class period of music)
o 92 minutes (two class periods of music)
o 138 minutes (three class periods of music)
Questions about the testing procedure: The following questions deal with the process
that you went through during the testing procedure. Please take your time and answer
each question honestly and to the best of your ability. If you have any questions about a
question, please ask Mr. Olsen.
* Which testing method were you selected for?
* What was your FIRST feeling(s) about the testing method for which you were
chosen? Mark all answers that apply.
o Happy
o Sad
o Encouraged
o Discouraged
o Did not care either way
o Other (please specify): A blank text box was given for their own responses
* How did you feel after you took your FIRST test using your testing method? Mark
all answers that apply.
o Happy
o Sad
o Shocked
o Encouraged
o Discouraged
o Did not care either way
* In a few short statements, answer the question that applies to your test: Blank text
box for their own response.
o Pass-offs: How did it make you feel when you "passed-off?"
o Grading contract: How did it make you feel when you received your grade
on your test?
* What were three things that you liked about the way you were tested? Please
number your answers. A blank text box was given for their own responses.
* What were three things that you did not like about the way you were tested?
Please number your answers. A blank text box was given for their own responses.
* After you took a test, how did you feel?
o Encouraged to continue
o Discouraged to continue
* My testing method made me practice more.
o I agree
o I disagree
* My testing method made me WANT to practice.
o I agree
o I disagree
* My testing method made me NOT want to practice.
o I agree
o I disagree
* My testing method helped me become a better player on my orchestra instrument.
o I agree
o I disagree
* In a few words, write any final thoughts, feelings, and/or experiences about your
testing experience. A blank text box was given for their own responses.
APPENDIX C
EXAMPLE CONTRACT SHEET AND SCORING RUBRIC
[The scanned example contract sheet (Chain of Lakes Middle School Orchestra
Grade Contract) is illegible in this rendering and cannot be reproduced.]

Grading Rubric

[The scanned rubric is only partially legible. Its recoverable structure is a
four-level rating scale, scored 2 (Poor) through 5 (Excellent), applied to eight
categories; the descriptor ranges below are approximate.]

TONE: from an undeveloped sound for the instrument to a characteristic sound for the
instrument.
RHYTHM: from rhythms and pulse rarely steady, if at all, to accurate rhythms with a
steady pulse the entire time.
NOTE RECOGNITION: from does not name or finger notes correctly to names and
fingers all notes correctly in time.
MUSICIANSHIP: from ignoring expression marks in the music to observing all
expression marks in the music.
RIGHT HAND (thumb, pinky, depth of hold, relaxation): from incorrect placement with
a tense wrist and hand to correct placement with a relaxed wrist and hand.
LEFT HAND (fingers curved, thumb placement, arm/elbow angle, wrist): from
placement and angles rarely correct to placement and angles correct all of the time.
BOWING (direction, placement, weight, speed): from rarely correct, if at all, to correct
nearly all of the time.
POSTURE (sit/stand properly, instrument carriage, body): from poor position with a
body that is always tense to proper position all of the time with a relaxed body.
APPENDIX D
EXAMPLE PASS-OFF SHEET
Chain of Lakes Middle School
Orchestra Pass-Offs
Period: ________
Beginning Orchestra

[The scanned pass-off sheet is only partially legible; the reconstruction below is
approximate, and illegible passages are marked.]

The following criteria will be used to determine your grade for each grading period of
the 2009-2010 school year. As each item in a category is completed, you must have it
initialed by the teacher. Each option may only be used once each grading period. This
completed pass-off sheet will be accepted any time during the 9 weeks up to the date it
is due; each day late will lower the letter grade earned [wording approximate]. In case
of absence, your pass-off sheet will be required on your FIRST day back. NO
EXCEPTIONS. You will receive one pass-off sheet per 9 weeks. [The remaining
instructions, including a school website address, are illegible.] This grade contract,
when signed by Mr. Olsen, will confirm the completion of your pass-off obligations. ALL
sections must be completed.

GRADING PROCEDURES

All students are expected to have the 5 mandatory expectations: 1. Regular
attendance. 2. A positive attitude and discipline in the orchestra. 3. Normal progress in
the ensemble. 4. Attendance at all performances, including the Assessment and
Festival. 5. All items completed by the specified deadlines. [Wording approximate.]

PASS-OFFS MUST BE COMPLETED IN GRADE ORDER

GRADE C: 1. Complete the 5 mandatory expectations. 2. Essential Elements exercise
[number illegible]. 3. Winning Rhythms 1. 4. Winning Rhythms 2. 5. Winning Rhythms 3.
6. Winning Rhythms 4. 7. One General.
GRADE B: 1. Complete GRADE C. 2. Essential Elements #19. 3. Essential Elements
#22. 4. One General.
GRADE A: 1. Complete GRADE B. 2. Essential Elements #34. 3. D Major Scale
(memorized). 4. One General.

[A menu of "General" options follows; the scan is too degraded to reproduce in full, but
legible fragments include playing the seven major scales, completing a selected
listening and written response approved by Mr. Olsen, performing in a small ensemble,
playing at the Solo & Ensemble Festival, auditioning or participating in All-State or
All-County ensembles, accompanying on piano at the Solo & Ensemble Festival,
completing practice logs, taking private lessons, and other teacher-approved musical
activities.]

Due: October 2[day partially illegible], 2009
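
The grade-order rule is the one part of the sheet that behaves like a small algorithm: an
item passed off after a gap does not count toward a grade tier until the gap is closed.
The sketch below, in Python, is a hypothetical illustration of that rule, not software used
in the study (scoring was done on paper); the tier boundaries, the earned_grade helper,
and the default grade are all assumed for the example.

```python
# A minimal sketch of the "pass-offs must be completed in grade order" rule.
# Tier boundaries are illustrative only: here items 1-7 earn a C, items 8-10
# a B, and items 11-13 an A, mirroring the sheet's cumulative structure.
TIER_LAST_ITEM = [("C", 7), ("B", 10), ("A", 13)]

def earned_grade(passed_off: set[int]) -> str:
    """Return the letter grade earned by a set of passed-off item numbers."""
    # Length of the unbroken run of completed items starting at item 1;
    # anything passed off beyond a gap does not count toward a tier.
    run = 0
    while (run + 1) in passed_off:
        run += 1
    grade = "F"  # assumed default when no tier is fully complete
    for letter, last_item in TIER_LAST_ITEM:
        if run >= last_item:
            grade = letter
    return grade

print(earned_grade(set(range(1, 8))))   # all Grade C items -> "C"
print(earned_grade({1, 2, 3, 4, 9}))    # gap at item 5 -> no tier earned
print(earned_grade(set(range(1, 14))))  # everything -> "A"
```

Under a rule like this, finishing later items early buys nothing until the earlier gap is
closed, which is the point of the sheet's warning that pass-offs must be completed in
grade order.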
APPENDIX E
UF IRB PROTOCOL LETTER OF APPROVAL
Institutional Review Board    PO Box 112250
UNIVERSITY of FLORIDA    Gainesville, FL 32611-2250
352-392-0433 (Phone)
352-392-9234 (Fax)
irb2@ufl.edu
August 12, 2009
TO: David N. Olsen
2655 Sunny Bright Court
Winter Park, FL 32792
FROM: Ira S. Fischler, PhD, Chair
University of Florida
Institutional Review Board 02
SUBJECT: Approval of Protocol #2009-U-0825
TITLE: The Effect of Two Different Assessment Tools on Secondary Instrumental Music
Students' Achievement and Motivation
SPONSOR: None
I am pleased to advise you that the University of Florida Institutional Review Board has
recommended approval of this protocol. Based on its review, the UFIRB determined that this
research presents no more than minimal risk to participants. Your protocol was approved as
an expedited study under category 7: Research on individual or group characteristics or
behavior (including, but not limited to, research on perception, cognition, motivation,
identity, language, communication, cultural beliefs or practices, and social behavior) or
research employing survey, interview, oral history, focus group, program evaluation, human
factors evaluation, or quality assurance methodologies. Given your protocol, it is essential
that you obtain signed documentation of informed consent from the parent or legal guardian
of each participant. When it is feasible, you should obtain signatures from both parents.
Enclosed is the dated, IRB-approved informed consent to be used when recruiting participants
for the research.
It is essential that the parents/guardians of your minor participants sign a copy of
your approved informed consent that bears the IRB approval stamp and expiration
date.
If you wish to make any changes to this protocol, including the need to increase the number
of participants authorized, you must disclose your plans before you implement them so that
the Board can assess their impact on your protocol. In addition, you must report to the Board
any unexpected complications that affect your participants.
The approval of this study is valid through August 11, 2010. If you have not completed the
study by this date, please telephone our office (392-0433), and we will discuss the renewal
process with you. It is important that you keep your Department Chair informed about the
status of this research protocol.
ISF:dl
An Equal Opportunity Institution
APPENDIX F
ORANGE COUNTY PUBLIC SCHOOLS' REQUEST FOR RESEARCH FORM
Orange County Public Schools
RESEARCH REQUEST FORM

[The scanned form is only partially legible; the reconstruction below is approximate.]

Submit this form and a copy of your proposal to: Accountability, Research, and
Assessment, P.O. Box 271, Orlando, FL 32802-0271. Your research proposal should
include: Project Title; Purpose and Research Problem; Instruments; Procedures and
Proposed Data Analysis.

Requester's Name: David N. Olsen    Date: 8/11/09
Address: 2655 Sunny Bright Court, Winter Park, FL 32792    Phone: 352-281-2686
Institutional Affiliation: University of Florida
Project Director or Advisor: Timothy Brophy    Phone: 352-273-3193
Address: University of Florida School of Music, P.O. Box 117900, Gainesville,
Florida, 32611-7900
Degree Sought (check one): Associate / Bachelor's / Master's [checked] / Specialist /
Doctorate / Not Applicable
Project Title: The Effect of Two Different Assessment Tools on Secondary Instrumental
Music Students' Achievement and Motivation

ESTIMATED INVOLVEMENT
Personnel/Centers; Number; Amount of Time (Days, Hours, etc.); Specify/Describe
(Grades, Schools, Special Needs, etc.)
Students; 150; 9 weeks; Grades 6, 7, 8 (Orchestra)
Teachers; 1; 9 weeks; David N. Olsen (me)
Administrators; 0
Schools/Centers; 1; 9 weeks; Chain of Lakes Middle School, Chris Derrier, Prin.
Others (specify); 0

Specify possible benefits to students/school system: Students participating in this
research will most likely achieve at higher levels because of the approaches being
employed. The results could have a positive influence on assessment procedures
developed by teachers.

ASSURANCE
Using the proposed procedures and instrument, I hereby agree to conduct research in
accordance with the policies of the Orange County Public Schools. Deviations from the
approved procedures shall be cleared through the Senior Director of Accountability,
Research, and Assessment. Reports and materials shall be supplied as specified.

Requester's Signature: [signed]    [stamp: RECEIVED AUG 21, year illegible]
Approval Granted: Yes [checked] / No    Date: [illegible]
Signature of the Senior Director for Accountability, Research, and Assessment: [signed]

NOTE TO REQUESTER: When seeking approval at the school level, a copy of this form,
signed by the Senior Director, Accountability, Research, and Assessment, should be
shown to the school principal, who has the option to refuse participation depending
upon any school circumstance or condition. The original Research Request Form is
[remainder illegible].

FORM ID CPIB433-1.1 [revision marking illegible]
APPENDIX G
PARENT CONSENT FORM
University of Florida
Department of Fine Arts
School of Music
PO Box 117900
Gainesville, FL 32611-7900
Dear Parent/Guardian,
I am currently a graduate student in the School of Music at the University of Florida, conducting
research on the effects of assessment tools on secondary instrumental music students' achievement
and motivation. This research is under the supervision of Timothy S. Brophy, PhD. The purpose of this
study is to explore the effects of two contrasting assessment tools on secondary instrumental music
students' achievement and motivation. The results of the study may help music teachers construct
more meaningful forms of assessment that help increase students' achievement and inspire
motivation for students to practice. These results may not directly help your child today, but may
benefit students by producing higher achievement. With your permission, I would like to ask your
child to volunteer for this research.
Participating students will either use the grading contract that is standard to our
orchestra curriculum at Chain of Lakes Middle School or use a procedure that is widely employed by
other music teachers called "pass-offs." This research will take place during the first 9-week grading
term. For the remaining three grading terms, all students will return to our standard grading contract
system. Students will be placed into one of these two study groups by random selection. Each
student will participate in a pre- and post-achievement test to collect data that will be studied for
significance. Students will also participate in a short questionnaire to help observe trends in
motivation. All names of the students will be kept confidential, be replaced with a number, and only be
known to me, the researcher. Results will only be reported in the form of group data. While these
procedures will determine grades in orchestra class, it will be your student who earns the grade rather
than the assessment tool, and all students will be graded accurately and fairly. Nonparticipation in
this study will not affect your child's grades or placement in any programs.
You and your child have the right to withdraw consent for your child's participation at any time without
consequence. There are no known risks or immediate benefits to the participants. No compensation
is offered for participation. Group results of this study will be available in January upon request. If you
have any questions about this research protocol, please contact me at 407-909-5400 or my faculty
supervisor, Dr. Brophy, at 352-273-3193. Questions or concerns about your child's rights as a research
participant may be directed to the IRB02 office, University of Florida, Box 112250, Gainesville, FL
32611, (352) 392-0433.
David N. Olsen
I have read the procedure described above. I voluntarily give my consent for my child,
to participate in David N. Olsen's study on the effects of assessment tools on
achievement and motivation. I have received a copy of this description.
Parent / Guardian    Date        2nd Parent / Witness    Date
Approved by
University of Florida
Institutional Review Board 02
Protocol # 2009-U-0825
For Use Through 08-11-2010
APPENDIX H
STUDENT ASSENT FORM
Student Assent Form

Mr. Olsen is a graduate student at the University of Florida. He is studying different methods of
assessing students in instrumental music. He will be working with several students at Chain of
Lakes Middle School on this study, and you are invited to participate. If you decide to
participate, you will be assessed in one of two ways: either through a grading contract (how
students are normally graded at Chain of Lakes) or through pass-offs (another method
commonly used to grade music students). We will spend 9 weeks on this project. There are no
known risks to participants, and most students find both ways of being assessed equally fair.
You do not have to be in this study if you don't want to, and you can quit the study at any time,
even after you have started. Other than Mr. Olsen and your parents, no one will know your
assessment results. Whatever you decide, this will not affect your grades in class. Your
parent/guardian said it would be OK for you to participate. Would you like to participate in this
study?

__ Yes, I am willing to participate in this study.
__ No, I am not willing to participate in this study.

Student's Signature    Date

Approved by
University of Florida
Institutional Review Board 02
Protocol # 2009-U-0825
For Use Through 08-11-2010
LIST OF REFERENCES
Abeles, H. F., Hoffer, C. R., & Klotman, R. H. (1995). Foundations of Music Education
(2nd ed.). New York, NY: Schirmer Books.

Antmann, M. D. (2007). Assessment and Grading in the Beginning Band Classroom
(Master's thesis, Florida State University). Retrieved December 5, 2009, from
http://etd.lib.fsu.edu/theses/available/etd-07092007-172856/

Asmus, E. P. (1999a). Music Assessment Concepts. Music Educators Journal, 86(2,
Special Focus: Assessment in Music Education), 19-24.

Asmus, E. P. (1999b). Assessment. Retrieved August 2009, from the Music Assessment
Web Site at the University of Miami:
http://www.music.miami.edu/assessment/index.html

Asmus, E. P., Jr. (1986). Student Beliefs about the Causes of Success and Failure in
Music: A Study of Achievement Motivation. Journal of Research in Music Education,
34(4), 262-278.

Baker, E. L., Aschbacher, P. R., Niemi, D., & Sato, E. (1992). CRESST Performance
Assessment Models: Assessing Content Area Explanations. California: UCLA Center
for the Study of Evaluation. Retrieved from
http://www.cse.ucla.edu/products/guidebooks/cmodels.pdf

Bell, A., & Bell, M. (2003). Developing Authentic Assessment Methods from a Multiple
Intelligence Perspective. (ERIC Document Reproduction Service No. ED 479391)

Hewitt, M. P. (2001). The Effects of Modeling, Self-Evaluation, and Self-Listening on
Junior High Instrumentalists' Music Performance and Practice Attitude. Journal of
Research in Music Education, 49(4), 307-322. Retrieved from International Index to
Music Periodicals database.

Learning Theories Knowledgebase. (2009, December). Attribution Theory (Weiner) at
Learning-Theories.com. Retrieved December 3, 2009, from
http://www.learning-theories.com/weiners-attribution-theory.html

Motivation. (n.d.). Dictionary.com Unabridged. Retrieved November 23, 2009, from
Dictionary.com website: http://dictionary.reference.com/browse/Motivation

No Child Left Behind Act of 2001, Pub. L. No. 107-110, 115 Stat. 1425 (2002)
(enacted). [Electronic version]

Orenstein, A. C., & Hunkins, F. P. (2009). Curriculum: Foundations, Principles, and
Issues (5th ed.). Boston, MA: Pearson.

Radocy, R. E., & Boyle, J. D. (2003). Psychological Foundations of Musical Behavior
(4th ed.). Springfield, IL: Charles C. Thomas.

Saunders, T. C., & Holahan, J. M. (1997). Criteria-Specific Rating Scales in the
Evaluation of High School Instrumental Performance. Journal of Research in Music
Education, 45(2), 259-272. Retrieved from http://www.jstor.org/stable/3345585

Willoughby, M., Feifs, H., Baenen, N., & Grimes, E. (1995). Behind the Scenes:
Measuring Student Progress in the Arts and Beyond (Evaluation and Research Report
95E.06). Raleigh, NC: Evaluation and Research Department, Wake County Public
School System.
BIOGRAPHICAL SKETCH
A native of Marshfield, Maine, Mr. David N. Olsen received his Bachelor of Music
Education degree from the University of Florida in 2000. He then obtained a position in
Alachua County, Florida, as Director of Music at Lincoln Middle School, where he taught
for seven years. Mr. Olsen's bands and orchestras have consistently received high
honors at the district level and at other music festivals. Mr. Olsen served as Alachua
County Middle School Honor Band Chairman, the District Liaison for Secondary Music
in the Alachua County School System, and the School Board of Alachua County
Representative for the Alachua County Youth Orchestra Executive Board. He is on the
Board of Advisors for the National Adjudicators Middle School Festival. He also serves
as a guest clinician for numerous ensembles. Professional memberships include Florida
Bandmasters Association, Florida Music Educators' Association, Music Educators
National Conference, and the Florida Orchestra Association. Mr. Olsen currently
teaches orchestra at Chain of Lakes Middle School in Orlando, Florida, for the Orange
County Public School system, and he resides in Winter Park.
|
Full Text |
xml version 1.0 encoding UTF-8
REPORT xmlns http:www.fcla.edudlsmddaitss xmlns:xsi http:www.w3.org2001XMLSchema-instance xsi:schemaLocation http:www.fcla.edudlsmddaitssdaitssReport.xsd
INGEST IEID E5XKKXPUM_3GSOS0 INGEST_TIME 2017-07-14T23:22:13Z PACKAGE IR00000047_00001
AGREEMENT_INFO ACCOUNT UF PROJECT UFDC
FILES
PAGE 1
1 THE EFFECT OF TWO DIFFERENT ASSESSMENT TOOLS ON SECONDARY INSTRUMENTAL MUSIC STUDENTS’ ACHI E VEMENT AND MOTIVATION By DAVID N. OL S EN SUPERVISORY COMMITTEE TIMOTHY S. BROPHY, CHAIR SILVIO DOS SANTOS, MEMBER A PROJECT IN LIEU O F THESIS PRESENTED TO THE COLLEGE OF FINE ARTS OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF MUSIC UNIVERSITY OF FLORIDA 2009
PAGE 2
2 2009 David N. Olsen
PAGE 3
3 To Kristin and my family.
PAGE 4
4 ACKNOWLEDGMENTS I thank the 200809 master of music education faculty for their hard work and dedication to seeing the first class of summer masters in music through to their goal. Drs . Timothy S. Brophy, Charles R. Hoffer, Russell L. Robinson, Paul Richards, Silvio dos Santos , Kara Dawson, and Miriam Zach are educators who care and are truly dedicated to the future of music education. I thank my parents for getting me involved and supporting me in music. Finally, I thank Kristin for being patient and confident thr ough the long summers apart and the falls of online classes.
PAGE 5
5 TABLE OF CONTENTS page ACKNOWLEDGMENTS .................................................................................................. 4 LIST OF TABLES ............................................................................................................ 7 LIST OF ABBREVIATIONS ............................................................................................. 8 ABSTRACT ................................................................................................................... 10 CHAPTER 1 INTRODUCTION .................................................................................................... 12 Significance of the Problem .................................................................................... 12 Purpose of the Study .............................................................................................. 13 Delimitations ........................................................................................................... 13 2 REVIEW OF LITERATURE .................................................................................... 14 Introduction ............................................................................................................. 14 Philosophical Rational ............................................................................................. 14 Theoretical Background .......................................................................................... 15 Assessment Development ...................................................................................... 17 Motivation ............................................................................................................... 18 3 METHODOLOGY AND PROCEDURES ................................................................. 20 Introduction ............................................................................................................. 20 Study Procedures ................................................................................................... 20 Data Collection and Analysis .................................................................................. 22 Pre and Post Test ........................................................................................... 22 Survey .............................................................................................................. 23 4 RESULTS ............................................................................................................... 25 Pre and Post Test Results ..................................................................................... 25 Survey Results ........................................................................................................ 28 Contract ............................................................................................................ 29 Pass Offs .......................................................................................................... 31 5 DISCUSSION ......................................................................................................... 34 Introduction ............................................................................................................. 34 Achievement ........................................................................................................... 35 Motivation ............................................................................................................... 37
PAGE 6
6 Conclusion .............................................................................................................. 39 APPENDIX A PREAND POST TEST SCORING SHEET ........................................................... 41 B SURVEY QUESTIONS ........................................................................................... 43 C EXAMPLE CONTRACT SHEET AND SCORING RUBRIC .................................... 46 D EXAMPLE PASSOFF SHEET ............................................................................... 48 E UF IRB PROTOCOL LETTER OF APPRO VAL ...................................................... 49 F ORANGE COUNTY PUBLIC SCHOOLS’ REQUEST FOR RESEARCH FORM .... 50 G PARENT CONSENT FORM ................................................................................... 51 H STUDENT ASSENT FORM .................................................................................... 52 LIST OF REFERENCES ............................................................................................... 53 BIOGRAPHICAL SKETCH ............................................................................................ 55
PAGE 7
7 LIST OF TABLES Table page Table 1 Analysis of Variance for Post Test Total Scores as a Function of Assessment Type and Ensemble ....................................................................... 26 Table 2 Post Hoc Comparison of Post Test Total Scores of all Assessment Types as a Function of Ensemble ................................................................................. 27 Table 3 Analysis of Variance for Average Minutes of Practice as a Function of Assessment Type and Ensemble ....................................................................... 28 Table 4 Post Hoc Comparison of Average Minutes of Practice Time per Week as a Function of All Assessment Types and Ensemble .............................................. 28 Table 5 Quantitative Survey Results for Contract Students ( n = 32) ............................. 30 Table 6 Qualitative Survey Results for Contract Students ( n = 32) ............................... 31 Table 7 Quantitative Survey Results for Pass Off Students ( n = 28) ............................. 32 Table 8 Qualitative Survey Results for Pass Off Students ( n = 28) ............................... 33
PAGE 8
8 LIST OF ABBREVIATIONS Achievement a specific musical accomplishment, often the result of specific instruction. Reading notation, performing a specific piece. (Radocy & Boyle, 2003, p. 385). Contracts a researcher develop ed assessment ins trument based on a four level rating scale: Excellent (5), Good (4), Fair (3), and Poor (2). There are eight different scored categories: Tone, Rhythm, Note Recognition, Musicianship, Right Hand, Left Hand, Bowing, and Posture. The contrac t its elf is a list of items that the student must perform during the grading period. It consists of the list of items with the grading rubric. When a student is prepared to perform one of the requirement s on their contract , the teacher uses the scoring cri teria to assign the grade. Students are allowed to replay any tests to receive their desired final grade during the grading period. Pass Offs a researcher developed assessment instrument based on a pass (A) or fail (F) rating scale. The pass off sheet is a list of items that the student must perform during the grading period. “Passing off†refers to the fact that the student performer meets the high standard that would result in a perfect score. The student performs the selection “flawless ly .†Flaws would i nclude rhythmic errors, note errors, bowing errors, and posture problems. This system requires the assessor to have a high musical standard when listening and assigning scores. Feedback is given verbally to students when they do not perform to the set standard. When student s are prepared to perform one of the requirements on the pass off sheet , they perform for the assessor. If student s do not “ pass off †then they may try again at another time until they do “ pass off .†As presented to the students participating, pass offs must be completed in a specific order to achieve the desired grade. EX: Requirements # 1 through 5 must be completed to receive a “C.†If the student completed # 1 through 4 and then # 8 their grade would in effect be a “D.†Generals This c ategory exists on both the pass off sheet and the contract sheet . This category is open for student choice and possible developm ent outside of the “performancebased†grading procedure. Included in this category are 18 different choices ranging from compos ing to journal writing, and concert attendance. For some students this is a difficult category to complete while others have no problem at all. Much of its completion has to do with personal initiative.
PAGE 9
9 Motivation 1. T he act or an instance of motivating. 2. T he state or condition of being motivated. 3. S omething that motivates; inducement, incentive ( Motivation, N.D).
PAGE 10
10 Abstract of Project in Lieu of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Music THE EFFECT OF TWO DIFFERENT ASSESSMENT TOOLS ON SECONDARY INSTRUMENTAL MUSIC STUDENTS ’ ACHIEVEMENT AND MOTIVATION By David N. Olsen December 2009 Chair: Timothy S. Brophy Member : Silvio dos Santos M ajor: Music Education The purpose of this study was to explore the effect of two contrasting assessment tools on secondary instrumental music students’ achievement and motivation. The study was guided by the following questions: What is the effect on secondary instrumental music students’ achievement using a rubric based assessment tool versus a “pass off†based assessment tool? What is the effect of the assessment tool used on a student’s motivation to practice? Using two different assessment tools, the study examined 65 students in Grades 6, 7, and 8 and ranging in abilities from beginning through advanced. These students from the researcher’s class were randomly assigned through a systematic process to the researcher ’s created “pass off†grading system (n = 31) and the researcher’ screated rubric based “contract†grading system ( n = 34). The study was conducted over the course of the first 9week grading period of the school year. All students participated in a preand post test to attain “achievement†s core data and a survey to collect background information on the students ’ music education, experience, and their thoughts and feelings about their assessment procedure. Results indicated there was no statistically significant ( p < .05) effect of assessment tool on student
PAGE 11
11 achievement. D ata analysis did indicate significance in the post test achievement score as a function of assessment type and ensemble ( p = .04).
PAGE 12
12 CHAPTER 1 INTRODUCTION Finding the most appropriate way to assess music students, and music in general, has been a quest for music teachers at every grade level . How does one grade a subject, such as music, that can be so subjective and open to personal opinion and preference? One person’s opinion about a performance can be vastly different from another’s. In music education there are different opinions on performance quality. These opinions are formed by teachers’ experiences, their personal abilities, and education. There are ways, though, to help even the playing field for all educators and make the subjective assessment of a musical performance more objective. Using a more objective assessment process , such as an assessment procedure delineated by welldefined rubrics, it may be possible to achieve consistency among music educators ’ assessment scores . The process may also inspire students to achieve advanced goals by using the quantifiable guidelines in their grading procedure that are easy to read, understand, and put into practice. Significance of the Problem An ongoing challenge for instrumental music teachers is finding a defined and practical way of formally assessing students in the secondary instrumental music ensemble. Bell & Bell (2003) found that many times music teachers grade students based on nonmusical attributes such as attendance, behavior, effort, and attitude. Antmann (2007) found in his study of “successful†Florida middle school band programs that many middle school band directors are assigning grades to their students based on playing tests, concert attendance, conduct, and participation. Since s tudent assessment has become an increasingly major component in the public education system , it is even
PAGE 13
13 more imperative music teachers create and use meaningful forms of assessment in this subject. Asmus (1999) and Bell & Bell (2003) noted the importance of why music teachers must come up with meaningful ways of assessment and that documenting students’ learning of specific skills in a standards based curriculum helps demonstrate music education’s worth to the community at large. Purpose of the Study The purpose of this study is to explore the effect of two contrasting assessment tools on secondary instrumental music students’ achievement and motivation. This study is guided by the following questions: What is the effect of a rubric bas ed assessment tool versus a “pass off†based assessment tool on secondary instrumental music students’ achievement? What is the effect of the assessment tool used on a st udent’s motivation to practice? Delimitations The following were not accounted for in this study: g ender, ethnicity, or socioeconomic background of students participating in the study.
PAGE 14
14 CHAPTER 2 REVIEW OF LITERATURE Introduction Recent trends , such as the No Child Left Behind Act (NCLB), have brought our school systems into the national spotlight and focused attention on national standards and assessment . With federal, state, and local governments focusing on assessment, student achievement , and accountability , it is only appropriate to align our music curriculum with these ideas to remain a justifiable subject in our society (Asmus, 1999; Bell & Bell, 2003; Willoughby, Feifs, Baenen, & Grimes, 1995). NCLB has established the arts as part of our education system ’ s “core curriculum†(H.Res.1, 2002). Because of the arts ’ inclusion as core curr iculum , it is even more essential to develop meaningful assessment procedures to demonstrate music’s importance to a child’s education. In this review , the researcher explored philosophical and theoretical rationales about education and learning, and asses sment development and student motivation. Philosophical Rational Orenstein and Hunkins (2009) deliver an introduction to the four major philosophies that have influenced education in the United States : idealism, realism, pragmatism, and existentialism . They go on to w rite about four “agreedupon†educational philosophies . These educational philosophies are perennialism, essentialism, progressivism, and reconstructionism . Orenstein and Hunkins (2009) also indicate these educational philosophies have their roots in one or more of the previously mentioned major philosophies. T his researcher finds their philosophy of education aligned with a pragmatic essentialist view in that the purpose of education is to impart knowledge, skills, and values in students to mak e them self sufficient contributing
PAGE 15
15 members of society. The pragmatist believes in culturing critical thinking in students and using the scientific process (Orenstein & Hunkins, 2009, pg 37) . The essentialist believes in teaching the core set of basic ski lls and knowledge (Orenstein & Hunkins, 2009, pg. 56) . Knowing the foundation of educational philosophies will assist in the development of one’s own philosophy of education, which in turn drives curriculum development and choices in assessment procedures. Theoretical Background Learning theories can be viewed in a traditional sense and categorized into two camps: behavioral and cognitive (Abels, Hoffer & Klotman, 1995; Radocy & Boyle, 2003). Radocy and Boyle go on to write that learning is defined as “an observable change in behavior, due to experience, which is not attributable to anything else (p. 396). †Different forms of learning in the behaviorist camp include classical conditioning, established by Russian physiologist I. P. Pavlov, and operant conditioning, associated with works by B. F. Skinner. In c lassical conditioning, also known as Pavlovian conditioning, an unconditioned stimulus which elicits an unconditioned response is preceded by a conditioned stimulus to the point that the unconditioned response becomes a conditioned response when the unconditioned stimulus is removed and replaced with the conditioned stimulus. A classic example of this type of conditioning is in the experiments Pavlov performed with dogs. Pavlov exposed dogs to meat powder (unconditioned stimulus) and the dogs would salivate (unconditioned response) . Pavlov then rang a bell (conditioned stimulus) preceding each exposure to the meat powder and the dogs would salivate. Over time, the ringing of the bell made the dogs salivate (conditioned response) even when there was no meat powder presented. In operant conditioning, a desired response is made
PAGE 16
16 stronger through selective reinforcement. Radocy and Boyle give the example of “if an encaged pigeon pecks at a particular spot and receives food as a consequence, the pigeon is more likely to peck at that spot again and can learn to do it when requiring food (2009, p. 399).†Cognitive learning theories base their learning on organization and reorganization of learning structures (Radocy & Boyle, 2009) . Leading names in this area of research include Bruner, Kohler, and Piaget. T heoretical views in this camp stem from and include Gestalt theory. Gestalt theorists are primarily interested in perception, or understanding the whole, and thr ough the organization of this perception, information and concepts are learned (Radocy & Boyle, 2009). Swiss biologist Piaget has had a great impact on child development theory with his wellknown Stage Theory (Abeles et al., 1995). His theory presents four theoretical stages of child development: sensorimotor stage, preoperational stage, concrete operational thought, and finally formal operational thought. J. Bruner, an Am erican psychologist, also has developed a theory on child development that is divided into three, less rigid , stages: enactive, iconic, and symbolic. Studying these child development theories could assist teachers in understand ing the mental development and the readiness to learn of the particular students they are teaching and in turn, ca n sequence their curriculum to best fit the students’ learning process (Abeles et al., 1995). Applications of both learning theories find their way into the music classroom. Behavioral theories help with maintaining general order in the classroom and proper behavior. This theory can also be used in general rehearsal techniques as the teacher is giving verbal feedback to the students about their performance to increase the
PAGE 17
17 likelihood of the same response. This can be categorized as a form of operant conditi oning (Abeles et al., 1995). Cognitive theories help teachers organize information and curriculum into segments that can be taught to students and then analyzed to see if they have been learned and to what degree they have been learned. One such method that organizes information and then uses behavioral techniques to assess skill acquisition is Bloom’s Taxonomy of the Cognitive Domain (Abeles et al., 1995). Assessment Development Developing quality assessment methods presents its own challenges (Baker, Asc hbacher, Niemi, & Sato, 1992). Some teachers believe intensely that developing formal assessments will take away from the abstract nature of the art (Asmus, 19991; Willoughby et al.,1995). Baker, Aschbacher, Niemi, & Sato (1992) conducted a fiveyear research study on constructing an assessment method that would thoroughly test cognition of material taught. Their study explored the construction of accurate performancebased assessments. Assessments should measure students’ knowledge of skills and verify th at they have acquired the concepts taught in class. Assessment using rubrics helps guide the teacher in pacing instruction, present ing curriculum, and setting performance standards (Asmus, 19991). Asmu s goes on to state that having wellthought out learni ng objectives helps guide the teacher and the students through the curriculum and to the assessment method. Using rubrics can also help the students being assessed by giving them clear guidelines as to what is being assessed and what they can do to perform better in the future ( Asmus, 19992). Willoughby et al. (1995) write that rubrics document student progress and supply teachers with useful information when communicating with the principal, parents, and students. They also state how assessing an arts cour se can be subjective and having a method of formal
PAGE 18
18 assessment helps to objectify the results. Many school districts, as well as State and National associations such as The National Association for Music Education ( MENC) , have curriculum resources and guides that make a good starting point for targeting learning objectives and developing a sequence of instruction (Asmus , 19991). Motivation Abeles et al. (1995, pg. 212) define motivation as “ the energy that a learner employs when changing a behavior .†They g o on to state that educators primarily focus on a student’s secondary or psychological drives that “ include fear, love, frustration, curiosity, the need for stimulation, [ and] the need for achievement†(pg. 213). Assessment should be a motivational teachi ng tool (Baker, Aschbacher, Niemi, & Sato, 1992). In his study, Asmus (1986) focused on the opinions of the students and their ideas of success and failure pertaining to music class. Asmus us ed a 10item openended questionnaire to collect data from the st udents and then conceptualized those data using Weiner’s Attribution Theory . Weiner’s Attribution Theory is a framework that assumes that people try to determine why people do what they do ( L earning Theories Knowledgebase, 2009). When people succeed at som ething, they tend to attribute that success with their own skill and when people fail, they will attribute that failure to some external cause. Assessment using rubrics give s the students the power to self assess. The guidelines or descriptors, a key elem ent on the rubric chart (Asmus 19992), act as a guide in individuals ’ practice. Hewitt (1991) found research literature relating to self assessment on musical performances to be inconclusive, because of the l ack of study in this area, but self assessment d id have a positive result on students ’ practice attitude. Hewitt mentions that students who received training on how to self assess were more
PAGE 19
19 positive about music, their class es , and their teacher (pg. 309). Using a rubric to guide their practice, student’ s attitude toward practice may increase. Asmus ( 19991) notes that having a mappedout sequence of learning can lead to increased student motivation, in turn enhancing knowledge and skill.
PAGE 20
20 CHAPTER 3 METHODOLOGY AND PROCED U RES Introduction This action research was based on a mixed method research approach, dominant status sequential design, in which the researcher collected both quantitative and qualitative data (QUAN --> qual) . Permission for this study was acquired from the University of Florida Institutional Review Board (Appendix E). Permission was also acquired from the researcher’s school district’s Accountability, Research, and Assessment Department (Appendix F). The researcher used students from the researcher’s class who volunteered to participate. P arents completed consent forms (Appendix G) and students completed assent forms (Appendix H) in order to take part in the study. The study w as conducted during the first 9week grading period ( beginning September 7th and concluding on October 27th) of the 20092010 school year. Sixty five students volunteered to participate: 28 beginners, 12 intermediate, and 25 advanced students in G rades 6, 7, and 8. Study Procedures S tudents participating in this research were placed in their assessment method based on s ystematic sampling by their class. Each class was alphabetized by last name. The researcher then chose a random starting place in each list and placed every 3rd student into an assessment procedure: either the contract system or the pass off system. During the course of the 9week grading period , students were given three opportunities during their class period to test using their assessment procedure. Each of the student s tested by performing their musical selection for the entire class. Students on the rubric based contract received a numerical grade based on how they performed
PAGE 21
21 and based on the standards outlined in the rubric. Students using the pass off system either played their excerpt “ flawlessly †and “passedoff†or were stopped at the first flaw and told that they did not pass off. All students were given the opportunity to stay after school or come to school early to perform their excerpts again for the grade they desired. R equirements for beginning orchestra students’ tests were as follows : T he D major scale one octave with arpeggio pizzicato, memorized Essential Elements 2000 book 1 exercises # 9, 19, 22, and 34 Winning Rhythms charts 1, 2, 3, and 4 (students were required to count a randomly selected line from each chart) C omplete 3 out of 18 “generals.†Requirements for intermediate orchestra students ’ tests were as follows : the D and G Major scales two octaves (where appropriate for their instrument ) with arpeggio, memorized Essential Elements 2000 book 2 exercises # 36 and 47 and Essential Technique 2000 book 3 exercises 21 and 22 as one exercise and 23 and 24 as exercise Winning Rhythms charts 10, 11, 12, and 14 (students were required to count a randomly selected line from each chart) C omplete 3 out of 18 “generals.†Requirements f or a dvanced orchestra students’ test were as follows : T he D, G and C Major scales two octaves (where appropriate for their instrument ) with arpeggio, memorized
PAGE 22
22 Essential Technique 2000 book 3 exercises 21 and 22 as one exercise, # 35, a nd # 47 and 48 as one exercise Winning Rhythms charts 14, 15, 16, and 17 (students were required to count a randomly selected line from each chart) C omplete 3 out of 18 “generals.†Data Collection and Analysis Data were collected through a preand post test pr ocedure and a concluding survey with which the researcher gathered background information regarding students’ musical experience and their experiences and feelings concerning their assessment procedure. Pre and Post Test The preand post test consisted of each stu dent recording two of their required materials, a “prepared piece†and a scale, and also a sight reading exercise. The beginning students’ required materials were exercise # 34 and the memorized D Major scale with arpeggio (pizzicato) . The intermediate and advanced students’ required materials were exercise # 21 and 22 as a continuous line and the memorized D Major scale two octaves with arpeggio (bowed) . For the pre test , students were only told that they were to play a prepared piece , a scale, and a sight reading exercise. The students were not informed what those items would actually consist of until they entered the recording area. Once the students recorded, they were informed that the selections constituting the pretest would make up the post test and the only element that would change would be the sight reading exercise. The pretest occurred a few weeks into the 9 week grading period to give all students the opportunity to begin work on all exercises required of them for their grading procedure. The post test occurred after the end of the
PAGE 23
23 grading period, when all grades had been finalized . Students again recorded the prepared piece and scale mentioned previously and a slightly different sight reading than before. Recordings were made in the researcher ’s office using a laptop, an external microphone, and the computer program Audacity. R esults of the preand post test were analyzed by the researcher using an instrument based on the “Woodwind/Brass Solo†evaluation form (Saunders & Holahan, 1997) ( Appendix A ) . This form, while used in almost its complete state, was modified to work with string performers rather than wind performers . Elements such as breath control and articulation were replaced with bowing and bow control. Other elements such as embouchur e formation and observable posture elements were eliminated from scoring. Survey The survey was constructed by the researcher using the online survey creator Surve y Monkey ( Appendix B ). The information collected was used to gain background information about the students’ education in music . The information collected in the survey included their time spent in music class during school, the number of months they took private lessons on their orchestra instrument and/or piano, the average number of minutes per week they spent practicing their instrument during the grading period, the number of months the student had been playing an orchestra instrument, and their thoughts and opinions on their experience during the grading period as it related to their assign ed assessment procedure. The survey was administered during each students’ 46minute class period in the school’s computer lab. The students were given written instructions on how to access the Survey Monkey survey. Students who needed help raised their hands and were helped by the researcher. R esults of the survey yielded both quantitative and qualitative data.
PAGE 24
24 Other data collected included the assigned assessment procedure (pass offs or contract), their grade level, their age in months, their instrument (violin, viola, cello, or bass), and their ensemble level (beginning, i ntermediate, a dvanced).
PAGE 25
25 CHAPTER 4 RESULTS Pre and Post Test Results Analysis of Variance for preand post test total score as function of the assessment type were conducted. The pretest as a function of assessment method results indicated that minutes of music instruction had a high significance ( p= .005) along with months of playing on the instrument ( p= .03) in predicting the student’s score. This test also indicated that assessment type was not found to be significant ( p= .70). The post test as a function of assessment type yielded slightly different results indicating that assessment type had more significance ( p= .04) ; months of playing an orchestra instrument was still significant ( p= .0 4 ) ; and minutes of music instruction was no longer significant ( p= .80). Observing the effects of the assessment type and the ensemble yielded a statistically interesting result. In an analysis of variance of the pretest total scores using two independent variables ( ensemble and assessment type) yielded the result that minutes of music instruction in school was significant in the score the students received ( p= .01). A nalysis of variance of the post test total scores with the same two independen t variables yielded the result that ensemble type, while not as statistically significant, was more significant ( p= .22) than minutes of music instruction ( p= .97) which was no longer significant. Analysis of Variance using assessment type combined with e nsemble revealed interesting statistics (Table 1) . D ata collected concerning the post test total score as a function of assessment type were not statistically significant ( p= .29). The post test total score as a function of the ensemble in which the students were enrolled was significant
PAGE 26
26 ( p= .02). When looking at the post test total score as a function of both assessment type and ensemble together there was also a statistically significant result ( p= .04) . Table 1 Analysis of Variance for Post Test Total S cores a s a Function of Assessment Type and Ensemble Source df F p Assessment Type 1 1.14 .29 Ensemble 2 4.22 .02 * Assessment Type * Ensemble 2 3.29 .04* Error 59 Total 64 Note: * p < .05 T he mean post test total score in beginning orchestra was 73.96. When observing the beginner’s post test total scores delineated by assessment type, it was found that the students who used the contract had a mean score of 81.63, while those on the pass offs received a mean score of 64.88. For the intermediate orc hestra as a whole, the mean score for the post test total score was 77.54. S tudents on the contract had a mean score of 78.21 and students on the pass offs earned a mean score of 76.60. In the advanced orchestra, the mean post test total score as a group w as 85.78. The students who were using the contract earned a mean score of 83.21, while those on pass off s earned a mean score of 88.15. Table 2 shows the post hoc comparison of the post test total scores of all assessment types as a function of the ensembl e in which the student is enrolled.
Table 2 shows the post hoc comparison of the post-test total scores of all assessment types as a function of the ensemble in which the student is enrolled.

Table 2
Post Hoc Comparison of Post-Test Total Scores of All Assessment Types as a Function of Ensemble

Assessment Type   Ensemble        Difference of Means     p
Pass-Offs         Beginning       16.75                   .07
                  Intermediate     1.61                  1.00
                  Advanced         4.95                   .96

Note: All comparisons are based on the post-test total score for that ensemble's contract.

Observing the effects of assessment type on the average minutes of practice per week also yielded some interesting results. The mean number of minutes per week for students on the contract was 88.5 minutes, while those on the pass-off assessment practiced a mean of 149.5 minutes, a 51.3% difference. Table 3 shows data indicating the significance of assessment type and ensemble on average practice time in minutes. Breaking the data down further and analyzing the assessment-type effects on each ensemble's practice time yields much the same result, in that the pass-off students practiced longer: beginning students on the contract averaged 69.5 minutes vs. 87.7 minutes for pass-offs, a difference of 23.3%; intermediate students on the contract averaged 88.3 minutes vs. 116.0 minutes for pass-offs, a difference of 27.1%; and advanced students on the contract averaged 112.5 minutes vs. 224.2 minutes for pass-offs, a difference of 66.3%. Table 4 shows the average amount of practice time in minutes as a function of assessment type and ensemble. A significant effect of assessment type on average weekly practice time is indicated only for the advanced orchestra.

Table 3
Analysis of Variance for Average Minutes of Practice as a Function of Assessment Type and Ensemble

Source                          df     F      p
Assessment Type                  1     6.93   .01*
Ensemble                         2     9.87   .00*
Assessment Type x Ensemble       2     2.85   .07
Error                           59
Total                           64

Note: *p < .05

Table 4
Post Hoc Comparison of Average Minutes of Practice Time per Week as a Function of All Assessment Types and Ensemble

Assessment Type   Ensemble        Difference of Means     p
Pass-Offs         Beginning        18.23                  .98
                  Intermediate     27.71                  .99
                  Advanced        111.73                  .01*

Note: All comparisons are based on student-reported average practice time per week for that ensemble's contract participants. *p < .05
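The percentage differences reported above are consistent with a symmetric percent-difference formula, the absolute difference divided by the mean of the two values; the thesis does not state the formula, so this is an inference. A minimal check in Python (three of the four reported figures reproduce exactly; the beginning-orchestra figure differs by 0.1, presumably because the published means are rounded):

    def percent_difference(a, b):
        """Absolute difference expressed as a percentage of the mean of the two values."""
        return abs(a - b) / ((a + b) / 2) * 100

    print(round(percent_difference(88.5, 149.5), 1))   # 51.3  (all students)
    print(round(percent_difference(69.5, 87.7), 1))    # 23.2  (beginning; reported as 23.3)
    print(round(percent_difference(88.3, 116.0), 1))   # 27.1  (intermediate)
    print(round(percent_difference(112.5, 224.2), 1))  # 66.3  (advanced)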
Survey Results

Much of the survey was used to collect information to assist with the pre- and post-test results. Information was also collected regarding students' feelings and thoughts about their assessment procedure. All 65 students participating in the research completed the survey information that was necessary for analyzing the pre- and post-test result data. That information included how many months they had been taking private lessons on their orchestra instrument, how many months they had been taking piano lessons, on average how many minutes per week they practiced, and how many minutes of music class they had each day. Sixty-one of the 65 students completed the remainder of the survey, which consisted of questions concerning their thoughts and feelings about their assessment procedure. Some students skipped various questions for reasons unknown to the researcher.

Contract

Table 5 shows the quantitative results gained through the survey as answered by the students using the contract assessment procedure. Questions are listed with the responses the students had to choose from. The first two questions allowed multiple answers, while the last five allowed only one possible answer.
Table 5
Quantitative Survey Results for Contract Students (n = 32)

What was your FIRST feeling(s) about the testing method for which you were chosen? Mark all answers that apply.
  Happy                      72.2% (21)
  Sad                         0.0% (0)
  Encouraged                 41.4% (12)
  Discouraged                 0.0% (0)
  Did not care either way    34.5% (10)
  Other                      31.0% (9)
  Skipped question           (3)

How did you feel after you took your FIRST test using your testing method? Mark all answers that apply.
  Happy                      59.4% (19)
  Sad                         0.0% (0)
  Shocked                     9.4% (3)
  Encouraged                 37.5% (12)
  Discouraged                 0.0% (0)
  Did not care either way    31.3% (10)
  Skipped question           (0)

After you took a test, how did you feel?
  Encouraged to continue     96.6% (28)
  Discouraged to continue     3.4% (1)
  Skipped question           (3)

My testing method made me practice more.
  I agree                    93.8% (30)
  I disagree                  6.3% (2)
  Skipped question           (0)

My testing method made me WANT to practice.
  I agree                    75.0% (24)
  I disagree                 25.0% (8)
  Skipped question           (0)

My testing method made me NOT want to practice.
  I agree                     6.3% (2)
  I disagree                 93.8% (30)
  Skipped question           (0)

My testing method helped me become a better player on my orchestra instrument.
  I agree                    83.9% (26)
  I disagree                 16.1% (5)
  Skipped question           (1)
Analyzing the qualitative data collected in the survey concerning the contract revealed themes. Data in Table 6 consist of patterns found in the students' written responses to the following questions about their assessment procedure: What were three things that you liked about the way you were tested? What were three things that you did not like about the way you were tested? In a few words, write any final thoughts, feelings, and/or experiences about your testing experience.

Table 6
Qualitative Survey Results for Contract Students (n = 32)

Theme                          Data
"Like about the contract"      Could see what was done wrong and how to fix it. Improve grade by trying again. Order of what you completed did not matter. Get a grade no matter what. Test when you wanted to.
"Dislike about the contract"   Testing in front of the class. Inability to make a perfect score.
"Practice attitude"            None.
"Final thoughts"               Helped make me a better player. Could always do it again. Could always get a higher grade.

Pass-Offs

Data in Table 7 are the quantitative results gained through the survey as answered by the pass-off students. Questions are listed with the responses the students had to choose from. The first two questions allowed multiple answers, while the last five allowed only one possible answer.
Table 7
Quantitative Survey Results for Pass-Off Students (n = 28)

What was your FIRST feeling(s) about the testing method for which you were chosen? Mark all answers that apply.
  Happy                      27.3% (6)
  Sad                        13.6% (3)
  Encouraged                 31.8% (7)
  Discouraged                18.2% (4)
  Did not care either way    31.8% (7)
  Other                      50.0% (11)
  Skipped question           (6)

How did you feel after you took your FIRST test using your testing method? Mark all answers that apply.
  Happy                      48.1% (13)
  Sad                        18.5% (5)
  Shocked                    40.7% (11)
  Encouraged                 29.6% (8)
  Discouraged                25.9% (7)
  Did not care either way     3.7% (1)
  Skipped question           (1)

After you took a test, how did you feel?
  Encouraged to continue     85.2% (23)
  Discouraged to continue    22.2% (6)
  Skipped question           (1)

My testing method made me practice more.
  I agree                    92.6% (25)
  I disagree                  7.4% (2)
  Skipped question           (1)

My testing method made me WANT to practice.
  I agree                    77.8% (21)
  I disagree                 22.2% (6)
  Skipped question           (1)

My testing method made me NOT want to practice.
  I agree                    22.2% (6)
  I disagree                 77.8% (21)
  Skipped question           (1)

My testing method helped me become a better player on my orchestra instrument.
  I agree                    88.9% (24)
  I disagree                 11.1% (3)
  Skipped question           (1)
Analyzing the qualitative data collected in the survey concerning pass-offs also revealed themes. Data in Table 8 consist of patterns found in the students' written responses to the following questions: What were three things that you liked about the way you were tested? What were three things that you did not like about the way you were tested? In a few words, write any final thoughts, feelings, and/or experiences about your testing experience.

Table 8
Qualitative Survey Results for Pass-Off Students (n = 28)

Theme                        Data
"Like about pass-offs"       It was quick. You did not really have to worry about your grade. Made me want to practice more so I could pass off the first time. Get a "perfect" score. It was fun. Improve your grade by trying again. Test when you wanted to.
"Dislike about pass-offs"    Keep doing it until you got it right for a good grade. It was hard to pass off. Required perfection. More pressure (stress). No passing = zero. Order of grades. One mistake = fail. Testing in front of class. Took a longer time.
"Practice attitude"          Encouraged to practice.
"Final thoughts"             It was too hard. Made me practice more.
CHAPTER 5
DISCUSSION

Introduction

This study began with an invitation to the researcher's 140 middle school orchestra students to participate. Of those 140 students, 75 fully completed the consent and assent forms appropriately and began the study. Over the course of the 9-week grading period there was a 10-student reduction due to various reasons: extended absences due to the flu, students moving away, and general absences that caused those students not to complete all parts of the pre- and post-test treatments or the survey. Therefore, the data collected on those 10 students were excluded from the analysis.

Using their assessment method, all students were assessed during the school day three times during the 9-week grading period. For grades the students had not completed, or wanted to score higher on, after those days, the students were made responsible for testing before or after school with the researcher. Early on, the researcher noticed that the beginners, who had no prior experience with either form of grading, took to the pass-offs quickly. After the first testing day, many of the beginners understood what the expectation was for something to be "passed off." After the first testing experience, most beginners passed off on their first try. For the students who had experience with the contract testing procedure in previous years, it took longer to understand what was expected of them to get "passed off." Many of them would attempt to pass off a requirement up to five times before actually achieving their goal.

The researcher observed behavioral cues as to what the students were feeling in their pass-off experience. Anger and frustration were apparent. Many times the researcher feared that students using the pass-off assessment would quit the class because of the high expectation and pressure to get a good grade (pressure applied by the students and their parents/guardians), but he reassured the students that this was only a research project and that they would be taken care of at its conclusion. In the end, and without informing the students so as not to affect their motivation to achieve their chosen final grade, the researcher added some points to the final grades of all students who participated in the research project, so that their participation in the research would not affect their grade in the course.

Achievement

What are the effects on secondary instrumental music students' achievement of using a rubric-based assessment tool versus a "pass-off"-based assessment tool? Statistically speaking, there is no significant difference in the achievement of the students regarding which assessment tool they used. However, when observing the post-test total scores as they compare with the ensemble in which the students are enrolled, we see a slightly different result. In Table 1, the data indicate that the ensemble a student is in has a statistically significant effect on their post-test total score (p = .02). The data also indicate statistical significance in the post-test total score when assessment type and ensemble interact. This researcher observed that students in the beginning and intermediate orchestra classes on the contract seemed to score higher on the post-test than those on pass-offs. Conversely, students in the advanced class who were participating on the pass-offs seemed to score higher on the post-test than those on the contract. It is the researcher's opinion that the reason for this difference between the ensembles could be a combination of things.
First, students in the advanced class are placed there for a particular reason. To be placed in that class, the student had to audition and perform at a higher level. This fact in itself would indicate a high level of personal achievement on the student's part. It would also indicate that the student has a high motivation to achieve, and possibly a certain "competitive spirit." The pass-off sheet would seem to feed into these traits: high motivation, desire for a high level of personal achievement, and the competitive spirit. For students to "pass off," they must be motivated to try again when they experience failure, and there is a competitive nature when they are "passing off" to see who can pass off first. This researcher observed that many of his most advanced players enjoyed the pass-off procedure not just because it was something new, but because it was "challenging" and "fun." He also noted that in all classes, where a student was not as serious about performing but rather took the class as something fun to do, or was a beginner just introduced to playing the instrument, pass-offs were a serious issue and very uninspiring.

It is essential to note the findings in Table 2: beginners using the contract assessment type received a higher total score on the post-test. Although the finding that the beginning orchestra students on the pass-offs scored lower on the post-test total score was not statistically significant (p = .07), it is important to note that there was a 16.75 difference of means between the two scores. This difference has practical importance for educators, in that we want the very best educational assessment tools to be used. Here it does seem that the contract assessment works better than the pass-off assessment method. In the beginning and intermediate classes, the results indicate that a rubric-based contract assessment method is beneficial.
Students seemed to perform better when given the opportunity to reflect and learn through the small incremental achievements gained through the researcher's contract assessment process. Students were given a score each time they performed, based on eight graded areas of their performance. They were then able to reflect on their performance by referring to the grading rubric. They could make adjustments to their performance and then take their test again to gain a higher grade. This constant cycle of grading, reflection, guided practice, and grading again lends itself well to younger, less experienced students who need more guidance and care.

Motivation

Is a student's motivation to practice different based on the assessment tool used? Data gained through the survey and researcher observation were fascinating regarding motivation to practice. Considering all cases observed, students who were assigned the pass-off procedure practiced on average twice as much as those using the contract. This researcher believes that the reason for this occurrence is the extrinsic motivation caused by the pressure of making a good grade in the class. They practiced, as mentioned by one student, to "make the grade." This pressure was placed on them most likely by their parents/guardians and, in turn, themselves. Perhaps students who were assigned the contract method had more goal-oriented practice that allowed them to get more done in a shorter amount of time. It is a matter of the quality of practice versus the quantity of practice. Students practicing with the contract can use the rubric to reflect on concepts on which they need to work. For some of the more advanced students, it seems to this researcher, participating in the pass-off assessment procedure was truly a thrill, a game, and in fact intrinsic in nature. Many times, students who fall into this latter group have crossed over in their musical experience from extrinsically motivating factors to the sheer enjoyment of the subject and performing.
As far as motivation to practice is concerned, as music teachers we walk a fine line. We must know our students, their lives, and their personalities. We must tailor our curriculum to their individual needs. Most music students are motivated to make good grades in our classes. Some are fine with making grades that are below an "A," while others have to have "straight As" and will not rest until they achieve their goal. On a pass-off system, it is easy for the teacher to set the expectation for something to be passed off high, and then students get frustrated and quit because they have no success. Then, over time, it is also easy for teachers to fall into the trap of lowering their expectations to allow the students to feel success. It is this researcher's belief that using a system such as a rubric-based contract assessment is more appropriate for the average student. The data in Table 8 show the thoughts and feelings of students on the pass-off system. They reported feeling more stressed about performing. The students also thought that it was hard compared to the contract system, because one mistake meant a zero for a grade. With the rubric-based assessment, you can motivate students to make a numerical grade no matter how they perform. If students want to make a higher grade, or want to increase their score, they have that option. Other students who are fine with being average will continue to play for whatever score they receive the first time. The difference between pass-offs and contracts here is that all students will get some sort of grade on a contract, rather than being unable to play under pressure and simply failing music class. Our goal as music educators is to teach a lifelong appreciation for our art, not to push students away because they do not perform something perfectly and thereby teach them to hate music.
Conclusion

It is not enough for teachers to take the route of pass-offs because it has the potential to create outstanding music programs: music programs with students who can perform something they have been conditioned to play perfectly. All music teachers' standards of performance and backgrounds are different. What qualifies as quality to one educator may not be quality to another. Ultimately, in music, we are often dealing with a person's opinion of a performance rather than facts. Facts, on the other hand, cannot be disputed. "You did not use the correct bowing." "You did not play with the correct rhythm." These are concrete concepts that no one can dispute. Assessment that is based in a rubric will be respected by other subject areas, and will be understood by and gain the support of students, parents, and administrators. With a tailored curriculum that is rubric based, students can learn many components of the performance. They learn to properly develop all parts of their craft and technique, and through this development, over an extended period of time, we as music educators will foster a lifelong love for music and perhaps continue the student's participation in and support of the arts as an adult.

Teaching is an evolutionary process. Much of my career as a music teacher can be described as such a process. My philosophies on assessment and curriculum have changed dramatically since I began teaching ten years ago. What started out as teaching to get through the day, teaching to the performance, and teaching to make high scores at festivals and competitions with my groups has changed. It has developed into teaching the child, teaching with an end goal in mind, and teaching with a philosophical belief that, in the end, music must be taught to enrich the child's life, strengthen creativity, and broaden appreciation of music.
The assessment of students should be based on the concepts and skills you feel are essential for them to learn. Teachers should use their state's standards as a guide in developing such assessments. This will encourage and develop lifelong learning skills in each student, not just skills to "pass the test." Through this research, I have affirmed my belief in using rubrics in assessment. I have also come to believe that when students are trained to use a rubric-based assessment procedure, they will conceptualize and organize their practice time for more efficient learning. The rubrics can be essential guides in their practice and will instill good practice habits and routines for the future.
APPENDIX A
PRE- AND POST-TEST SCORING SHEET
APPENDIX B
SURVEY QUESTIONS

Background: The following questions will gather background information about you, your music knowledge, and music education. Please take your time and answer every question honestly and to the best of your ability. If you have any questions about a question, please ask Mr. Olsen.

Please type your name: Last, First

How many total months have you played your orchestra instrument? Please include any elementary orchestra time, even if you switched instruments when you came to middle school.

How many months have you taken private lessons on your orchestra instrument? Please state your answer to the closest half month: Ex. 3 months or 3.5 months (three months or three and a half months). Remember there are 12 months in a year. Zero CAN be an answer.

How many months have you taken piano lessons? Please state your answer to the closest half month: Ex. 3 months or 3.5 months (three months or three and a half months). Remember there are 12 months in a year. Zero CAN be an answer.

On average, how many MINUTES per week do you practice your orchestra instrument?

How many minutes in the school day do you have music class?
o 46 minutes (one class period of music)
o 92 minutes (two class periods of music)
o 138 minutes (three class periods of music)

Questions about the testing procedure: The following questions deal with the process that you went through during the testing procedure. Please take your time and answer each question honestly and to the best of your ability. If you have any questions about a question, please ask Mr. Olsen.

Which testing method were you selected for?
What was your FIRST feeling(s) about the testing method for which you were chosen? Mark all answers that apply.
o Happy
o Sad
o Encouraged
o Discouraged
o Did not care either way
o Other (please specify): A blank text box was given for their own responses.

How did you feel after you took your FIRST test using your testing method? Mark all answers that apply.
o Happy
o Sad
o Shocked
o Encouraged
o Discouraged
o Did not care either way

In a few short statements, answer the question that applies to your test: A blank text box was given for their own response.
o Pass-offs: How did it make you feel when you "passed off"?
o Grading contract: How did it make you feel when you received your grade on your test?

What were three things that you liked about the way you were tested? Please number your answers. A blank text box was given for their own responses.

What were three things that you did not like about the way you were tested? Please number your answers. A blank text box was given for their own responses.

After you took a test, how did you feel?
o Encouraged to continue
o Discouraged to continue
My testing method made me practice more.
o I agree
o I disagree

My testing method made me WANT to practice.
o I agree
o I disagree

My testing method made me NOT want to practice.
o I agree
o I disagree

My testing method helped me become a better player on my orchestra instrument.
o I agree
o I disagree

In a few words, write any final thoughts, feelings, and/or experiences about your testing experience. A blank text box was given for their own responses.
APPENDIX C
EXAMPLE CONTRACT SHEET AND SCORING RUBRIC
APPENDIX D
EXAMPLE PASS-OFF SHEET
APPENDIX E
UF IRB PROTOCOL LETTER OF APPROVAL
APPENDIX F
ORANGE COUNTY PUBLIC SCHOOLS' REQUEST FOR RESEARCH FORM
APPENDIX G
PARENT CONSENT FORM
APPENDIX H
STUDENT ASSENT FORM
LIST OF REFERENCES

Abeles, H. F., Hoffer, C. R., & Klotman, R. H. (1995). Foundations of Music Education (2nd ed.). New York, NY: Schirmer Books.

Antmann, M. D. (2007). Assessment and Grading in the Beginning Band Classroom. Master's thesis, Florida State University. Retrieved December 5, 2009, from http://etd.lib.fsu.edu/theses/available/etd07092007172856/

Asmus, E. P. (1999a). Music Assessment Concepts. Music Educators Journal, 86(2, Special Focus: Assessment in Music Education), 19-24.

Asmus, E. P. (1999b). Assessment. Retrieved August 2009, from the Music Assessment Web Site at the University of Miami: http://www.music.miami.edu/assessment/index.html

Asmus, E. P., Jr. (1986). Student Beliefs about the Causes of Success and Failure in Music: A Study of Achievement Motivation. Journal of Research in Music Education, 34(4), 262-278.

Baker, E. L., Aschbacher, P. R., Niemi, D., & Sato, E. (1992). CRESST Performance Assessment Models: Assessing Content Area Explanations. California: UCLA Center for the Study of Evaluation. Retrieved from http://www.cse.ucla.edu/products/guidebooks/cmodels.pdf

Bell, A., & Bell, M. (2003). Developing Authentic Assessment Methods from a Multiple Intelligences Perspective. (ERIC Document Reproduction Service No. ED 479391)

Hewitt, M. P. (2001). The Effects of Modeling, Self-Evaluation, and Self-Listening on Junior High Instrumentalists' Music Performance and Practice Attitude. Journal of Research in Music Education, 49(4), 307-322. Retrieved from International Index to Music Periodicals database.
Learning Theories Knowledgebase. (2009, December). Attribution Theory (Weiner) at Learning-Theories.com. Retrieved December 3, 2009, from http://www.learning-theories.com/weiners-attribution-theory.html

Motivation. (n.d.). Dictionary.com Unabridged. Retrieved November 23, 2009, from Dictionary.com website: http://dictionary.reference.com/browse/Motivation

H.R. 1, 107th Cong., 115 Stat. 1425 (2002) (enacted). Public Law 107-110, No Child Left Behind Act of 2001 (Short Title). [Electronic Version]

Ornstein, A. C., & Hunkins, F. P. (2009). Curriculum: Foundations, Principles, and Issues (5th ed.). Boston, MA: Pearson.

Radocy, R. E., & Boyle, J. D. (2003). Psychological Foundations of Musical Behavior (4th ed.). Springfield, IL: Charles C. Thomas.

Saunders, T. C., & Holahan, J. M. (1997). Criteria-Specific Rating Scales in the Evaluation of High School Instrumental Performance. Journal of Research in Music Education, 45(2), 259-272. Retrieved from http://www.jstor.org/stable/3345585

Willoughby, M., Feifs, H., Baenen, N., & Grimes, E. (1995). Behind the Scenes: Measuring Student Progress in the Arts and Beyond (Evaluation and Research Report 95E.06). Raleigh, NC: Evaluation and Research Department, Wake County Public School System.
BIOGRAPHICAL SKETCH

A native of Marshfield, Maine, Mr. David N. Olsen received his Bachelor of Music Education degree from the University of Florida in 2000. He then obtained a position in Alachua County, Florida, as Director of Music at Lincoln Middle School, where he taught for seven years. Mr. Olsen's bands and orchestras have consistently received high honors at the district level and at other music festivals. Mr. Olsen served as Alachua County Middle School Honor Band Chairman, the District Liaison for Secondary Music in the Alachua County School System, and the School Board of Alachua County Representative for the Alachua County Youth Orchestra Executive Board. He is on the Board of Advisors for the National Adjudicators Middle School Festival. He also serves as a guest clinician for numerous ensembles. Professional memberships include the Florida Bandmasters Association, the Florida Music Educators Association, the Music Educators National Conference, and the Florida Orchestra Association. Mr. Olsen currently teaches orchestra at Chain of Lakes Middle School in Orlando, Florida, for the Orange County Public School system, and he resides in Winter Park.
PAGE 1
1 THE EFFECT OF TWO DIFFERENT ASSESSMENT TOOLS ON SECONDARY INSTRUMENTAL MUSIC STUDENTS’ ACHI E VEMENT AND MOTIVATION By DAVID N. OL S EN SUPERVISORY COMMITTEE TIMOTHY S. BROPHY, CHAIR SILVIO DOS SANTOS, MEMBER A PROJECT IN LIEU O F THESIS PRESENTED TO THE COLLEGE OF FINE ARTS OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF MUSIC UNIVERSITY OF FLORIDA 2009
PAGE 2
2 2009 David N. Olsen
PAGE 3
3 To Kristin and my family.
PAGE 4
4 ACKNOWLEDGMENTS I thank the 200809 master of music education faculty for their hard work and dedication to seeing the first class of summer masters in music through to their goal. Drs . Timothy S. Brophy, Charles R. Hoffer, Russell L. Robinson, Paul Richards, Silvio dos Santos , Kara Dawson, and Miriam Zach are educators who care and are truly dedicated to the future of music education. I thank my parents for getting me involved and supporting me in music. Finally, I thank Kristin for being patient and confident thr ough the long summers apart and the falls of online classes.
PAGE 5
5 TABLE OF CONTENTS page ACKNOWLEDGMENTS .................................................................................................. 4 LIST OF TABLES ............................................................................................................ 7 LIST OF ABBREVIATIONS ............................................................................................. 8 ABSTRACT ................................................................................................................... 10 CHAPTER 1 INTRODUCTION .................................................................................................... 12 Significance of the Problem .................................................................................... 12 Purpose of the Study .............................................................................................. 13 Delimitations ........................................................................................................... 13 2 REVIEW OF LITERATURE .................................................................................... 14 Introduction ............................................................................................................. 14 Philosophical Rational ............................................................................................. 14 Theoretical Background .......................................................................................... 15 Assessment Development ...................................................................................... 17 Motivation ............................................................................................................... 18 3 METHODOLOGY AND PROCEDURES ................................................................. 20 Introduction ............................................................................................................. 20 Study Procedures ................................................................................................... 20 Data Collection and Analysis .................................................................................. 22 Pre and Post Test ........................................................................................... 22 Survey .............................................................................................................. 23 4 RESULTS ............................................................................................................... 25 Pre and Post Test Results ..................................................................................... 25 Survey Results ........................................................................................................ 28 Contract ............................................................................................................ 29 Pass Offs .......................................................................................................... 31 5 DISCUSSION ......................................................................................................... 34 Introduction ............................................................................................................. 34 Achievement ........................................................................................................... 35 Motivation ............................................................................................................... 37
PAGE 6
6 Conclusion .............................................................................................................. 39 APPENDIX A PREAND POST TEST SCORING SHEET ........................................................... 41 B SURVEY QUESTIONS ........................................................................................... 43 C EXAMPLE CONTRACT SHEET AND SCORING RUBRIC .................................... 46 D EXAMPLE PASSOFF SHEET ............................................................................... 48 E UF IRB PROTOCOL LETTER OF APPRO VAL ...................................................... 49 F ORANGE COUNTY PUBLIC SCHOOLS’ REQUEST FOR RESEARCH FORM .... 50 G PARENT CONSENT FORM ................................................................................... 51 H STUDENT ASSENT FORM .................................................................................... 52 LIST OF REFERENCES ............................................................................................... 53 BIOGRAPHICAL SKETCH ............................................................................................ 55
PAGE 7
7 LIST OF TABLES Table page Table 1 Analysis of Variance for Post Test Total Scores as a Function of Assessment Type and Ensemble ....................................................................... 26 Table 2 Post Hoc Comparison of Post Test Total Scores of all Assessment Types as a Function of Ensemble ................................................................................. 27 Table 3 Analysis of Variance for Average Minutes of Practice as a Function of Assessment Type and Ensemble ....................................................................... 28 Table 4 Post Hoc Comparison of Average Minutes of Practice Time per Week as a Function of All Assessment Types and Ensemble .............................................. 28 Table 5 Quantitative Survey Results for Contract Students ( n = 32) ............................. 30 Table 6 Qualitative Survey Results for Contract Students ( n = 32) ............................... 31 Table 7 Quantitative Survey Results for Pass Off Students ( n = 28) ............................. 32 Table 8 Qualitative Survey Results for Pass Off Students ( n = 28) ............................... 33
PAGE 8
8 LIST OF ABBREVIATIONS Achievement a specific musical accomplishment, often the result of specific instruction. Reading notation, performing a specific piece. (Radocy & Boyle, 2003, p. 385). Contracts a researcher develop ed assessment ins trument based on a four level rating scale: Excellent (5), Good (4), Fair (3), and Poor (2). There are eight different scored categories: Tone, Rhythm, Note Recognition, Musicianship, Right Hand, Left Hand, Bowing, and Posture. The contrac t its elf is a list of items that the student must perform during the grading period. It consists of the list of items with the grading rubric. When a student is prepared to perform one of the requirement s on their contract , the teacher uses the scoring cri teria to assign the grade. Students are allowed to replay any tests to receive their desired final grade during the grading period. Pass Offs a researcher developed assessment instrument based on a pass (A) or fail (F) rating scale. The pass off sheet is a list of items that the student must perform during the grading period. “Passing off†refers to the fact that the student performer meets the high standard that would result in a perfect score. The student performs the selection “flawless ly .†Flaws would i nclude rhythmic errors, note errors, bowing errors, and posture problems. This system requires the assessor to have a high musical standard when listening and assigning scores. Feedback is given verbally to students when they do not perform to the set standard. When student s are prepared to perform one of the requirements on the pass off sheet , they perform for the assessor. If student s do not “ pass off †then they may try again at another time until they do “ pass off .†As presented to the students participating, pass offs must be completed in a specific order to achieve the desired grade. EX: Requirements # 1 through 5 must be completed to receive a “C.†If the student completed # 1 through 4 and then # 8 their grade would in effect be a “D.†Generals This c ategory exists on both the pass off sheet and the contract sheet . This category is open for student choice and possible developm ent outside of the “performancebased†grading procedure. Included in this category are 18 different choices ranging from compos ing to journal writing, and concert attendance. For some students this is a difficult category to complete while others have no problem at all. Much of its completion has to do with personal initiative.
PAGE 9
9 Motivation 1. T he act or an instance of motivating. 2. T he state or condition of being motivated. 3. S omething that motivates; inducement, incentive ( Motivation, N.D).
PAGE 10
10 Abstract of Project in Lieu of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Music THE EFFECT OF TWO DIFFERENT ASSESSMENT TOOLS ON SECONDARY INSTRUMENTAL MUSIC STUDENTS ’ ACHIEVEMENT AND MOTIVATION By David N. Olsen December 2009 Chair: Timothy S. Brophy Member : Silvio dos Santos M ajor: Music Education The purpose of this study was to explore the effect of two contrasting assessment tools on secondary instrumental music students’ achievement and motivation. The study was guided by the following questions: What is the effect on secondary instrumental music students’ achievement using a rubric based assessment tool versus a “pass off†based assessment tool? What is the effect of the assessment tool used on a student’s motivation to practice? Using two different assessment tools, the study examined 65 students in Grades 6, 7, and 8 and ranging in abilities from beginning through advanced. These students from the researcher’s class were randomly assigned through a systematic process to the researcher ’s created “pass off†grading system (n = 31) and the researcher’ screated rubric based “contract†grading system ( n = 34). The study was conducted over the course of the first 9week grading period of the school year. All students participated in a preand post test to attain “achievement†s core data and a survey to collect background information on the students ’ music education, experience, and their thoughts and feelings about their assessment procedure. Results indicated there was no statistically significant ( p < .05) effect of assessment tool on student
PAGE 11
11 achievement. D ata analysis did indicate significance in the post test achievement score as a function of assessment type and ensemble ( p = .04).
PAGE 12
12 CHAPTER 1 INTRODUCTION Finding the most appropriate way to assess music students, and music in general, has been a quest for music teachers at every grade level . How does one grade a subject, such as music, that can be so subjective and open to personal opinion and preference? One person’s opinion about a performance can be vastly different from another’s. In music education there are different opinions on performance quality. These opinions are formed by teachers’ experiences, their personal abilities, and education. There are ways, though, to help even the playing field for all educators and make the subjective assessment of a musical performance more objective. Using a more objective assessment process , such as an assessment procedure delineated by welldefined rubrics, it may be possible to achieve consistency among music educators ’ assessment scores . The process may also inspire students to achieve advanced goals by using the quantifiable guidelines in their grading procedure that are easy to read, understand, and put into practice. Significance of the Problem An ongoing challenge for instrumental music teachers is finding a defined and practical way of formally assessing students in the secondary instrumental music ensemble. Bell & Bell (2003) found that many times music teachers grade students based on nonmusical attributes such as attendance, behavior, effort, and attitude. Antmann (2007) found in his study of “successful†Florida middle school band programs that many middle school band directors are assigning grades to their students based on playing tests, concert attendance, conduct, and participation. Since s tudent assessment has become an increasingly major component in the public education system , it is even
PAGE 13
13 more imperative music teachers create and use meaningful forms of assessment in this subject. Asmus (1999) and Bell & Bell (2003) noted the importance of why music teachers must come up with meaningful ways of assessment and that documenting students’ learning of specific skills in a standards based curriculum helps demonstrate music education’s worth to the community at large. Purpose of the Study The purpose of this study is to explore the effect of two contrasting assessment tools on secondary instrumental music students’ achievement and motivation. This study is guided by the following questions: What is the effect of a rubric bas ed assessment tool versus a “pass off†based assessment tool on secondary instrumental music students’ achievement? What is the effect of the assessment tool used on a st udent’s motivation to practice? Delimitations The following were not accounted for in this study: g ender, ethnicity, or socioeconomic background of students participating in the study.
PAGE 14
14 CHAPTER 2 REVIEW OF LITERATURE Introduction Recent trends , such as the No Child Left Behind Act (NCLB), have brought our school systems into the national spotlight and focused attention on national standards and assessment . With federal, state, and local governments focusing on assessment, student achievement , and accountability , it is only appropriate to align our music curriculum with these ideas to remain a justifiable subject in our society (Asmus, 1999; Bell & Bell, 2003; Willoughby, Feifs, Baenen, & Grimes, 1995). NCLB has established the arts as part of our education system ’ s “core curriculum†(H.Res.1, 2002). Because of the arts ’ inclusion as core curr iculum , it is even more essential to develop meaningful assessment procedures to demonstrate music’s importance to a child’s education. In this review , the researcher explored philosophical and theoretical rationales about education and learning, and asses sment development and student motivation. Philosophical Rational Orenstein and Hunkins (2009) deliver an introduction to the four major philosophies that have influenced education in the United States : idealism, realism, pragmatism, and existentialism . They go on to w rite about four “agreedupon†educational philosophies . These educational philosophies are perennialism, essentialism, progressivism, and reconstructionism . Orenstein and Hunkins (2009) also indicate these educational philosophies have their roots in one or more of the previously mentioned major philosophies. T his researcher finds their philosophy of education aligned with a pragmatic essentialist view in that the purpose of education is to impart knowledge, skills, and values in students to mak e them self sufficient contributing
PAGE 15
15 members of society. The pragmatist believes in culturing critical thinking in students and using the scientific process (Orenstein & Hunkins, 2009, pg 37) . The essentialist believes in teaching the core set of basic ski lls and knowledge (Orenstein & Hunkins, 2009, pg. 56) . Knowing the foundation of educational philosophies will assist in the development of one’s own philosophy of education, which in turn drives curriculum development and choices in assessment procedures. Theoretical Background Learning theories can be viewed in a traditional sense and categorized into two camps: behavioral and cognitive (Abels, Hoffer & Klotman, 1995; Radocy & Boyle, 2003). Radocy and Boyle go on to write that learning is defined as “an observable change in behavior, due to experience, which is not attributable to anything else (p. 396). †Different forms of learning in the behaviorist camp include classical conditioning, established by Russian physiologist I. P. Pavlov, and operant conditioning, associated with works by B. F. Skinner. In c lassical conditioning, also known as Pavlovian conditioning, an unconditioned stimulus which elicits an unconditioned response is preceded by a conditioned stimulus to the point that the unconditioned response becomes a conditioned response when the unconditioned stimulus is removed and replaced with the conditioned stimulus. A classic example of this type of conditioning is in the experiments Pavlov performed with dogs. Pavlov exposed dogs to meat powder (unconditioned stimulus) and the dogs would salivate (unconditioned response) . Pavlov then rang a bell (conditioned stimulus) preceding each exposure to the meat powder and the dogs would salivate. Over time, the ringing of the bell made the dogs salivate (conditioned response) even when there was no meat powder presented. In operant conditioning, a desired response is made
PAGE 16
16 stronger through selective reinforcement. Radocy and Boyle give the example of “if an encaged pigeon pecks at a particular spot and receives food as a consequence, the pigeon is more likely to peck at that spot again and can learn to do it when requiring food (2009, p. 399).†Cognitive learning theories base their learning on organization and reorganization of learning structures (Radocy & Boyle, 2009) . Leading names in this area of research include Bruner, Kohler, and Piaget. T heoretical views in this camp stem from and include Gestalt theory. Gestalt theorists are primarily interested in perception, or understanding the whole, and thr ough the organization of this perception, information and concepts are learned (Radocy & Boyle, 2009). Swiss biologist Piaget has had a great impact on child development theory with his wellknown Stage Theory (Abeles et al., 1995). His theory presents four theoretical stages of child development: sensorimotor stage, preoperational stage, concrete operational thought, and finally formal operational thought. J. Bruner, an Am erican psychologist, also has developed a theory on child development that is divided into three, less rigid , stages: enactive, iconic, and symbolic. Studying these child development theories could assist teachers in understand ing the mental development and the readiness to learn of the particular students they are teaching and in turn, ca n sequence their curriculum to best fit the students’ learning process (Abeles et al., 1995). Applications of both learning theories find their way into the music classroom. Behavioral theories help with maintaining general order in the classroom and proper behavior. This theory can also be used in general rehearsal techniques as the teacher is giving verbal feedback to the students about their performance to increase the
PAGE 17
17 likelihood of the same response. This can be categorized as a form of operant conditi oning (Abeles et al., 1995). Cognitive theories help teachers organize information and curriculum into segments that can be taught to students and then analyzed to see if they have been learned and to what degree they have been learned. One such method that organizes information and then uses behavioral techniques to assess skill acquisition is Bloom’s Taxonomy of the Cognitive Domain (Abeles et al., 1995). Assessment Development Developing quality assessment methods presents its own challenges (Baker, Asc hbacher, Niemi, & Sato, 1992). Some teachers believe intensely that developing formal assessments will take away from the abstract nature of the art (Asmus, 19991; Willoughby et al.,1995). Baker, Aschbacher, Niemi, & Sato (1992) conducted a fiveyear research study on constructing an assessment method that would thoroughly test cognition of material taught. Their study explored the construction of accurate performancebased assessments. Assessments should measure students’ knowledge of skills and verify th at they have acquired the concepts taught in class. Assessment using rubrics helps guide the teacher in pacing instruction, present ing curriculum, and setting performance standards (Asmus, 19991). Asmu s goes on to state that having wellthought out learni ng objectives helps guide the teacher and the students through the curriculum and to the assessment method. Using rubrics can also help the students being assessed by giving them clear guidelines as to what is being assessed and what they can do to perform better in the future ( Asmus, 19992). Willoughby et al. (1995) write that rubrics document student progress and supply teachers with useful information when communicating with the principal, parents, and students. They also state how assessing an arts cour se can be subjective and having a method of formal
PAGE 18
18 assessment helps to objectify the results. Many school districts, as well as State and National associations such as The National Association for Music Education ( MENC) , have curriculum resources and guides that make a good starting point for targeting learning objectives and developing a sequence of instruction (Asmus , 19991). Motivation Abeles et al. (1995, pg. 212) define motivation as “ the energy that a learner employs when changing a behavior .†They g o on to state that educators primarily focus on a student’s secondary or psychological drives that “ include fear, love, frustration, curiosity, the need for stimulation, [ and] the need for achievement†(pg. 213). Assessment should be a motivational teachi ng tool (Baker, Aschbacher, Niemi, & Sato, 1992). In his study, Asmus (1986) focused on the opinions of the students and their ideas of success and failure pertaining to music class. Asmus us ed a 10item openended questionnaire to collect data from the st udents and then conceptualized those data using Weiner’s Attribution Theory . Weiner’s Attribution Theory is a framework that assumes that people try to determine why people do what they do ( L earning Theories Knowledgebase, 2009). When people succeed at som ething, they tend to attribute that success with their own skill and when people fail, they will attribute that failure to some external cause. Assessment using rubrics give s the students the power to self assess. The guidelines or descriptors, a key elem ent on the rubric chart (Asmus 19992), act as a guide in individuals ’ practice. Hewitt (1991) found research literature relating to self assessment on musical performances to be inconclusive, because of the l ack of study in this area, but self assessment d id have a positive result on students ’ practice attitude. Hewitt mentions that students who received training on how to self assess were more
PAGE 19
19 positive about music, their class es , and their teacher (pg. 309). Using a rubric to guide their practice, student’ s attitude toward practice may increase. Asmus ( 19991) notes that having a mappedout sequence of learning can lead to increased student motivation, in turn enhancing knowledge and skill.
PAGE 20
20 CHAPTER 3 METHODOLOGY AND PROCED U RES Introduction This action research was based on a mixed method research approach, dominant status sequential design, in which the researcher collected both quantitative and qualitative data (QUAN --> qual) . Permission for this study was acquired from the University of Florida Institutional Review Board (Appendix E). Permission was also acquired from the researcher’s school district’s Accountability, Research, and Assessment Department (Appendix F). The researcher used students from the researcher’s class who volunteered to participate. P arents completed consent forms (Appendix G) and students completed assent forms (Appendix H) in order to take part in the study. The study w as conducted during the first 9week grading period ( beginning September 7th and concluding on October 27th) of the 20092010 school year. Sixty five students volunteered to participate: 28 beginners, 12 intermediate, and 25 advanced students in G rades 6, 7, and 8. Study Procedures S tudents participating in this research were placed in their assessment method based on s ystematic sampling by their class. Each class was alphabetized by last name. The researcher then chose a random starting place in each list and placed every 3rd student into an assessment procedure: either the contract system or the pass off system. During the course of the 9week grading period , students were given three opportunities during their class period to test using their assessment procedure. Each of the student s tested by performing their musical selection for the entire class. Students on the rubric based contract received a numerical grade based on how they performed
PAGE 21
21 and based on the standards outlined in the rubric. Students using the pass off system either played their excerpt “ flawlessly †and “passedoff†or were stopped at the first flaw and told that they did not pass off. All students were given the opportunity to stay after school or come to school early to perform their excerpts again for the grade they desired. R equirements for beginning orchestra students’ tests were as follows : T he D major scale one octave with arpeggio pizzicato, memorized Essential Elements 2000 book 1 exercises # 9, 19, 22, and 34 Winning Rhythms charts 1, 2, 3, and 4 (students were required to count a randomly selected line from each chart) C omplete 3 out of 18 “generals.†Requirements for intermediate orchestra students ’ tests were as follows : the D and G Major scales two octaves (where appropriate for their instrument ) with arpeggio, memorized Essential Elements 2000 book 2 exercises # 36 and 47 and Essential Technique 2000 book 3 exercises 21 and 22 as one exercise and 23 and 24 as exercise Winning Rhythms charts 10, 11, 12, and 14 (students were required to count a randomly selected line from each chart) C omplete 3 out of 18 “generals.†Requirements f or a dvanced orchestra students’ test were as follows : T he D, G and C Major scales two octaves (where appropriate for their instrument ) with arpeggio, memorized
PAGE 22
22 Essential Technique 2000 book 3 exercises 21 and 22 as one exercise, # 35, a nd # 47 and 48 as one exercise Winning Rhythms charts 14, 15, 16, and 17 (students were required to count a randomly selected line from each chart) C omplete 3 out of 18 “generals.†Data Collection and Analysis Data were collected through a preand post test pr ocedure and a concluding survey with which the researcher gathered background information regarding students’ musical experience and their experiences and feelings concerning their assessment procedure. Pre and Post Test The preand post test consisted of each stu dent recording two of their required materials, a “prepared piece†and a scale, and also a sight reading exercise. The beginning students’ required materials were exercise # 34 and the memorized D Major scale with arpeggio (pizzicato) . The intermediate and advanced students’ required materials were exercise # 21 and 22 as a continuous line and the memorized D Major scale two octaves with arpeggio (bowed) . For the pre test , students were only told that they were to play a prepared piece , a scale, and a sight reading exercise. The students were not informed what those items would actually consist of until they entered the recording area. Once the students recorded, they were informed that the selections constituting the pretest would make up the post test and the only element that would change would be the sight reading exercise. The pretest occurred a few weeks into the 9 week grading period to give all students the opportunity to begin work on all exercises required of them for their grading procedure. The post test occurred after the end of the
PAGE 23
23 grading period, when all grades had been finalized . Students again recorded the prepared piece and scale mentioned previously and a slightly different sight reading than before. Recordings were made in the researcher ’s office using a laptop, an external microphone, and the computer program Audacity. R esults of the preand post test were analyzed by the researcher using an instrument based on the “Woodwind/Brass Solo†evaluation form (Saunders & Holahan, 1997) ( Appendix A ) . This form, while used in almost its complete state, was modified to work with string performers rather than wind performers . Elements such as breath control and articulation were replaced with bowing and bow control. Other elements such as embouchur e formation and observable posture elements were eliminated from scoring. Survey The survey was constructed by the researcher using the online survey creator Surve y Monkey ( Appendix B ). The information collected was used to gain background information about the students’ education in music . The information collected in the survey included their time spent in music class during school, the number of months they took private lessons on their orchestra instrument and/or piano, the average number of minutes per week they spent practicing their instrument during the grading period, the number of months the student had been playing an orchestra instrument, and their thoughts and opinions on their experience during the grading period as it related to their assign ed assessment procedure. The survey was administered during each students’ 46minute class period in the school’s computer lab. The students were given written instructions on how to access the Survey Monkey survey. Students who needed help raised their hands and were helped by the researcher. R esults of the survey yielded both quantitative and qualitative data.
Other data collected included each student's assigned assessment procedure (pass-offs or contract), grade level, age in months, instrument (violin, viola, cello, or bass), and ensemble level (beginning, intermediate, or advanced).
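To make these variables concrete, the sketch below shows one way the collected data could be laid out as a flat table, one row per student. The column names are hypothetical labels for the variables described above; the study does not specify a data format.

```python
import pandas as pd

# Hypothetical flat layout for the study's data: one row per student,
# combining the grouping variables above with the survey measures.
columns = [
    "student_id",     # anonymized identifier
    "assessment",     # "contract" or "pass_off"
    "ensemble",       # "beginning", "intermediate", or "advanced"
    "grade_level",    # 6, 7, or 8
    "age_months",
    "instrument",     # violin, viola, cello, or bass
    "lesson_months",  # months of private lessons on the orchestra instrument
    "piano_months",   # months of piano lessons
    "play_months",    # months playing an orchestra instrument
    "class_min",      # minutes of music class per day (46, 92, or 138)
    "practice_min",   # self-reported average practice minutes per week
    "pre_total",      # pre-test total score
    "post_total",     # post-test total score
]

records = pd.DataFrame(columns=columns)  # rows would come from the survey export
```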
CHAPTER 4
RESULTS

Pre- and Post-Test Results

Analyses of variance for the pre- and post-test total scores as a function of assessment type were conducted. For the pre-test, minutes of music instruction was a highly significant predictor of a student's score (p = .005), along with months of playing the instrument (p = .03); assessment type was not significant (p = .70). The post-test as a function of assessment type yielded slightly different results: assessment type was now significant (p = .04), months of playing an orchestra instrument remained significant (p = .04), and minutes of music instruction was no longer significant (p = .80).

Observing the effects of assessment type and ensemble together yielded a statistically interesting result. An analysis of variance of the pre-test total scores using two independent variables (ensemble and assessment type) indicated that minutes of music instruction in school was a significant predictor of the scores the students received (p = .01). The same analysis of the post-test total scores indicated that ensemble, while still not statistically significant, was a stronger predictor (p = .22) than minutes of music instruction (p = .97), which was no longer significant.

An analysis of variance using assessment type combined with ensemble revealed further results (Table 1). The post-test total score as a function of assessment type alone was not statistically significant (p = .29). The post-test total score as a function of the ensemble in which the students were enrolled was significant (p = .02), and the post-test total score as a function of assessment type and ensemble together was also statistically significant (p = .04).
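The thesis does not report the software used for these analyses. As an illustration only, the sketch below shows how a two-way analysis of variance like the one behind Table 1 could be run with Python's statsmodels. The data here are synthetic stand-ins (the study's per-student scores are not published), and the column names follow the hypothetical layout sketched in Chapter 3.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Synthetic stand-in for the study's 65 students; only the model
# specification is meant to mirror the analysis reported in Table 1.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "assessment": rng.choice(["contract", "pass_off"], size=65),
    "ensemble": rng.choice(["beginning", "intermediate", "advanced"], size=65),
    "post_total": rng.normal(loc=80, scale=10, size=65),
})

# Two-way ANOVA: post-test total score as a function of assessment type,
# ensemble, and their interaction. (Type II sums of squares are assumed;
# the thesis does not state which type was used.)
model = ols("post_total ~ C(assessment) * C(ensemble)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # df, F, and p for each source
```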
Table 1
Analysis of Variance for Post-Test Total Scores as a Function of Assessment Type and Ensemble

Source                        df      F      p
Assessment Type                1    1.14    .29
Ensemble                       2    4.22    .02*
Assessment Type × Ensemble     2    3.29    .04*
Error                         59
Total                         64

Note: * p < .05

The mean post-test total score in the beginning orchestra was 73.96. When the beginners' post-test total scores were delineated by assessment type, the students who used the contract had a mean score of 81.63, while those on the pass-offs had a mean score of 64.88. For the intermediate orchestra as a whole, the mean post-test total score was 77.54; students on the contract had a mean score of 78.21, and students on the pass-offs earned a mean score of 76.60. In the advanced orchestra, the mean post-test total score was 85.78; students using the contract earned a mean score of 83.21, while those on the pass-offs earned a mean score of 88.15. Table 2 shows the post hoc comparison of the post-test total scores of all assessment types as a function of the ensemble in which the students were enrolled.
Table 2
Post Hoc Comparison of Post-Test Total Scores of All Assessment Types as a Function of Ensemble

Assessment Type    Ensemble        Difference of Means      p
Pass-Offs          Beginning             16.75             .07
                   Intermediate           1.61            1.00
                   Advanced               4.95             .96

Note: Each comparison is against the post-test total score of that ensemble's contract group.
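The thesis does not name the post hoc procedure behind Table 2. One plausible reconstruction is a Tukey HSD comparison over the six assessment-by-ensemble cells, in which each pair matching a contract group with the pass-off group of the same ensemble corresponds to a row of Table 2; continuing with the synthetic df from the sketch above:

```python
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Label each student by assessment-by-ensemble cell, then compare all
# pairs of cells. Rows pairing "contract_X" with "pass_off_X" for the
# same ensemble X parallel the comparisons reported in Table 2.
groups = df["assessment"] + "_" + df["ensemble"]
result = pairwise_tukeyhsd(endog=df["post_total"], groups=groups, alpha=0.05)
print(result.summary())
```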
Observing the effects of assessment type on the average minutes of practice per week also yielded interesting results. The mean for students on the contract was 88.5 minutes per week, while those on the pass-off assessment practiced a mean of 149.5 minutes per week, a 51.3% difference (percentage differences here are taken relative to the mean of the two values; see the sketch following Table 4). Table 3 shows the significance of assessment type and ensemble for average practice time in minutes. Breaking the data down by ensemble yields much the same result, in that the pass-off students practiced longer: beginning students on the contract averaged 69.5 minutes versus 87.7 minutes for pass-offs, a difference of 23.3%; intermediate students on the contract averaged 88.3 minutes versus 116.0 minutes for pass-offs, a difference of 27.1%; and advanced students on the contract averaged 112.5 minutes versus 224.2 minutes for pass-offs, a difference of 66.3%. Table 4 shows the average practice time in minutes as a function of assessment type and ensemble; a significant effect of assessment type on average weekly practice time appears only in the advanced orchestra.

Table 3
Analysis of Variance for Average Minutes of Practice as a Function of Assessment Type and Ensemble

Source                        df      F      p
Assessment Type                1    6.93    .01*
Ensemble                       2    9.87    .00*
Assessment Type × Ensemble     2    2.85    .07
Error                         59
Total                         64

Note: * p < .05

Table 4
Post Hoc Comparison of Average Minutes of Practice Time per Week as a Function of All Assessment Types and Ensemble

Assessment Type    Ensemble        Difference of Means      p
Pass-Offs          Beginning             18.23             .98
                   Intermediate          27.71             .99
                   Advanced             111.73             .01*

Note: Each comparison is against the student-reported average practice time per week of that ensemble's contract participants. * p < .05
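The percentage differences reported above (51.3%, 23.3%, 27.1%, and 66.3%) are consistent with a percent difference taken relative to the mean of the two group means, rather than relative to either group alone. The thesis does not state the formula, but the following check reproduces the reported figures to within rounding of the published means.

```python
def percent_difference(a: float, b: float) -> float:
    """Absolute difference expressed as a percentage of the mean of a and b."""
    return abs(a - b) / ((a + b) / 2) * 100

# Contract vs. pass-off mean practice minutes per week, from the text above.
for label, contract, pass_off in [
    ("all students", 88.5, 149.5),   # reported as a 51.3% difference
    ("beginning", 69.5, 87.7),       # reported as 23.3%
    ("intermediate", 88.3, 116.0),   # reported as 27.1%
    ("advanced", 112.5, 224.2),      # reported as 66.3%
]:
    print(f"{label}: {percent_difference(contract, pass_off):.1f}%")
```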
Survey Results

Much of the survey collected information to support the pre- and post-test analyses; it also collected students' thoughts and feelings about their assessment procedure. All 65 students participating in the research completed the survey items necessary for analyzing the pre- and post-test data: how many months they had been taking private lessons on their orchestra instrument, how many months they had been taking piano lessons, on average how many minutes per week they practiced, and how many minutes of music class they had each day. Sixty-one of the 65 students completed the remainder of the survey, which consisted of questions concerning their thoughts and feelings about their assessment procedure; some students skipped various questions for reasons unknown to the researcher.

Contract

Table 5 shows the quantitative survey results for the students using the contract assessment procedure. Questions are listed with the responses from which the students could choose. The first two questions allowed multiple answers; the last five allowed only one.
Table 5
Quantitative Survey Results for Contract Students (n = 32)

What was your FIRST feeling(s) about the testing method for which you were chosen? Mark all answers that apply.
  Happy: 72.2% (21)
  Sad: 0.0% (0)
  Encouraged: 41.4% (12)
  Discouraged: 0.0% (0)
  Did not care either way: 34.5% (10)
  Other: 31.0% (9)
  Skipped question: (3)

How did you feel after you took your FIRST test using your testing method? Mark all answers that apply.
  Happy: 59.4% (19)
  Sad: 0.0% (0)
  Shocked: 9.4% (3)
  Encouraged: 37.5% (12)
  Discouraged: 0.0% (0)
  Did not care either way: 31.3% (10)
  Skipped question: (0)

After you took a test, how did you feel?
  Encouraged to continue: 96.6% (28)
  Discouraged to continue: 3.4% (1)
  Skipped question: (3)

My testing method made me practice more.
  I agree: 93.8% (30)
  I disagree: 6.3% (2)
  Skipped question: (0)

My testing method made me WANT to practice.
  I agree: 75.0% (24)
  I disagree: 25.0% (8)
  Skipped question: (0)

My testing method made me NOT want to practice.
  I agree: 6.3% (2)
  I disagree: 93.8% (30)
  Skipped question: (0)

My testing method helped me become a better player on my orchestra instrument.
  I agree: 83.9% (26)
  I disagree: 16.1% (5)
  Skipped question: (1)
Analyzing the qualitative data collected in the survey concerning the contract revealed several themes. The data in Table 6 consist of patterns found in the students' written responses to the following questions about their assessment procedure: What were three things that you liked about the way you were tested? What were three things that you did not like about the way you were tested? In a few words, write any final thoughts, feelings, and/or experiences about your testing experience.

Table 6
Qualitative Survey Results for Contract Students (n = 32)

"Like about the contract"
  Could see what was done wrong and how to fix it.
  Could improve the grade by trying again.
  The order in which requirements were completed did not matter.
  Got a grade no matter what.
  Could test when you wanted to.

"Dislike about the contract"
  Testing in front of the class.
  Inability to make a perfect score.

"Practice attitude"
  None.

"Final thoughts"
  Helped make me a better player.
  Could always do it again.
  Could always get a higher grade.

Pass-Offs

The data in Table 7 are the quantitative survey results for the pass-off students. Questions are listed with the responses from which the students could choose. The first two questions allowed multiple answers; the last five allowed only one.
Table 7
Quantitative Survey Results for Pass-Off Students (n = 28)

What was your FIRST feeling(s) about the testing method for which you were chosen? Mark all answers that apply.
  Happy: 27.3% (6)
  Sad: 13.6% (3)
  Encouraged: 31.8% (7)
  Discouraged: 18.2% (4)
  Did not care either way: 31.8% (7)
  Other: 50.0% (11)
  Skipped question: (6)

How did you feel after you took your FIRST test using your testing method? Mark all answers that apply.
  Happy: 48.1% (13)
  Sad: 18.5% (5)
  Shocked: 40.7% (11)
  Encouraged: 29.6% (8)
  Discouraged: 25.9% (7)
  Did not care either way: 3.7% (1)
  Skipped question: (1)

After you took a test, how did you feel?
  Encouraged to continue: 85.2% (23)
  Discouraged to continue: 22.2% (6)
  Skipped question: (1)

My testing method made me practice more.
  I agree: 92.6% (25)
  I disagree: 7.4% (2)
  Skipped question: (1)

My testing method made me WANT to practice.
  I agree: 77.8% (21)
  I disagree: 22.2% (6)
  Skipped question: (1)

My testing method made me NOT want to practice.
  I agree: 22.2% (6)
  I disagree: 77.8% (21)
  Skipped question: (1)

My testing method helped me become a better player on my orchestra instrument.
  I agree: 88.9% (24)
  I disagree: 11.1% (3)
  Skipped question: (1)
Analyzing the qualitative data collected in the survey concerning the pass-offs also revealed themes. The data in Table 8 consist of patterns found in the students' written responses to the following questions: What were three things that you liked about the way you were tested? What were three things that you did not like about the way you were tested? In a few words, write any final thoughts, feelings, and/or experiences about your testing experience.

Table 8
Qualitative Survey Results for Pass-Off Students (n = 28)

"Like about pass-offs"
  It was quick.
  You did not really have to worry about your grade.
  Made me want to practice more so I could pass off the first time.
  Could get a "perfect" score.
  It was fun.
  Could improve your grade by trying again.
  Could test when you wanted to.

"Dislike about pass-offs"
  Had to keep doing it until you got it right for a good grade.
  It was hard to pass off.
  Required perfection.
  More pressure (stress).
  No passing = zero.
  Order of grades.
  One mistake = fail.
  Testing in front of class.
  Took a longer time.

"Practice attitude"
  Encouraged to practice.

"Final thoughts"
  It was too hard.
  Made me practice more.
CHAPTER 5
DISCUSSION

Introduction

This study began with an invitation to the researcher's 140 middle school orchestra students to participate. Of those 140 students, 75 fully completed the consent and assent forms and began the study. Over the course of the 9-week grading period, 10 students were lost for various reasons: extended absences due to the flu, students moving away, and general absences that prevented them from completing all parts of the pre- and post-test treatments or the survey. The data collected on those 10 students were therefore excluded from the analysis.

Using their assigned assessment method, all students were assessed during the school day on three occasions during the 9-week grading period. For any requirement a student had not completed, or wanted to score higher on, after those days, the student was responsible for testing before or after school with the researcher.

Early on, the researcher noticed that the beginners, who had no prior experience with either form of grading, took to the pass-offs quickly. After the first testing day, many of the beginners understood the expectation for something to be "passed off," and after the first testing experience most beginners passed off on their first try. The students who had experience with the contract testing procedure in previous years took longer to understand what was expected of them to get "passed off"; many would attempt to pass off a requirement up to five times before actually achieving their goal. The researcher observed behavioral cues as to what the students were feeling in their pass-off experience; anger and frustration were apparent. Many times the
researcher feared that students using the pass-off assessment would quit the class because of the high expectations and the pressure to earn a good grade (pressure applied by the students themselves and by their parents/guardians), but he reassured the students that this was only a research project and that they would be taken care of at its conclusion. In the end, and without informing the students so as not to affect their motivation to achieve their chosen final grade, the researcher added points to the final grades of all students who participated, so that participation in the research would not affect their grades in the course.

Achievement

What are the effects on secondary instrumental music students' achievement of using a rubric-based assessment tool versus a "pass-off"-based assessment tool? Statistically speaking, there is no significant difference in student achievement between the two assessment tools. However, when the post-test total scores are compared by the ensemble in which the students were enrolled, a slightly different picture emerges. The data in Table 1 indicate that a student's ensemble had a statistically significant effect on the post-test total score (p = .02), and that the post-test total score was also statistically significant when assessment type and ensemble covaried. The researcher observed that students in the beginning and intermediate orchestra classes on the contract seemed to score higher on the post-test than those on pass-offs; conversely, students in the advanced class on the pass-offs seemed to score higher on the post-test than those on the contract. In the researcher's opinion, this difference between the ensembles could stem from a combination of factors. First, students in the advanced class are placed there for a
particular reason: to be placed in that class, a student had to audition and perform at a higher level. This in itself indicates a high level of personal achievement, and it suggests a high motivation to achieve and possibly a certain "competitive spirit." The pass-off sheet would seem to feed these traits: high motivation, desire for a high level of personal achievement, and competitiveness. To "pass off," students must be motivated to try again when they experience failure, and there is a competitive element in seeing who can pass off first. This researcher observed that many of his most advanced players enjoyed the pass-off procedure not just because it was something new, but because it was "challenging" and "fun." He also noted that, in all classes, for students who were not as serious about performing, who took the class as something fun to do, or who were beginners just introduced to the instrument, pass-offs were a serious issue and very uninspiring.

It is essential to note the findings in Table 2: beginners using the contract assessment received a higher total score on the post-test. While the finding that beginning orchestra students on the pass-offs scored lower on the post-test total score was not statistically significant (p = .07), there was a 16.75 difference of means between the two groups. This difference has practical importance for educators, who want the best educational assessment tools to be used; here the contract assessment does seem to work better than the pass-off method. In the beginning and intermediate classes, the results suggest that a rubric-based contract assessment method is beneficial. Students seemed to perform better when given the opportunity to reflect and learn through the small incremental
achievements gained through the researcher's contract assessment process. Students were given a score each time they performed, based on eight graded areas of their performance. They were then able to reflect on their performance by referring to the grading rubric, make adjustments, and take the test again to earn a higher grade. This constant cycle of grading, reflection, guided practice, and grading again lends itself well to younger, less experienced students who need more guidance and care.

Motivation

Is a student's motivation to practice different based on the assessment tool used? The data gained through the survey and researcher observation were fascinating with regard to motivation to practice. Considering all cases observed, students assigned the pass-off procedure practiced, on average, about 1.7 times as much as those using the contract (149.5 versus 88.5 minutes per week). This researcher believes the reason is the extrinsic motivation caused by the pressure of making a good grade in the class; they practiced, as one student put it, to "make the grade." This pressure was most likely placed on them by their parents/guardians and, in turn, by themselves. Perhaps students assigned the contract method engaged in more goal-oriented practice that allowed them to get more done in a shorter amount of time: quality of practice versus quantity of practice. Students practicing with the contract could use the rubric to reflect on the concepts on which they needed to work. For some of the more advanced students, it seems to this researcher, participating in the pass-off assessment procedure was truly a thrill, a game, and in fact intrinsic in nature. Often, students in this latter group have crossed over in their musical experience from extrinsically motivating factors to the sheer enjoyment of the subject and of performing.
As far as motivation to practice is concerned, music teachers walk a fine line. We must know our students, their lives, and their personalities, and we must tailor our curriculum to their individual needs. Most music students are motivated to make good grades in our classes; some are fine with grades below an "A," while others must have "straight As" and will not rest until they achieve that goal. On a pass-off system, it is easy for the teacher to set the expectation for a pass-off so high that students become frustrated and quit because they experience no success. Over time, it is also easy for teachers to fall into the trap of lowering their expectations so that students can feel success. It is this researcher's belief that a system such as the rubric-based contract assessment is more appropriate for the average student. The data in Table 8 show the thoughts and feelings of students on the pass-off system: they reported feeling more stressed about performing, and they found the system hard compared with the contract because one mistake meant a zero for a grade. With the rubric-based assessment, students can be motivated by receiving a numerical grade no matter how they perform. Students who want a higher score have the option to test again, while students who are content being average can play once for whatever score they receive. The difference between pass-offs and contracts here is that all students get some sort of grade on a contract, rather than failing music class because they could not play under pressure. Our goal as music educators is to teach a lifelong appreciation for our art, not to push students away, and teach them to hate music, because they cannot perform something perfectly.
Conclusion

It is not enough for teachers to take the route of pass-offs simply because it has the potential to create outstanding music programs: programs with students who can perform what they have been conditioned to play perfectly. All music teachers' standards of performance and backgrounds differ; what qualifies as quality to one educator may not to another. Ultimately, in music we are often dealing with a person's opinion of a performance rather than with facts. Facts, on the other hand, cannot be disputed: "You did not use the correct bowing." "You did not play with the correct rhythm." These are concrete statements that no one can contest. Assessment that is based in a rubric will be respected by other subject areas and will be understood by, and gain the support of, students, parents, and administrators. With a tailored, rubric-based curriculum, students can learn the many components of performance. They learn to properly develop all parts of their craft and technique, and through this development, over an extended period of time, we as music educators will foster a lifelong love of music and perhaps continue students' participation in and support of the arts as adults.

Teaching is an evolutionary process, and much of my career as a music teacher can be described as such. My philosophies on assessment and curriculum have changed dramatically since I began teaching ten years ago. What started as teaching to get through the day, teaching to the performance, and teaching to make high scores at festivals and competitions with my groups has developed into teaching the child, teaching with an end goal in mind, and teaching with the philosophical belief that, in the end, music must be taught to enrich the child's life, strengthen creativity, and broaden appreciation of music. The assessment of students
should be based on the concepts and skills you feel are essential for them to learn. Teachers should use their state's standards as a guide in developing such assessments. This will encourage and develop lifelong learning skills in each student, not just skills to "pass the test." Through this research, I have affirmed my belief in using rubrics in assessment. I have also come to believe that when students are trained to use a rubric-based assessment procedure, they will conceptualize and organize their practice time and learn more efficiently. Rubrics can serve as essential guides in their practice and will instill good practice habits and routines for the future.
APPENDIX A
PRE- AND POST-TEST SCORING SHEET
APPENDIX B
SURVEY QUESTIONS

Background: The following questions will gather background information about you, your music knowledge, and your music education. Please take your time and answer every question honestly and to the best of your ability. If you have any questions about a question, please ask Mr. Olsen.

Please type your name: Last, First

How many total months have you played your orchestra instrument? Please include any elementary orchestra time, even if you switched instruments when you came to middle school.

How many months have you taken private lessons on your orchestra instrument? Please state your answer to the closest half month, e.g., 3 months or 3.5 months (three months or three and a half months). Remember there are 12 months in a year. Zero CAN be an answer.

How many months have you taken piano lessons? Please state your answer to the closest half month, e.g., 3 months or 3.5 months (three months or three and a half months). Remember there are 12 months in a year. Zero CAN be an answer.

On average, how many MINUTES per week do you practice your orchestra instrument?

How many minutes in the school day do you have music class?
o 46 minutes (one class period of music)
o 92 minutes (two class periods of music)
o 138 minutes (three class periods of music)

Questions about the testing procedure: The following questions deal with the process that you went through during the testing procedure. Please take your time and answer each question honestly and to the best of your ability. If you have any questions about a question, please ask Mr. Olsen.

Which testing method were you selected for?
What was your FIRST feeling(s) about the testing method for which you were chosen? Mark all answers that apply.
o Happy
o Sad
o Encouraged
o Discouraged
o Did not care either way
o Other (please specify): A blank text box was given for their own responses.

How did you feel after you took your FIRST test using your testing method? Mark all answers that apply.
o Happy
o Sad
o Shocked
o Encouraged
o Discouraged
o Did not care either way

In a few short statements, answer the question that applies to your test. A blank text box was given for their own responses.
o Pass-offs: How did it make you feel when you "passed off?"
o Grading contract: How did it make you feel when you received your grade on your test?

What were three things that you liked about the way you were tested? Please number your answers. A blank text box was given for their own responses.

What were three things that you did not like about the way you were tested? Please number your answers. A blank text box was given for their own responses.

After you took a test, how did you feel?
o Encouraged to continue
o Discouraged to continue
My testing method made me practice more.
o I agree
o I disagree

My testing method made me WANT to practice.
o I agree
o I disagree

My testing method made me NOT want to practice.
o I agree
o I disagree

My testing method helped me become a better player on my orchestra instrument.
o I agree
o I disagree

In a few words, write any final thoughts, feelings, and/or experiences about your testing experience. A blank text box was given for their own responses.
APPENDIX C
EXAMPLE CONTRACT SHEET AND SCORING RUBRIC
APPENDIX D
EXAMPLE PASS-OFF SHEET
APPENDIX E
UF IRB PROTOCOL LETTER OF APPROVAL
APPENDIX F
ORANGE COUNTY PUBLIC SCHOOLS' REQUEST FOR RESEARCH FORM
APPENDIX G
PARENT CONSENT FORM
APPENDIX H
STUDENT ASSENT FORM
LIST OF REFERENCES

Abeles, H. F., Hoffer, C. R., & Klotman, R. H. (1995). Foundations of Music Education (2nd ed.). New York, NY: Schirmer Books.

Antmann, M. D. (2007). Assessment and Grading in the Beginning Band Classroom (Master's thesis). Florida State University, Florida. Retrieved December 5, 2009, from http://etd.lib.fsu.edu/theses/available/etd07092007172856/

Asmus, E. P. (1999a). Music Assessment Concepts. Music Educators Journal, 86(2, Special Focus: Assessment in Music Education), 19-24.

Asmus, E. P. (1999b). Assessment. Retrieved August 2009, from the Music Assessment Web Site at the University of Miami: http://www.music.miami.edu/assessment/index.html

Asmus, E. P., Jr. (1986). Student Beliefs about the Causes of Success and Failure in Music: A Study of Achievement Motivation. Journal of Research in Music Education, 34(4), 262-278.

Baker, E. L., Aschbacher, P. R., Niemi, D., & Sato, E. (1992). CRESST Performance Assessment Models: Assessing Content Area Explanations. California: UCLA Center for the Study of Evaluation. Retrieved from http://www.cse.ucla.edu/products/guidebooks/cmodels.pdf

Bell, A., & Bell, M. (2003). Developing Authentic Assessment Methods from a Multiple Intelligences Perspective. (ERIC Document Reproduction Service No. ED 479391)

Hewitt, M. P. (2001). The Effects of Modeling, Self-Evaluation, and Self-Listening on Junior High Instrumentalists' Music Performance and Practice Attitude. Journal of Research in Music Education, 49(4), 307-322. Retrieved from International Index to Music Periodicals database.
Learning Theories Knowledgebase. (2009, December). Attribution Theory (Weiner) at Learning-Theories.com. Retrieved December 3, 2009, from http://www.learning-theories.com/weiners-attribution-theory.html

Motivation. (n.d.). Dictionary.com Unabridged. Retrieved November 23, 2009, from Dictionary.com website: http://dictionary.reference.com/browse/Motivation

H.R. 1, 107th Cong., 115 Stat. 1425 (2002) (enacted). Public Law 107-110, No Child Left Behind Act of 2001 (short title). [Electronic version]

Ornstein, A. C., & Hunkins, F. P. (2009). Curriculum: Foundations, Principles, and Issues (5th ed.). Boston, MA: Pearson.

Radocy, R. E., & Boyle, J. D. (2003). Psychological Foundations of Musical Behavior (4th ed.). Springfield, IL: Charles C. Thomas.

Saunders, T. C., & Holahan, J. M. (1997). Criteria-Specific Rating Scales in the Evaluation of High School Instrumental Performance. Journal of Research in Music Education, 45(2), 259-272. Retrieved from http://www.jstor.org/stable/3345585

Willoughby, M., Feifs, H., Baenen, N., & Grimes, E. (1995). Behind the Scenes: Measuring Student Progress in the Arts and Beyond (Evaluation and Research Report No. 95E.06). Raleigh, NC: Evaluation and Research Dept., Wake County Public School System.
BIOGRAPHICAL SKETCH

A native of Marshfield, Maine, Mr. David N. Olsen received his Bachelor of Music Education degree from the University of Florida in 2000. He then obtained a position in Alachua County, Florida, as Director of Music at Lincoln Middle School, where he taught for seven years. Mr. Olsen's bands and orchestras have consistently received high honors at the district level and at other music festivals. Mr. Olsen served as Alachua County Middle School Honor Band Chairman, the District Liaison for Secondary Music in the Alachua County School System, and the School Board of Alachua County Representative for the Alachua County Youth Orchestra Executive Board. He is on the Board of Advisors for the National Adjudicators Middle School Festival and also serves as a guest clinician for numerous ensembles. Professional memberships include the Florida Bandmasters Association, the Florida Music Educators Association, the Music Educators National Conference, and the Florida Orchestra Association. Mr. Olsen currently teaches orchestra at Chain of Lakes Middle School in Orlando, Florida, in the Orange County Public School system, and he resides in Winter Park.