Construction, Validation, and Administration of a Diagnostic Test of Cello Technique for Undergraduate Cellists

Permanent Link: http://ufdc.ufl.edu/UFE0019642/00001

Material Information

Title: Construction, Validation, and Administration of a Diagnostic Test of Cello Technique for Undergraduate Cellists
Physical Description: 1 online resource (163 p.)
Language: English
Creator: Mutschlecner, Timothy
Publisher: University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: 2007

Subjects

Subjects / Keywords: cello, diagnostic, education, music, string, teaching, technique, tests
Music -- Dissertations, Academic -- UF
Genre: Music Education thesis, Ph.D.
bibliography   ( marcgt )
theses   ( marcgt )
government publication (state, provincial, territorial, dependent)   ( marcgt )
born-digital   ( sobekcm )
Electronic Thesis or Dissertation

Notes

Abstract: The purpose of this study was to construct, validate, and administer a diagnostic test of cello technique for use with undergraduate cellists. The test consisted of three parts: (1) a written test, which assessed a student's understanding of fingerboard geography, intervals, pitch location, and note reading; (2) a playing test, which measured a student's technique through the use of excerpts from the standard cello repertoire; and (3) a self-assessment form, through which students could describe their experience, areas of interest, and goals for study. A criteria-specific rating scale with descriptive statements for each technique was designed for use with the playing test. The written test, playing test, and self-assessment were pilot-tested with five undergraduate students at a university in the Southeast. A validation study was conducted to determine to what extent teachers felt this test measured a student's technique; nine cello teachers at the college and preparatory levels were asked to evaluate the test. The test was then administered to 30 undergraduate cellists at universities in the southeastern United States. Strong interitem consistency was found for the written test (KR-20 = .95), and internal consistency of the playing-test items was high (α = .92). Interjudge reliability of the playing test was also high, as measured by comparing the independent evaluations of two judges with the researcher's evaluations using Pearson's r (Judge A r = .92; Judge B r = .95). Other conclusions drawn from the study include: (1) piano experience has a significant positive effect on playing-test results (R² = .15); (2) the playing test is a good predictor of teachers' rankings of their students' technique; and (3) year in school, degree program, and years of playing experience were not significant indicators of students' playing ability as measured by this test. Participating teachers described the test as a valuable tool for evaluating students and charting their course of study, and found it an efficient means of identifying a student's strengths and weaknesses in cello technique.
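Note on the reported statistics: the abstract cites standard reliability coefficients (KR-20 interitem consistency for the dichotomously scored written test, coefficient alpha for the rated playing-test items, and Pearson's r for interjudge agreement). For reference only, the short Python sketch below shows how such coefficients are conventionally computed; the score matrices are hypothetical placeholders, not data from the dissertation.

import numpy as np

def kr20(items):
    """Kuder-Richardson 20 for a students-by-items matrix of 0/1 scores."""
    k = items.shape[1]
    p = items.mean(axis=0)                      # proportion answering each item correctly
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total test scores
    return (k / (k - 1)) * (1.0 - (p * (1.0 - p)).sum() / total_var)

def cronbach_alpha(ratings):
    """Coefficient alpha for a students-by-items matrix of scale ratings."""
    k = ratings.shape[1]
    item_var = ratings.var(axis=0, ddof=1).sum()
    total_var = ratings.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_var / total_var)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ability = rng.normal(0.0, 1.0, size=30)                  # hypothetical latent skill for 30 students
    written = (rng.random((30, 20)) < 1 / (1 + np.exp(-ability[:, None]))).astype(int)
    playing = np.clip(np.round(3 + ability[:, None] + rng.normal(0, 0.8, (30, 12))), 1, 5)
    judge_a = playing.sum(axis=1) + rng.normal(0, 2, 30)     # hypothetical second rater's totals
    researcher = playing.sum(axis=1)
    print("KR-20 (written test):", round(kr20(written), 2))
    print("Alpha (playing test):", round(cronbach_alpha(playing), 2))
    print("Interjudge Pearson r:", round(np.corrcoef(judge_a, researcher)[0, 1], 2))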
General Note: In the series University of Florida Digital Collections.
General Note: Includes vita.
Bibliography: Includes bibliographical references.
Source of Description: Description based on online resource; title from PDF title page.
Source of Description: This bibliographic record is available under the Creative Commons CC0 public domain dedication. The University of Florida Libraries, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.
Statement of Responsibility: by Timothy Mutschlecner.
Thesis: Thesis (Ph.D.)--University of Florida, 2007.
Local: Adviser: Brophy, Timothy S.

Record Information

Source Institution: UFRGP
Rights Management: Applicable rights reserved.
Classification: lcc - LD1780 2007
System ID: UFE0019642:00001

This item has the following downloads:


Full Text
281b750c87fff6e60d0720a1b260a6aa71638c93
76946 F20101208_AABSVJ mutschlecner_t_Page_030.jpg
fbec7769ad8d4672f8f75e88cdf08a41
3df11c6d3e7076ad3b0747e844000d48e947ed84
29183 F20101208_AABTZA mutschlecner_t_Page_102.QC.jpg
c36f79781d1eaeedc6b857073aa6b89e
ef2fe115290c25822f2aa395d2613ae91470b2db
18180 F20101208_AABTYM mutschlecner_t_Page_091.QC.jpg
d4e5dac22fd8a41747b2bcdd44abfcc4
e94a5b111320b4c18617607821bb7fd37fd1bc84
25486 F20101208_AABTXX mutschlecner_t_Page_079.QC.jpg
9654c50497aed261f9bff78ae58cd86a
18e14aca7fa40b285008cc4adb01801074feee6b
36928 F20101208_AABSUV mutschlecner_t_Page_012.jpg
c46df84bbb9a8d7c03ed3ef0ca2987d0
fb39f4541798a16fccd25c3abf028a54c42a6c9f
115915 F20101208_AABTBE mutschlecner_t_Page_043.jp2
185ac4df45ee917bd11d84481918c44e
e6dae82e5bd3ed66f223c958df8eba4a50730a9e
136968 F20101208_AABTAQ mutschlecner_t_Page_025.jp2
40ce28745f89418f4d308a21473c6a65
a8a5e5e98f23bec2b994526c73d65d87c2c0c662
73107 F20101208_AABSVK mutschlecner_t_Page_032.jpg
9d02ebfc29060d3ddea9380d33abb9d0
9675d4b5fa1d36f6659bff7bae6b81a41c7869d2
7702 F20101208_AABTZB mutschlecner_t_Page_102thm.jpg
206c2d5719d6ddfca151db006adede22
4d47fff5604433faf4179eb194a1339bcbb1fefa
5275 F20101208_AABTYN mutschlecner_t_Page_091thm.jpg
cbca3a55667bfb73d452f3efd32f01d3
39586ce126976c46fc479bc487bebbdea5d8810e
6941 F20101208_AABTXY mutschlecner_t_Page_079thm.jpg
99dc7e7a8ee1015df188956118a9898a
ca25a824fdf21da329d398e610cd5fa7eeb9d429
73848 F20101208_AABSUW mutschlecner_t_Page_013.jpg
51a6bf9af5afc0288d64b2ca054ad3f4
0b5177633bcbca6b7c367d302c8c8841cde200ad
29451 F20101208_AABTBF mutschlecner_t_Page_044.jp2
717208ed2a1a79f035f42331c6d22dfe
5fe4440cb08ac28597746b56987c8d651c241fe7
114013 F20101208_AABTAR mutschlecner_t_Page_026.jp2
7fdcadc77f95bebd78499c7b910d6ec5
5c0c8c93761266921d61846ee41e9b0fbc1a130c
72717 F20101208_AABSVL mutschlecner_t_Page_034.jpg
fbb3511ad69488a7be670eb9afd56605
f2410bc9bef07e892b5c49b451b674eaa51e6425
6919 F20101208_AABTZC mutschlecner_t_Page_103.QC.jpg
3d1379c34b909b97dfc5326fb11f8d82
f0561c66d7d67516a27e693b4de68c7640982331
16230 F20101208_AABTYO mutschlecner_t_Page_092.QC.jpg
cd87cdc363fc85e58268a831ee5caf9e
7653a04bcc72f6c49011703b05946b296a2b3492
24344 F20101208_AABTXZ mutschlecner_t_Page_081.QC.jpg
6a263b66d4ecd8d614ce329681826b1c
7e4e414c5a9d94bbd5be106d9bbd92e36246e9b2
64710 F20101208_AABSUX mutschlecner_t_Page_014.jpg
5d8c55cc1398e6bf29b4abaa94de7d33
d8678658c1946412b68eda4eabced180f58d06ba
107486 F20101208_AABTBG mutschlecner_t_Page_045.jp2
bb76292e9fb70a060009fca1daa46845
5eca85b16c0466c3c5c27d10745617c3cd63f341
73473 F20101208_AABSWA mutschlecner_t_Page_051.jpg
4854c99b4f1166178c28146c03fd62db
6cd37e5f07e638df74d132b3c7ec3fb4ac3590c2
111311 F20101208_AABTAS mutschlecner_t_Page_027.jp2
0500b2e41230aed2f92aa032b37423d0
f278a5fe53b55d7c4ff7bb178d06662b06cc1995
71901 F20101208_AABSVM mutschlecner_t_Page_035.jpg
224016f05c2a21dc124483b062a4f5de
7bf35c27a7942f2dd00fabf52086f4f5d23c20c2
2321 F20101208_AABTZD mutschlecner_t_Page_103thm.jpg
6a8cc39f9cbce83ce766cb2f44dd8778
631f56ec4bccbe4b3c2628380d82e7b9697a776e
F20101208_AABTYP mutschlecner_t_Page_092thm.jpg
23f83265865f49514dc3df3dbbff0adb
982d93612d1fa5e0d98046eaf1f409acf0183b40
70241 F20101208_AABSUY mutschlecner_t_Page_015.jpg
9a7fcb964b9f1a35d5f447122ed1d52f
0e05f45e09bfdfd599ca3adfd26be9024d84cf74
105651 F20101208_AABTBH mutschlecner_t_Page_046.jp2
f97e34d98274220c9a82f6f268421226
2a4c8d51fbba905a3be5ae75b43dfa148ef27b78
63310 F20101208_AABSWB mutschlecner_t_Page_052.jpg
b1996aef33634500c406970023945cec
5ba1fcb53b41d18814e2f4ba5fc937ea620b0078
109922 F20101208_AABTAT mutschlecner_t_Page_028.jp2
668154f12caf9cc221f42204ff96088c
c0d1c05c776a24283483fcdc074138fb3e6920bd
74494 F20101208_AABSVN mutschlecner_t_Page_036.jpg
ca74399149d5b3960bd9a398af9a11b8
7bb6d6c4d55819f755abee66f008562b1d6a91b3
12275 F20101208_AABTZE mutschlecner_t_Page_104.QC.jpg
ce329240916786c429c562092f36e468
983136a0a0528ebe573bd5ba566251d62ef328c5
23189 F20101208_AABTYQ mutschlecner_t_Page_093.QC.jpg
27308f651269067da9a0466747c0b5f6
2011a7d17faba1ebe00bf989f9dd2ae01ead44e1
87717 F20101208_AABSUZ mutschlecner_t_Page_017.jpg
c7ee00b9beb21bff56b98fa9cd8666ff
1623cf33c2cc24236aa4ccdf1f5e77cbd42e459a
113613 F20101208_AABTBI mutschlecner_t_Page_047.jp2
f4ab7dbbff882848c1d8c6eedf6a3644
f4da4baedda0faf7e8366bd7e67d460026862f71
75128 F20101208_AABSWC mutschlecner_t_Page_053.jpg
f46d39577b7cda89acc02d7f56859a98
d322653639fa7767fcb118915da09f850c6113f7
112228 F20101208_AABTAU mutschlecner_t_Page_029.jp2
44c8b7156901ac953145b5c7115739cf
166d19c59b51bdf2222978693358dc6a5aee60b6
72149 F20101208_AABSVO mutschlecner_t_Page_037.jpg
b275578a181d4c2e6668dcb732850465
b231fc17f90b51b97056d0b01c0d0a3113a39399
3593 F20101208_AABTZF mutschlecner_t_Page_104thm.jpg
7ef82f600bbc10d52d5afb0d30ac755d
7a8d2e97299763f38055e4162beb2573d125575b
23197 F20101208_AABTYR mutschlecner_t_Page_094.QC.jpg
7f1f03cf98252339df77ff270b9cf1a9
bedcef590a7c0695b0ee9335fa746b1475769b4c
113405 F20101208_AABTBJ mutschlecner_t_Page_048.jp2
d02d65548e35c6bfb9233819d07c6f9e
b66bb14fe338463c2128458c0c84fe50968f5565
77091 F20101208_AABSWD mutschlecner_t_Page_054.jpg
a9916c6c8be3c6443eee02ecec8fc916
b0252f5dd81e657cc4ed701b7d5050da123b63f8
117862 F20101208_AABTAV mutschlecner_t_Page_030.jp2
d23674bde724d8e9e9fd7166c7817765
b39386dff351b521021ffc690e8447eb4a3946a3
74073 F20101208_AABSVP mutschlecner_t_Page_038.jpg
b671872149704dc641fd5637ea534dd7
f27a3b1569d2be1487fda8f5922c8a195508995c
14751 F20101208_AABTZG mutschlecner_t_Page_108.QC.jpg
9b9d99518ea82b02c3e1dbda90213441
097b63dcfe973381586bd027cd174dc35fd358ce
22621 F20101208_AABTYS mutschlecner_t_Page_096.QC.jpg
c1a7592ac77cb1dd6e0d97ea0ecdf42b
69a7df5e98471af3b2d2f553e51342a5053959de
110149 F20101208_AABTBK mutschlecner_t_Page_049.jp2
d189bc6e9a82f60e2db217ea1b0a4b54
51b1df286d66fc807716aa3eb8832b2396bce2ad
77233 F20101208_AABSWE mutschlecner_t_Page_055.jpg
704a80526c395d48604ae01ebbdb0eeb
2857f5ab6e9c8ed31c6fcb8c6c793889eb7c802c
103133 F20101208_AABTAW mutschlecner_t_Page_031.jp2
542524d70e8796988f7da474159860d4
495b1a97368b2d457fca86247e8b4864fcd8bb83
73346 F20101208_AABSVQ mutschlecner_t_Page_039.jpg
1dd2ad4abfc6459399edf2a336135916
4419b770f8320345ec1d1c51135dd3e6fe1e8062
4223 F20101208_AABTZH mutschlecner_t_Page_108thm.jpg
a20c0de847737348c8be8f7b805ab04b
73d2e50fb954bf8f27728b9a81e78c8383043499
77490 F20101208_AABTBL mutschlecner_t_Page_050.jp2
cf1910da8c40a7fe6bc6f4fd47cc9ad0
557cea32ee7ef029a210632098e3c41619e78080
68260 F20101208_AABSWF mutschlecner_t_Page_056.jpg
eb047480a40b4944a766cec7a5ae5eb2
8fa8cf717981c93e1fecf76fac6910ead106dc04
111305 F20101208_AABTAX mutschlecner_t_Page_032.jp2
b7077472773ef57dde2f5395b5615df8
0b7e9e066913169a0148ce6a2eeb6e8487fe9fc3
15432 F20101208_AABTZI mutschlecner_t_Page_109.QC.jpg
510fdafdaaeae8bf173fb7f4e18552a0
ead261da33a710cd56a64c42cc4ff836298d616b
6643 F20101208_AABTYT mutschlecner_t_Page_097thm.jpg
9dd57f83c474fb9afdf98032972a4508
14f1bfff78f51406db8dd8d69ad3980b90637627
109901 F20101208_AABTBM mutschlecner_t_Page_051.jp2
f1b79704bac7730c39632b814a08bb45
7fd7d145319fa7ba6ef69d075861945e082a1ddb
69650 F20101208_AABSWG mutschlecner_t_Page_058.jpg
5dc3ea7d82d0beebabbcc504801346ea
db60093116d462a90b08d114f99065ed41f4f3f3
112717 F20101208_AABTAY mutschlecner_t_Page_033.jp2
733c9647d1f66685f467d1dd80d60f9b
26cea0db91f593c7c1c79f0f1c1c58b3187b013c
73530 F20101208_AABSVR mutschlecner_t_Page_040.jpg
f4a27dd56d32bdbb07a35ab36bbfa3b5
8fcef2b59fe84a56287e1aa36836389979ff9b51
25047 F20101208_AABTCA mutschlecner_t_Page_071.jp2
2d16aa8f1526154261d95238fc87fcdc
fb67bcc7db6466b99eb52c8897e1027f5ed4a9df
4654 F20101208_AABTZJ mutschlecner_t_Page_109thm.jpg
77057b2bb1524c4e203be89bece5a919
894312bfd4913387280f11a3cec4ca3c33a8df99
6810 F20101208_AABTYU mutschlecner_t_Page_098thm.jpg
f5de0f324f82695605659d910a677d4b
ded9b864e99a7d29b21959cfeada55272406eb0e
94704 F20101208_AABTBN mutschlecner_t_Page_052.jp2
a318efd9c9541c705e2b884ff60eb76a
c9e8e9966b30140e331089f141c536c86b2e876f
66442 F20101208_AABSWH mutschlecner_t_Page_059.jpg
2e4f12d804bc01a8b4993aafa851962b
d4860a02bc47c691f50f6337c62e55f8d6cd561f
110999 F20101208_AABTAZ mutschlecner_t_Page_034.jp2
1368b86b3902d92c9cd6ee805834f92a
28be22fecd48559138a7208a8ea56f03a77fe8fd
71363 F20101208_AABSVS mutschlecner_t_Page_041.jpg
80e03eeac27726403d7ce5b643286076
28c5c15afafdc6f506f3fe6bfe55139ee9113f80
30700 F20101208_AABTCB mutschlecner_t_Page_072.jp2
f049a7c6eab0fb668b2047b126cd36eb
e07e3483745cd2ddf475f11ffa6738cdfe6a7e2d
12776 F20101208_AABTZK mutschlecner_t_Page_110.QC.jpg
3fffaaa31ed211a971f076b49cbf091b
77b7534d351ab335fc467e861b7023b1a89a526d
8261 F20101208_AABTYV mutschlecner_t_Page_099.QC.jpg
620ede6d14478a0ec1516c01ace8e06f
2065441dcbfd04d7892ac3fb732f86f420354741
112800 F20101208_AABTBO mutschlecner_t_Page_053.jp2
6281f0864dc84d5081a05a68e21ba076
216a37190b3a46fc9c411fb851486004e1ee40f3
76533 F20101208_AABSWI mutschlecner_t_Page_060.jpg
5b3c0c5129cf9acce6a3dbf411b7caf3
0662788e2ce1681349c773203640f477fc978611
72641 F20101208_AABSVT mutschlecner_t_Page_042.jpg
5b5e934dc28e1f9bc8964fbc8bde00e0
8e9e2ca20f4e262566b0859635f902bd15e959da
32799 F20101208_AABTCC mutschlecner_t_Page_073.jp2
9027d6f7118f177394f81e5a0cb8b823
08006c96d581659db2a000a2197c0cc68b10dfa6
3666 F20101208_AABTZL mutschlecner_t_Page_110thm.jpg
96821ce59a39f2ba72647432dfd951f4
e50f79be4e6aab61be364b6589a4db04b9a5bdfc
2540 F20101208_AABTYW mutschlecner_t_Page_099thm.jpg
fbf1bacc16fad5444506bd7cc0f3ed31
686980208fb50eaf2717fe97ea83804ceac48fb0
117013 F20101208_AABTBP mutschlecner_t_Page_054.jp2
ad45db666e704a7582d3c68a5bd15793
0a620a19e71b8c811a05385893e12cd1c3a24aa5
52728 F20101208_AABSWJ mutschlecner_t_Page_061.jpg
d28d1cddcdf6529a5e054f6319ea1944
357cbb4fc654a39e77a278ddf2638dde040d4298
76847 F20101208_AABSVU mutschlecner_t_Page_043.jpg
7731ac36ff09f1b9b0a8ed7cad6d7197
89f764f5186a8bf2c7a728afe61ea61650eeb8f5
106313 F20101208_AABTCD mutschlecner_t_Page_075.jp2
29f904156c37925effa4192100246d9b
1c48242dfffb1040646e4911863c40dc73cd00a1
10482 F20101208_AABTZM mutschlecner_t_Page_111.QC.jpg
660fa33381298e1a19ad34f57d0f2f4a
52ef6d290c377f9a8105c06c9ece90c3f9e80613
18424 F20101208_AABTYX mutschlecner_t_Page_100.QC.jpg
8704646a2ca629f3c59125b2aa9276ba
493c6dd8df6f89cbd17656ff46a5c6f28493e542
100894 F20101208_AABTBQ mutschlecner_t_Page_057.jp2
7277451aac91e3472a15b65ad4e6b922
df8c52912c823f03a5f8cd0ffa57c5ff8c66f17f
37061 F20101208_AABSWK mutschlecner_t_Page_062.jpg
1cfe3b98523d3f36d83ca9f140a92ce7
4b9135eb551ec5c040686ec946b6b3ff89bcca9a
71206 F20101208_AABSVV mutschlecner_t_Page_045.jpg
b15bc3fb82431a58243d8fde7b735357
0da6e2287c79642756e15807051aaa7684328277
116191 F20101208_AABTCE mutschlecner_t_Page_077.jp2
7e28d9eb97d3420ae5f3f27f3000b0b9
3ba2f09a75bcd697d7d6d7fade8722db7ca61a57
3623 F20101208_AABTZN mutschlecner_t_Page_111thm.jpg
95c991dcafba34654fddcd9d7bd77c4c
e1370cae6cddddb11105ce05913e4526e4d27f87
17899 F20101208_AABTYY mutschlecner_t_Page_101.QC.jpg
d8a69229b8f0f6e7c72a01b98f1cbe3e
d3fd116206993284c2631d1c006a8bf4f2719c89
115742 F20101208_AABTBR mutschlecner_t_Page_060.jp2
2d3567c38fe78857878a4df03c395cbc
ed995b9dcc5e49eb9eb3d311972e8541bc424d78
20989 F20101208_AABSWL mutschlecner_t_Page_063.jpg
1f557448daae7c171d59a2aea60f0708
bbb2eada9042472839074c503ef16127b033e276
71114 F20101208_AABSVW mutschlecner_t_Page_046.jpg
d9a156ea7c5e7959096c5b1a86c2448f
55ae3d1e6d9bf357235cb34de1388f9c8e72e5f6
109821 F20101208_AABTCF mutschlecner_t_Page_078.jp2
d72c6b253dd1c9e52fb1c1c1e327ab93
ba3e7a6cb5dda4f693a68c4a75363bdccefdc92d
3400 F20101208_AABTZO mutschlecner_t_Page_112thm.jpg
bab12cd9cd9cbbb7ebd63f0f4218dae8
d7b532db1423c441350d9052aef5bcb76be54138
4876 F20101208_AABTYZ mutschlecner_t_Page_101thm.jpg
11d3123306e8ccd629bf85e1b2207547
2d6128101a42b5a91103ada03af18d7a2ef06578
79573 F20101208_AABTBS mutschlecner_t_Page_061.jp2
6bc6b501148967b7f123c89fa6372a45
642de9ba37586bca0c6ec11da63316d6ad216f88
23470 F20101208_AABSWM mutschlecner_t_Page_064.jpg
ef1a3dc058843147d7c0fa3cde65a5f1
620039221dd0139553f94bf7fe684e7fbec8dc0f
75167 F20101208_AABSVX mutschlecner_t_Page_047.jpg
dfed816ff6f1a5af7593e35da3fa35e3
2139233bb8c75a5b8017f19b69e2d4bb718a7605
117720 F20101208_AABTCG mutschlecner_t_Page_079.jp2
3abbf670aeb8c2d93c6bd87bc73f79d4
81456fdee8c922d8ae9eb461e8d32f3192511882
72955 F20101208_AABSXA mutschlecner_t_Page_080.jpg
d01f76ef9f5b5c64b49eaa744a88b458
fde1de87bd4852caddede2119fccda582b21f9ae
14068 F20101208_AABTZP mutschlecner_t_Page_113.QC.jpg
c4c8c9bcda495614bdc4179fb19164a0
1c76b96b77d0c3d66b4da99108c28e9a2b8e2a87
51804 F20101208_AABTBT mutschlecner_t_Page_062.jp2
9a6c8a0233d96208b6094af1a54582ee
f066b070ad85764277d0892f64f2e3de4a917f13
35261 F20101208_AABSWN mutschlecner_t_Page_065.jpg
0a96e7ecf1f1e8330cbcd21db01b6075
f554b3d878ea2321ca4f7296fe23af384b857963
72969 F20101208_AABSVY mutschlecner_t_Page_049.jpg
58d7a2c953f763a82c0cc94dcd998c8a
0c58cfd4bf7bb2115ea14030af9cced67b93265f
111439 F20101208_AABTCH mutschlecner_t_Page_081.jp2
c89cc2a811a1a9b9205a9e836fd75af0
ea8ef23b30f0a88ac5a87b56bc4fb18f58dfcdd8
73998 F20101208_AABSXB mutschlecner_t_Page_081.jpg
ab07792d696c2651a5d0a116d708ddef
4170cbdf09bba7cc47c567978c4d8a94bb0146e2
20354 F20101208_AABTZQ mutschlecner_t_Page_114.QC.jpg
d069636aaefeeb4a53606c922d16fa79
9e6688936e4bb0d1b002590bfaab43e153a74cfc
20479 F20101208_AABTBU mutschlecner_t_Page_063.jp2
69b5d22f643ba637a2eca35597cccf58
78f4fdb0618758e9e9e0344ee8d69b6e4c172f4e
32166 F20101208_AABSWO mutschlecner_t_Page_066.jpg
2bd2ec4bab1abae2b558673380c98d79
00a9d6a7740f7580370af3da18a46498a6f17c9f
52061 F20101208_AABSVZ mutschlecner_t_Page_050.jpg
44b12500ca04e37de59b5b2ceca8347c
58633f1ed39c3d9f96572813b646b9462e5b0fe7
118758 F20101208_AABTCI mutschlecner_t_Page_082.jp2
1e1bff146c94565d36324bbd79dea59e
8920e6911234011fe47a7d160be662512868f211
76159 F20101208_AABSXC mutschlecner_t_Page_082.jpg
b9395c1cebc649cdf98d0e8703c77363
a7c1e38f5973960bbda20414f0f7a165a468a07f
5536 F20101208_AABTZR mutschlecner_t_Page_114thm.jpg
8c480ad9f0e1241f260a7b74f1a9e8c5
3770bff85164d1ab7a612aab72eaf1cb824851ad
25945 F20101208_AABTBV mutschlecner_t_Page_064.jp2
99e336080b262aaa4ef805b7e4aff145
f4e75cdfcecb3ce83b977cc608336f4f40f07d8a
29143 F20101208_AABSWP mutschlecner_t_Page_067.jpg
a5d43ac042f1f32c4b924c97e9d3b047
0add3ecd6a61d6fb8e5fd24a4daecb6cf4b06d08
115827 F20101208_AABTCJ mutschlecner_t_Page_083.jp2
d430cf063fa5f8ec7a19fa8c8524188f
845282ca954bb072408bbead79f7caeebf34a31a
76371 F20101208_AABSXD mutschlecner_t_Page_083.jpg
3f5490c1addc82b3723606479ff9d27c
e09d352b27e4d225f0a6d040e9e50562cad6d5e9
8615 F20101208_AABTZS mutschlecner_t_Page_115.QC.jpg
0d73cd24ffba642a2406438f33f5dc68
6ca7129e195d733beb493300f63e7b927858aef4
42583 F20101208_AABTBW mutschlecner_t_Page_066.jp2
0306bb56094c9b1f36dc43916a5140a0
2a8a0a14a971dde0ec02e3200fcaa994e5895fe8
25294 F20101208_AABSWQ mutschlecner_t_Page_069.jpg
cc4b247a0ca937f82571e7788e41acb4
0aea95dac858e4675e27fe8379dd8087ba1c5886
109463 F20101208_AABTCK mutschlecner_t_Page_084.jp2
8187e2b7dc21b089961ea1a3ce944fd6
048be86b132ae1cc8ec210e7fc1e2a1997fd9d95
71882 F20101208_AABSXE mutschlecner_t_Page_084.jpg
9a244094ba96b45765e64393923761f6
3d92d1807b4e7b6cef878af85ab88386341f06bf
3089 F20101208_AABTZT mutschlecner_t_Page_115thm.jpg
5667f711307d0955bf7bb460e86f2db5
ac8d750cb7d422e480123d3ea67ecd213e057f57
36632 F20101208_AABTBX mutschlecner_t_Page_067.jp2
e1d6efc78042bf31bf8859be7edd1025
fd4e3890d3d412429e7e2ffe216a741064e31ee4
25149 F20101208_AABSWR mutschlecner_t_Page_070.jpg
98f30be49ded876efcabd5b7eb205e86
16d04282bb197dcf416d22db18f023456618f7e6
105855 F20101208_AABTCL mutschlecner_t_Page_085.jp2
afef8a5270fc3cd848f95abcfbdf4fdb
5ccce5db3802bb49eafffee802d1aed2ff1da4a2
68990 F20101208_AABSXF mutschlecner_t_Page_085.jpg
9cf26929c5b80c541050f5c94ff24561
19f0d522572f021b26b25ecb6f9038b82200c4b7
28966 F20101208_AABTBY mutschlecner_t_Page_068.jp2
d911e2e892f3288ea1fc2d0d36b8f2ee
ee046e5848e4fdb9e2297aa24af6fdf8a76d1e8d
832552 F20101208_AABTDA mutschlecner_t_Page_107.jp2
c9ccd6c393c537b4fe1be400b209e037
dfa9003324580759097c2b4ccfe3f9a11a1d19af
117895 F20101208_AABTCM mutschlecner_t_Page_086.jp2
67fe9d82a70d2044880b55ed73bd01b1
71fe078c98bc44e432f60a1bda1038b59a6bacc6
77959 F20101208_AABSXG mutschlecner_t_Page_086.jpg
3d210ab096eb0c13b96839d85de14f00
a3e9943c46f3a87413648662edd9cd7adfdbd59b
12075 F20101208_AABTZU mutschlecner_t_Page_116.QC.jpg
2ecfd1eb3eaeecbbf9583368f903efa3
bd897418c0816b510076b3d229500ec82468ff7c
23973 F20101208_AABSWS mutschlecner_t_Page_071.jpg
38850d7f15f7e2038f77b1e9d9e13e2a
2f47c8391aa09e52211d0384df0cb55a989b7382
653474 F20101208_AABTDB mutschlecner_t_Page_108.jp2
9f86e777f5052ec07520191b69431153
2608b92f36e6eb6492d6952829597d42160d3831
113737 F20101208_AABTCN mutschlecner_t_Page_087.jp2
8769e284b4478eb834ae414b7a622950
770357afc70b63f5900548d4477ce873a6c22fb2
75025 F20101208_AABSXH mutschlecner_t_Page_087.jpg
6f53e9cde24db0412e79f17a3fbd49fa
6c8daf1f1d180546c6aacf1fba2155e2d513bd54
26836 F20101208_AABTBZ mutschlecner_t_Page_070.jp2
2bbf8abef17d6a460672693095724ef1
53133ef4f54418f8441f89c6e251260aac1b416f
3676 F20101208_AABTZV mutschlecner_t_Page_116thm.jpg
4f214e4542718271fba272260d6c07c4
21affeab4183c35ef57697ba2338ac74ff841215
26317 F20101208_AABSWT mutschlecner_t_Page_072.jpg
48590783dc0cf63753310284a5d5ce7d
558b74386fd23526ae230a75feed2d9a19878925
629682 F20101208_AABTDC mutschlecner_t_Page_109.jp2
ab9f064417e7085a5878519ea4892851
894f5ae6101afc23371082a41af0005bf2120e21
119394 F20101208_AABTCO mutschlecner_t_Page_089.jp2
d9bce4b8c628b9f2fb3ffb0c6aa62602
7fc7ecb1c36ae8feab893df30425df0350337767
84934 F20101208_AABSXI mutschlecner_t_Page_088.jpg
33aab6f2dbb050f0e7fbe3b22a3a2a05
35a3aa6ce546bd10c225eb3aae4eca22e2854f1b
14796 F20101208_AABTZW mutschlecner_t_Page_117.QC.jpg
244eddc38a814b0e9834f4ed1db0b290
e40ea7d6ce44b967297215202d84d0364b0c15e7
28093 F20101208_AABSWU mutschlecner_t_Page_073.jpg
902729b9ad613ca2d5dc09fbb66e619b
48d59e7431d07766806181d369a9f93e8fff6fa3
533547 F20101208_AABTDD mutschlecner_t_Page_110.jp2
2a4b052f89d51d00f160876cbca66f91
4d7afa8c307fc06803eaab32efa83deebcfd31c2
115392 F20101208_AABTCP mutschlecner_t_Page_090.jp2
4fab8d8897caa0c15e8f4cdb92f1e2b4
0c6bc62327a4c70496548cee42f305d67a477b3a
56287 F20101208_AABSXJ mutschlecner_t_Page_091.jpg
1b914c9f1f3303e018f7a7d315686d21
0a44c586e6e24099c4905ca1fc3621a652b9082d
4303 F20101208_AABTZX mutschlecner_t_Page_117thm.jpg
f866c393e7fd2587301358b0d908866f
d11b1f601e5909dd0a773173bf9586f740374d97
71456 F20101208_AABSWV mutschlecner_t_Page_075.jpg
2b0b384cd7a226591d5d3d0f69bba7e4
46cf3018ad810376e41c66efc2c934c3a963477a
432494 F20101208_AABTDE mutschlecner_t_Page_111.jp2
b1a965c9b19f2d1d59a898d9dbb5c0be
7bb230eaa7db170e4105cbc05c80f73807695d6e
84211 F20101208_AABTCQ mutschlecner_t_Page_091.jp2
4380ac685b1c70a9719eb042b4ebc92c
0d3509cdd788c45d3a3a873a84c8646fbd51d6a4
48712 F20101208_AABSXK mutschlecner_t_Page_092.jpg
183813e7e0c206a32eca74a090a7f5eb
7c70bd6e74521e978fe3a1fffcd6db221a70d7f2
11443 F20101208_AABTZY mutschlecner_t_Page_118.QC.jpg
0872b269ceb789f40615f8cf8e618a9f
910cdaef0abaca0a0b96588e61f5d47a24cd9b0c
75828 F20101208_AABSWW mutschlecner_t_Page_076.jpg
4606f2354a6b6b4c24d12ba223ab65b8
441fd436e55340fc5f8b3efa9d30d669bcfa9198
33177 F20101208_AABTDF mutschlecner_t_Page_112.jp2
19a79440db7b410569914f1138578185
87adc16493ffdbf9b5ef0e79d2a7fb2faf35dadc
72970 F20101208_AABTCR mutschlecner_t_Page_092.jp2
5c03feb92e7afbee9a46211c4d9ff63b
b08df28a940e80af125fa2f745df2c32270bc350
70833 F20101208_AABSXL mutschlecner_t_Page_093.jpg
576d986332ce431c7b9c23a3e4a1bd0b
5b2f7df8932a05fbb5b83d19020b8c7f061e5fcb
3661 F20101208_AABTZZ mutschlecner_t_Page_118thm.jpg
9a77d750584216e64ca1522474706b5b
1bb0caa114f10b82b1298be3c4dfa93e190d4aa8
76312 F20101208_AABSWX mutschlecner_t_Page_077.jpg
376ba754b032a8475dfb146c5f05185b
ae28a3d566cf92e408672b1b055e52829dc400d6
1051970 F20101208_AABTDG mutschlecner_t_Page_114.jp2
378429a97317579c38b6a03f67a965b8
e068c7c863e11e9acd63a37eccebfe3ecd098046
41884 F20101208_AABSYA mutschlecner_t_Page_109.jpg
0d21a081967d9e07a9124877d50f575d
f12b53deda4aec15ddf9d85ec29bdab774e1603a
106989 F20101208_AABTCS mutschlecner_t_Page_093.jp2
ec17324933e33966899ac04e5331e1fe
4f0860638f53012c91579a9ca71080b7485d91e1
73549 F20101208_AABSXM mutschlecner_t_Page_094.jpg
3f27f5640689784dd456c5394f50875a
efd55134791f6cb60eaf9f4f639d1542ed2edf9c
71406 F20101208_AABSWY mutschlecner_t_Page_078.jpg
ad358f9b751bec1c9b9062efea77d703
326bde9b17dcf00449735323d3b0099a4d272cd5
432401 F20101208_AABTDH mutschlecner_t_Page_115.jp2
e8f8683544f3a3111156cfa21ae1421c
7fb9eaab053cc5e26fc74f06cdfbf33a4d1db79d
35217 F20101208_AABSYB mutschlecner_t_Page_110.jpg
a738c7ce0e371740f069f950f3bf8916
2cbb3c304ee60a1091849537213d0622a116bb83
117661 F20101208_AABTCT mutschlecner_t_Page_098.jp2
7c56d04978ced15832791b3b8f4b7e3b
625fc55d6d74229d0f8fe4ac430da19482b81a28
75463 F20101208_AABSXN mutschlecner_t_Page_095.jpg
6abdaee5b69c6748842282bbb5c635f1
d05a415a036b1bcae5d2eec39fad6dd52b57475a
76977 F20101208_AABSWZ mutschlecner_t_Page_079.jpg
449f14d2779df66d5bbb496495db0276
d45e2842e5748a65041d3964be8e49cecfd02007
661433 F20101208_AABTDI mutschlecner_t_Page_116.jp2
d13841b26ad430ab8679d3ce998d30ca
072fbc018ac2e0b19020da5bbcfa17ce0ed81554
28340 F20101208_AABSYC mutschlecner_t_Page_111.jpg
8da265e738dd5936d383ed8f5fcd84da
3f081c59c4392c98252cacae618d1eee2f00a00c
31395 F20101208_AABTCU mutschlecner_t_Page_099.jp2
85d6f01798793ea0ba516f7a2af375e0
9070a14a2c041559a50121ba641fd5117cd55f8b
68379 F20101208_AABSXO mutschlecner_t_Page_096.jpg
ffa3491a84e0eb196c2275c328bf9e24
657baff8a6f80c43fe6bfa457ccf7449e5ef0493
803386 F20101208_AABTDJ mutschlecner_t_Page_117.jp2
1973820fec80be6893b745cbba4c9bda
d3cf7a0e0619ac11519ec734766ea0b8eb9c45fb
29475 F20101208_AABSYD mutschlecner_t_Page_112.jpg
1185175ada4843b57163c6fc2dc5cfdb
5aee089be3f8b582f108c2e0160f5443b3a2b724
56727 F20101208_AABTCV mutschlecner_t_Page_100.jp2
0af50aa46b53570402e82e571b944d60
e78e7c695803ba31f073036591ed3f44618148d6
71067 F20101208_AABSXP mutschlecner_t_Page_097.jpg
cda0642d90fe07249c00ce2da05330c1
4be21cb7253302ed5c800e52e14b7640f88888a3
394777 F20101208_AABTDK mutschlecner_t_Page_119.jp2
b014df4a446f8cd994bcac762fdef209
7eaa39f39c4453da34de6f6b9e4d319c9f8854d4
61588 F20101208_AABSYE mutschlecner_t_Page_114.jpg
4ce976ec717954f89bad2739744c7fa1
4040173bbf17fb061bcff137d24833022a403bb8
51534 F20101208_AABTCW mutschlecner_t_Page_101.jp2
92136fc3793e6d079a432289889cfaf8
b40d75d5dad091bf8f3be4f179a9c2f36a1c176b
24625 F20101208_AABSXQ mutschlecner_t_Page_099.jpg
cce4b7675ff4e62495857140ed3e587b
cfbfe5ac61a390e7b7ad922c35adb8fce9adebf1
774470 F20101208_AABTDL mutschlecner_t_Page_120.jp2
644df5f1acd941fdaf9b8137a5e77860
de33f2adf07b2b0e70f675eeb9174263a9d557e6
28013 F20101208_AABSYF mutschlecner_t_Page_115.jpg
fe0bc0e87b2292812c647dccf3ed9e33
20f36f0743502cb61ac0e69b6c6ff08ef475bf29
1051929 F20101208_AABTCX mutschlecner_t_Page_102.jp2
2b85ad4b0d1943f6b40f92d43bd60fb8
c24a8698e96d199508a8848739a0b5dae5eb03d4
54456 F20101208_AABSXR mutschlecner_t_Page_100.jpg
eb4ce7eb09a64cd9bf9ba72163fcf930
0b32bf9825b1390da0e5df0ead2db92a135f747b
797718 F20101208_AABTDM mutschlecner_t_Page_121.jp2
470c09f03028aa0eac7056fc381a6134
13ba58d1cebd4793483ad6b8419835d8103c14d6
39734 F20101208_AABSYG mutschlecner_t_Page_116.jpg
6b0dc0c2423299383ab7560acfcba7af
c2bb0aa7eed7f04cfa3df0bbf9781b3399a7755d
465911 F20101208_AABTCY mutschlecner_t_Page_104.jp2
5f3b89fbbfb8ab387a2457b73f80bdd2
d7a955af29521af97f22b84ac9a7dd6b550f7b2a
54575 F20101208_AABSXS mutschlecner_t_Page_101.jpg
3733e48152be715c8f0ab6006141902d
3f665a87be3b2efe627e40c965245ace4780755a
64763 F20101208_AABTEA mutschlecner_t_Page_144.jp2
219820fbbe54ece5beb27e78bf748657
6c5789150554cf0ba72642c11b83722aafd5bc2d
1051972 F20101208_AABTDN mutschlecner_t_Page_123.jp2
ac3050459872bca3b28bd7294e9fba78
47085772cb3a8af5273ab3544182a03ef373f50b
45905 F20101208_AABSYH mutschlecner_t_Page_117.jpg
e83311e06e2f20627b60158d265077de
4ea20961a7c3e8bb49efbca54e0fdcc1128fa4fa
606594 F20101208_AABTCZ mutschlecner_t_Page_106.jp2
5caa283232d8e2b87f3f1b26ed673f82
1589c720a0c8d893780f1ac5d2a15dd59e6985c0
97518 F20101208_AABTEB mutschlecner_t_Page_145.jp2
b08cf524ff060e0c1c5119925e3fdbc5
7b416c653212f187327f23e9c6f809284060e3b6
925950 F20101208_AABTDO mutschlecner_t_Page_125.jp2
9be739aa737b2885007152eba039639e
e7b338d4a38a2c5b64f8c992e4819848d1489107
36901 F20101208_AABSYI mutschlecner_t_Page_118.jpg
2eb27e2fc6b35796bb082892ac5eccc6
7847f6ae538a4218464345d73110c54ddd9a8961
108846 F20101208_AABSXT mutschlecner_t_Page_102.jpg
b900021bf52dca0dfc99ef049359a84f
35911ba35723297e4c7c5282e6d6b08b0815819e
73753 F20101208_AABTEC mutschlecner_t_Page_148.jp2
ebea4f8e080033247c491de3d6ee4714
55877bf92dfcbbe53b124a56dcb46b61956df1db
831595 F20101208_AABTDP mutschlecner_t_Page_127.jp2
499d7c38356653a6b35856af63bc1891
f538677ae68af1a3d7bb69d35b193a609ef8db1a
25187 F20101208_AABSYJ mutschlecner_t_Page_119.jpg
8654fa98c8e024cd3d50875f6a306e97
a8e77e9cd613aae8e502d78d58e56b913c557299
20272 F20101208_AABSXU mutschlecner_t_Page_103.jpg
0ca07490db772330acbdf417b1063179
812b8dd4ce377ef3609dc5a4332474b517b5f6f5
63723 F20101208_AABTED mutschlecner_t_Page_150.jp2
89bf1258d7221e31c84a9c8d73cdb63f
b37227264a53bfe7fb5863a591fcdedb4bf20b29
762488 F20101208_AABTDQ mutschlecner_t_Page_128.jp2
2604c7278dd642027072dde3cdd75a90
00fbce51bcbaf97e6c3f40c080886f658446dd73
44352 F20101208_AABSYK mutschlecner_t_Page_120.jpg
888bc9049af65a75a6ed7e7fb792608f
fdaced61fae0de12a6606ff9ba91f88619e5f6a3
33263 F20101208_AABSXV mutschlecner_t_Page_104.jpg
2da25dc00a8aea763c89c6c2bf147440
3873c6f9bcb8eb3bbd429e06a51138a160a33e2d
69248 F20101208_AABTEE mutschlecner_t_Page_151.jp2
76ad04441a2d2ec8ae5c7fecd5b973ef
1168d9e6a02abadc97387788213242c4905ef39a
826430 F20101208_AABTDR mutschlecner_t_Page_131.jp2
0b1aa2529315bd71f2e67441340b7511
dbace2c9c7f255f416a12c6bc0484dcb63a53496
45169 F20101208_AABSYL mutschlecner_t_Page_121.jpg
1688d2f3614275ebf2e330b6e00c0253
5b09045071db64dc86102fbd484a351b541cbd15
49451 F20101208_AABSXW mutschlecner_t_Page_105.jpg
213ae9a40294ba584b8a84a5996d0609
da9cbeaad1560f48f214d6cb26941e14d5a24a81
51126 F20101208_AABTEF mutschlecner_t_Page_152.jp2
370edc9c81d368e7f07e85c8bc5bf304
53cdaf6f7d5102ab08fcffb60913536b542433e8
51449 F20101208_AABSZA mutschlecner_t_Page_138.jpg
eb0c59da4a35938fb01b1fa7d689769a
9f57ecf48bdaa32fbb901d04c5110081cacb0f53
368657 F20101208_AABTDS mutschlecner_t_Page_132.jp2
18dba6b878342477fda7321d68c1fadf
695eebab5534926472aeee61aa3926874815d48b
48540 F20101208_AABSYM mutschlecner_t_Page_122.jpg
1634c01df69324f22680e4c8c0f7b973
87a7e8d46e0c5c20953af688cd8f4da458238781
41498 F20101208_AABSXX mutschlecner_t_Page_106.jpg
23fbf943ee41697c9e19d0113ea1f7c9
801972e4b50c57d23e741bde32682c5983e6af22
12577 F20101208_AABTEG mutschlecner_t_Page_153.jp2
a49407f1f79d90e630511e4b061421fd
d9fed6d75029200b06dc7ecaee43b2e417309873
54661 F20101208_AABSZB mutschlecner_t_Page_139.jpg
52c91c303f2182349f19749b2aa19e3f
94150fe4ddb8d1ad713b5e26987cbd98d25f17fc
686134 F20101208_AABTDT mutschlecner_t_Page_133.jp2
9bea5fe4d0df55879720578a85a3c191
be1438d1c5ed9ab26761faca24bf7eda1f6ec782
62967 F20101208_AABSYN mutschlecner_t_Page_123.jpg
b5a3d4e9fa4ecac4565c3a507d023726
da85968f8cc966bb28c96ad7fb25a2abf95b389c
52550 F20101208_AABSXY mutschlecner_t_Page_107.jpg
b7be3262071604cfbfaa35de37ebdfae
7e29ea942944f29adf960a765947e529cda2cb7a
54081 F20101208_AABTEH mutschlecner_t_Page_154.jp2
30668c18217ffecb8b25f30b98d96e33
05427725542ce4179722ccefe0e3f2f2ac12d3a1
60773 F20101208_AABSZC mutschlecner_t_Page_140.jpg
bf105cf2f7843ec7023730b7857b2f24
d0a6e18530efbdbc7a8fba12e44b57f5a2c3a457
697644 F20101208_AABTDU mutschlecner_t_Page_135.jp2
796f80a32ef8f01cc9f8b0f998278ab8
deda141b404b932b20596502a7e925ea8ec8d3d0
51795 F20101208_AABSYO mutschlecner_t_Page_125.jpg
386fdda289f652a9663b47be9b392f1d
9f2528e996a6c17af53e9e003f97c023e191418b
42720 F20101208_AABSXZ mutschlecner_t_Page_108.jpg
32531ff0b6acc1be9519bec943a9d6f2
5aad7c844f3d6e83260ba63c52878bf153493600
34631 F20101208_AABTEI mutschlecner_t_Page_155.jp2
3faedf876f8c8d011a7805484d847490
5ef6221ab7a2e2319a6aa53c2754ca3d239b1ea6
56583 F20101208_AABSZD mutschlecner_t_Page_141.jpg
9e2aa28aca855546a9d39230e7c9cff5
2044ff01c4c2e05deb86ffcb7a3769e1876b302c
767782 F20101208_AABTDV mutschlecner_t_Page_137.jp2
5510ff3f0a04ad0bbca09d079e8a3b81
1a972c094cde0ac1f3ffbe2d08d0e2ba4c3f8119
36807 F20101208_AABSYP mutschlecner_t_Page_126.jpg
bbb3674cc4d4954282e5f341c8e17797
549faf87a7521e66bbe047a9aff2ed705f9e1ab3
47324 F20101208_AABTEJ mutschlecner_t_Page_156.jp2
c9b82e065ec6fc03b60214352e79cb63
05428cce6ad68cf42a2e13951ee29ec2ba324d5d
55509 F20101208_AABSZE mutschlecner_t_Page_142.jpg
f9334c1eba479e468b1a38c26d98a072
05b2da39fd9a43102cebacf17e67968238dc45fd
903818 F20101208_AABTDW mutschlecner_t_Page_138.jp2
ae68eb067535e37a6054456647b88bd7
01a6e8c23f75346492c31f2a962b0e685f0d8c78
43674 F20101208_AABSYQ mutschlecner_t_Page_128.jpg
35ef3b080cd05d24d5837a4d737541b5
45de34c2ffcaf5d134d3744928632b1cf1b88b75
42234 F20101208_AABTEK mutschlecner_t_Page_157.jp2
87f909659dae549c1c4572bdcd6d8720
d6bb78d2f30d1587116b166f3cb37368be1f055c
24734 F20101208_AABSZF mutschlecner_t_Page_143.jpg
196ec2588eaead791e5b42fce7ce9907
7826d41f6789ae0b6934ed5f0de4f71ad708ea93
1051949 F20101208_AABTDX mutschlecner_t_Page_140.jp2
d9f80c911dd21b4a3fc580326a687f5c
6c3de0f1dfff5e09b9c97b57d476b66ed3f0316a
42983 F20101208_AABSYR mutschlecner_t_Page_129.jpg
eb8d732cd6427dda94cbf619f993d303
f74377c4b860be712a51d3c0b55ec0accb66f3c0
43139 F20101208_AABTEL mutschlecner_t_Page_158.jp2
061b41c5c52bcf5b378119d44ead9eb8
80db0fdbc8c359fc9c4ced0a4bcf92f736bb5f5d
49822 F20101208_AABSZG mutschlecner_t_Page_144.jpg
c5899b3d4a472beab6b2d77682f66581
1fd4c7c648d102e3087808658653e98e6d80a172
1012589 F20101208_AABTDY mutschlecner_t_Page_141.jp2
77041e0dda5a891bd1825cb09aa8422c
0fc745866c6a093d221853fc9018bf4f648b73e6
68370 F20101208_AABSYS mutschlecner_t_Page_130.jpg
b198674c940c36805027f5ce445adedf
aba886a745f354ab8c966e0eccdccee8bfdc1dff
F20101208_AABTFA mutschlecner_t_Page_016.tif
9fb13a611282da1f82a45c7769ea118f
e819250bc708c9d15333f1a76ab423511184b127
120598 F20101208_AABTEM mutschlecner_t_Page_160.jp2
ee68f1ac20b93832b8fff7bb416e3145
f87ff946163f6506a0d8a54c536815593d218b03
69407 F20101208_AABSZH mutschlecner_t_Page_145.jpg
5bedd393645dcbaef0313b7306f4bfc1
1ac675b6b0b651c08e2bc89e35e58d26b4360c8f
953772 F20101208_AABTDZ mutschlecner_t_Page_142.jp2
4d7677c56954467cc46875f078ca8019
1180aa64aa553be67988fe0e4f02f594735cbefa
48375 F20101208_AABSYT mutschlecner_t_Page_131.jpg
dd4aa14aa7bf5212a094c135409c2b9e
8bf36075e88edb0922e0001aa1ce8455f55d0f03
F20101208_AABTFB mutschlecner_t_Page_017.tif
4e8f4a74670cb6622c8617fe86e40f14
26f7b4c22bffbddb8dd7df30f4e20ab11aca7f09
118457 F20101208_AABTEN mutschlecner_t_Page_161.jp2
92578778da25c3b902c60b78d09ebed9
575f10014762d8cea674f47e993fef7b2931408b
61191 F20101208_AABSZI mutschlecner_t_Page_146.jpg
c497df0407c6ddd8ad2232910dde37ca
4f52271ca4798d2bd81ae0709fdf039686dc46b7
F20101208_AABTFC mutschlecner_t_Page_018.tif
920912f296eab12c9caae8081eed95da
cf71ad2aa4a789af92246254d54d0206e8f719d1
109266 F20101208_AABTEO mutschlecner_t_Page_162.jp2
170147c1cb62c2b030f2da4d96af2ee0
cf1ef059afa13a15f42b148a8adafc4ae28b08a8
64631 F20101208_AABSZJ mutschlecner_t_Page_147.jpg
09b66e7a56f0579e427bd05ea5d27dbd
d5de715fbd16baf3505d0b316d0477525aaf4d80
25092 F20101208_AABSYU mutschlecner_t_Page_132.jpg
b6a40d2265dc3965bae185f9e430de9f
1225a8790411760fad9608ecfabd2b8b4f737952
F20101208_AABTFD mutschlecner_t_Page_019.tif
3c8e4e4cdc2ba97b7f4204ab8cea7dc8
e65756b128bf0b173d412153cdb5ac6fe3889771
17823 F20101208_AABTEP mutschlecner_t_Page_163.jp2
456d264b7a78622be9b78b3d6edf79ed
aaf3fa12fe44bca53be385540b26c33601c7c53b
53229 F20101208_AABSZK mutschlecner_t_Page_148.jpg
9ab9d4f3384b53a6a2cee541708cbf48
99d4caaa07102f94ec8d06d2a380500507985302
38917 F20101208_AABSYV mutschlecner_t_Page_133.jpg
bb169f52cf6a66b0e1df553b8b316d88
fb6d9c7c1bd360c62344743f0f1643e0198f62e5
F20101208_AABTFE mutschlecner_t_Page_023.tif
5bb3ac4b7653944fe77b45aa64619486
ccbc9a73cdfac97e014daa0aafb7de5ea9cef762
F20101208_AABTEQ mutschlecner_t_Page_001.tif
5cc4809ed7aac0b14efceb0dbaa787a2
d7e06143cbe8efd5b65cb35c479ccc37b6639434
46974 F20101208_AABSZL mutschlecner_t_Page_149.jpg
487daca0d98954dc51651defd5de172e
8ad59573cc34531e3bc2d3dc24dd6dd8cdad2115
53046 F20101208_AABSYW mutschlecner_t_Page_134.jpg
35a941055ce896ed019708a51e1864d1
de9cc3f033df116915c92bd0db5694aa242cf9aa
F20101208_AABTFF mutschlecner_t_Page_024.tif
0f8a55a08676cf099cbd02a772499912
2a7de9cb5787da500b6ebdfbc089e3bc3c16ce00
F20101208_AABTER mutschlecner_t_Page_002.tif
daa195210fa3b0b5916a154ec0efecfa
5c3398d1a5ece9ee1522d340a11715e5271e2943
45370 F20101208_AABSZM mutschlecner_t_Page_150.jpg
6ec8486c0b8a6d99addb7343e8e763b2
b1139f6fdcce2b69cc657ccc40a4c532532d6c89
41266 F20101208_AABSYX mutschlecner_t_Page_135.jpg
52809a7c4ac9c3c4ba06869783072bf7
c2d063884a827a3b96e11e07c89f9fe66e1b0ab6
F20101208_AABTFG mutschlecner_t_Page_025.tif
d457686035166531d4ba68041f8a2364
22136d0ba97ab2eb159021694c0e9e29afda172b
F20101208_AABTES mutschlecner_t_Page_003.tif
94d1acbe03c2a0e0405f125d6ed72734
df2dc13896251caa1cfed6b25d6234613618c53b
47860 F20101208_AABSZN mutschlecner_t_Page_151.jpg
1d6f7dd873030ad7ddc530c39306b930
64883cff2285ede0e67ea66803e1d4f1def3a49b
45801 F20101208_AABSYY mutschlecner_t_Page_136.jpg
396ee287e11fd7620c4716ec64681ac3
903c9bc9e5e0567def6f1d14b60bb0d47f9f2615
F20101208_AABTFH mutschlecner_t_Page_027.tif
7044420f295eb53836340e18c8b7949d
86e6c4a901dab89d9d8edb7075ffac84989d1357
F20101208_AABTET mutschlecner_t_Page_007.tif
b15485cc21c326c7969109052273e362
e2913aa3a06e325c63fc2db4985b473791fa0e29
41956 F20101208_AABSZO mutschlecner_t_Page_154.jpg
5fbe70262cb91a802a31fa0b8bcbb040
56f3a62c1dc24daeb29aa1df98a5d183fe94e4eb
44732 F20101208_AABSYZ mutschlecner_t_Page_137.jpg
bb09ad1c387202f11d13901b869ef76d
6125fab695a6be0b2a6a3387b7a0f4d28b17d62d
F20101208_AABTFI mutschlecner_t_Page_029.tif
bbba2d1c0fe9c526f0cab76bbb1c9c74
fac89e8245d5ee0a0413188ea07e1854508d0988
F20101208_AABTEU mutschlecner_t_Page_009.tif
24fe7680f5fbc968566b882d6e7c9b3f
610abb0c25b81f4dc369c589531316335cd4b1f3
28441 F20101208_AABSZP mutschlecner_t_Page_155.jpg
651be0c7c4c92e00ffc8f8c183bdd7c9
681441d0bd4ab77e942cf8d5c993e72f47c25132
F20101208_AABTFJ mutschlecner_t_Page_030.tif
be36e7aa240bba701ff90aa6b83895fa
48f41d3f8ddbefa6667100187a367c4edc7ee246
F20101208_AABTEV mutschlecner_t_Page_011.tif
5399eefc86f5da937cfe4035eb95b6e2
884b08fcde9369460db15c9d8e2bfd0aec87f6f8
36202 F20101208_AABSZQ mutschlecner_t_Page_156.jpg
b87c3855604d22b4e87720444ca5efd4
c364b31a9155b4fe2b4b3b3e541f3a19d2c802c4
F20101208_AABTFK mutschlecner_t_Page_031.tif
d28c7688aa148d6a8018df96843ecfaf
769808614d632714b5a4564940eef1c4f6862064
F20101208_AABTEW mutschlecner_t_Page_012.tif
289f57b4c63e4296efe95e1195932118
2a53480abe0114fe1e136c14233f35cd3d16912c
32898 F20101208_AABSZR mutschlecner_t_Page_157.jpg
cb82cad816c7281fb8a8d30fd30e8541
f528227a1f517c3efad2048aa658a4f3bd376f3c
F20101208_AABTGA mutschlecner_t_Page_051.tif
53a64005b1e5e2f641041a191bfbee92
b3b7d292da42d77a8a96a8193a8ad57851f38e30
F20101208_AABTFL mutschlecner_t_Page_032.tif
d2dd5565c34adb04465d24122bd41cee
eae991c291c99a551d4ce4b85ad5c8fbef3a1b22
F20101208_AABTEX mutschlecner_t_Page_013.tif
4a4c417168de73af49b8161e6093fdfb
731d2378a7164907f2f3889a46f8c9e685bc31e7
33668 F20101208_AABSZS mutschlecner_t_Page_158.jpg
7f186efeb459a3d4ce0547989aa16696
01d70f54dc0437fc60bc12be1d4cdb011592f68f
F20101208_AABTFM mutschlecner_t_Page_033.tif
c3fb21d213958f0930338f338545b62f
2e0f9da3b5d94ab772b6c3cfc0ea0073fe67b2d9
F20101208_AABTEY mutschlecner_t_Page_014.tif
749e44f7ad97e69b46f70731ed0c5486
32afda300a213325c58932426d9e1837d16ad77e
75923 F20101208_AABSZT mutschlecner_t_Page_159.jpg
e20b2fdaa809725a7edcb62b5b7ff4af
676b48ae17d94655adc578439275ac98ce1cd90f
F20101208_AABTGB mutschlecner_t_Page_052.tif
07b01ed3a302096a227fbaf7f63ff13e
4cc6a75d79186b334da7405e164f62f84d9e1abc
F20101208_AABTFN mutschlecner_t_Page_035.tif
37ad637d3fbea7b7e394fa9d56cfe201
1cf118d16fafbffc924902df2b70bfd53162fd34
F20101208_AABTEZ mutschlecner_t_Page_015.tif
fbf83eb434f9b32f0aae9b1d904d59c5
c8de5d263cc14f4400c92bcb90bbf2e6ae8188f4
75732 F20101208_AABSZU mutschlecner_t_Page_161.jpg
64e8591a15f4f9aee4737507e913f545
d02c04dd88829b708e8469e1186835e14e7791de
F20101208_AABTGC mutschlecner_t_Page_053.tif
73c8cbb07d21ba80ac6427f1b0e8101a
00e204319b55f83e897ebc1a80799451a2e77332
F20101208_AABTFO mutschlecner_t_Page_036.tif
6c2827f200729651dc5cfe7d91779010
0c5103aaaa81a674167fc7dd8d3457bbed0e53e2
F20101208_AABTGD mutschlecner_t_Page_054.tif
8a00f1f97eb03ca2698cad059418d1d9
b58c0e1cf37b2c3151f2a34060eac60d978019d5
F20101208_AABTFP mutschlecner_t_Page_038.tif
b4ff16b05d821fa558f907f9319f75e9
9868175d8bad0cb7a595d46037ac9bdda0b5eadd
73784 F20101208_AABSZV mutschlecner_t_Page_162.jpg
633eee21eb3d7c898bf341317e18462b
229d2abdcddec9388d6085707b23fb8f87060540
F20101208_AABTGE mutschlecner_t_Page_055.tif
8867a8370cb3be31864932469a0c4e2c
73e59f88db8de2779aed5a1e7933896301355170
F20101208_AABTFQ mutschlecner_t_Page_039.tif
0aecca680800f6f3ac88a1dedd39c077
6ac34058b72200ecec9078868b719498e8e8fea2
27051 F20101208_AABSZW mutschlecner_t_Page_001.jp2
3b959fc1623d1432fa1906ef02ed59df
9b12ee21b017ebb81ff4bd45d2a3ec5448c0aefb
F20101208_AABTGF mutschlecner_t_Page_057.tif
c9ef5f35d34afb6ecb65e8c974e24e4e
8d1ed57444ea9eee5d7fce39d30177be555b9edd
F20101208_AABTFR mutschlecner_t_Page_041.tif
197cb26464108b559764fcd94c729d5f
c088bfb132c319e13341a9e49b3554d009b1fd1d
5888 F20101208_AABSZX mutschlecner_t_Page_002.jp2
351bc2d96221c48feb88e1ed0b2bfb7d
a602ca3cf94b1e36871cef836ed80d0999afdb52
F20101208_AABTGG mutschlecner_t_Page_058.tif
44565bf2909e4689b0ba53070894dab6
0d2a465c3d5bbe95ab2a86c43431a79d4f714c99
F20101208_AABTFS mutschlecner_t_Page_042.tif
3d5a8bc1e4524ca57dc29634a3abb4e7
22e4bcfa1b17d397f628786c8e39319235871045
F20101208_AABTGH mutschlecner_t_Page_059.tif
b72ef0ebc2832873110b4d752e4f098c
29178f8a0bd03d8dc3e811e8678f9ab1b1fcb337
F20101208_AABTFT mutschlecner_t_Page_043.tif
3f3de0f4f13387c001374bc443aea908
a2f061b8d046620ea7de386b309ff92a73ee5cdf
107161 F20101208_AABSZY mutschlecner_t_Page_004.jp2
1fe6613d4ca438852a16968d54f791dd
8f092ec2fd43d77e0549569040dff3a4169b2280
F20101208_AABTGI mutschlecner_t_Page_061.tif
29cfb487fd61dbf4c3a390193e37080f
5e20e9b3ac9f8d37d233e36b6d5beb9cc46d8d32
F20101208_AABTFU mutschlecner_t_Page_044.tif
8777f3026cb94308330b27cb5a9d483e
467c5f05664b921068f051d34a83166335998f68
49461 F20101208_AABSZZ mutschlecner_t_Page_005.jp2
4fe5dd958a5627297deb194b938c6ce2
bcc8d32cc0ffa6568c02f333c1613e5b12f0f6fa
F20101208_AABTGJ mutschlecner_t_Page_064.tif
24eb766e9493924b4030e1f084cb07ec
b53b0e7649382c624d22797a7a17e69626cb2eb3
F20101208_AABTFV mutschlecner_t_Page_045.tif
2524e60dafe54b9ee3de9188b44344ad
234a2ef5092c72deb15cf73326946cefb655dc37
F20101208_AABTGK mutschlecner_t_Page_065.tif
774e579bf55ebd24aca1d7d0d8917469
6a73e687a4e0cd61b2417a40e918e094741a4b7f
F20101208_AABTFW mutschlecner_t_Page_046.tif
e29d0bd0ed792906f501f54999f25343
2e12190463910a31285079a2cc334ca1b2e4a930
F20101208_AABTGL mutschlecner_t_Page_066.tif
87badd7d08534430baf456755c5780fd
26d0c07e4a432af12f7f3b20a5e6314bba0d349a
F20101208_AABTFX mutschlecner_t_Page_047.tif
54365f81853b1608f5886defde9c497a
f562d9cb8637f7eea162334502fef367637dcce0
F20101208_AABTHA mutschlecner_t_Page_083.tif
e4487249c0449f070f763a4e5c2e6a22
edd4c97249f8ceaa83f500f746667213f49a2b40
F20101208_AABTGM mutschlecner_t_Page_067.tif
2587a381c10a3d962774f27e7ebee026
746ef6c0ac03ff8c775b4335815244ee3b9119c2
F20101208_AABTFY mutschlecner_t_Page_048.tif
56df1ea393387e2fc8721180414a2845
c6f151dd8b0fe0bb798953bb9dc924939d619313
F20101208_AABTHB mutschlecner_t_Page_084.tif
4cd2a9b377ba721ce248a29452614740
4cc234f23bfb40ed4e9994c8f40a7d8c20af8f2d
F20101208_AABTGN mutschlecner_t_Page_068.tif
b0373687795031dc41170987c9d9ef10
1b27b72546a69a96853cdec5d665af45ca33ac63
F20101208_AABTFZ mutschlecner_t_Page_049.tif
f689be733e38fc71d93470e904b1e16d
e79257f0cb634fc7fd456a87e59d8f1a4ea8cbad
F20101208_AABTGO mutschlecner_t_Page_070.tif
8ef78ca6e894e48689f064d1247f7e9e
5e1e314366daf3199948d260bdc62055947af114
F20101208_AABTHC mutschlecner_t_Page_085.tif
ba76c2e1ed8edbb7b02ebb3237c7f72a
8fe026a2c29e38b7d977b1a94b6fe7803136b373
F20101208_AABTGP mutschlecner_t_Page_071.tif
ff8a2d723d64f19987eaec8595f11b9a
004544650a96717f638273caed53aba4ae193f07
F20101208_AABTHD mutschlecner_t_Page_086.tif
45395e8fc5a69ecb273f6baf51db49e0
a69f8f98ea3e8489884a36e77f8cad3b2e3f9608
F20101208_AABTGQ mutschlecner_t_Page_072.tif
5c4e02b802a98084bd0fca0c325481d0
dbf0399c2f234d837eebba92aa734da1c30b64c3
F20101208_AABTHE mutschlecner_t_Page_087.tif
f18f4ad43cccb9bb6e6f80163cbde54d
22c15604e600c53481fa2ae99b1afebf5137f508
F20101208_AABTGR mutschlecner_t_Page_073.tif
d6da8bdb241fd4b809b8bb9f60a595aa
c36c03d31db745df4ac2fb3d8ce4dfa9932f3f8f
F20101208_AABTHF mutschlecner_t_Page_088.tif
9fef0562906285dc81aa27cbc93e1604
87fce6b33d8df7880f1442b84dcd0d6133458259
F20101208_AABTGS mutschlecner_t_Page_074.tif
9390af6c0802972938cfeaf728a12c08
af5b038b3089a2bd37e9673e70e6fe864901ddd2
F20101208_AABTHG mutschlecner_t_Page_089.tif
82303e31c6f1b57a438b5093e768f43b
d51bce6c876a6eb62030cf9f6fbcd651b9e84e68
F20101208_AABTGT mutschlecner_t_Page_075.tif
6cb7274a5f45816d15c3f71ac8913a57
68378c73aa160f7f7009c0e40e77983728aca40c
F20101208_AABTHH mutschlecner_t_Page_090.tif
eea1656bf16848b90643836faa0d8266
7c80fc491dbfbdd8b158df309b8343b116f7264c
F20101208_AABTGU mutschlecner_t_Page_076.tif
79b46fdc3f6bf0ed974d7ace606e2d7d
7c93e8fa9c9ce95a2b1756554a479e46e82a9e8c
F20101208_AABTHI mutschlecner_t_Page_092.tif
b593b5f9d76d4c7bdcd82e78cee2b109
4ec9ad88475e9312a7a68d84ccacda7fee4aa740
F20101208_AABTGV mutschlecner_t_Page_078.tif
aa889e2f2444e0e8ddd61375e9f5d35a
f3a9b6865455ded1447a5497437dd8ee423914b7
F20101208_AABTHJ mutschlecner_t_Page_096.tif
15fde4767d6f3840404059677c5553d7
184ed2562998619764df1527461aac2c9ea93095
F20101208_AABTHK mutschlecner_t_Page_097.tif
001feceda42f74235aea21a070291d81
ecbf640d06ad8d6f2bf660d69199628c18b12dcf
F20101208_AABTGW mutschlecner_t_Page_079.tif
dad22c90030c57aea27ecd302e615074
3517ffcb166d1fda6f536ebbc3916373b659b1f1
F20101208_AABTIA mutschlecner_t_Page_119.tif
1b9c19d37eca2fdf9b5cf45dbb64551c
2075c59519cf4c0afa9a955b367d15b5184e92b3
F20101208_AABTHL mutschlecner_t_Page_098.tif
8759e7479abcd2b59ea6d9f93983d429
504aab8de19ec9f8559d27e0fce32f2b9320bc09
F20101208_AABTGX mutschlecner_t_Page_080.tif
24e75a025c8db19d408ce3a09ff675cc
bcbae407af46cd8e8b69a4eee546a9515b21ad28
F20101208_AABTIB mutschlecner_t_Page_120.tif
a2c399d765710a0acc9f727beae8dc63
3b9c9f032d11c30e744e30d9266113979a9a4b0c
F20101208_AABTHM mutschlecner_t_Page_099.tif
2c8d32f5153f568da845cd25aed198ab
0fdf08697ce4188792703b2aeab29fce6b6b3f91
F20101208_AABTGY mutschlecner_t_Page_081.tif
53f4ff3712f44164f07c08c91f1f68ff
61e39e64a7fa4e1fbd3402cbb6c008835e0bab16
F20101208_AABTIC mutschlecner_t_Page_121.tif
6b390b1e74b50096179a5bb50fbaab7c
964a50fd6903d794f4cd0798c61acf75abb901bb
F20101208_AABTHN mutschlecner_t_Page_100.tif
b543ba9964a9d03dacf43658ceb10c48
e766afa592cb5bd14c314cde5ccb08a16d8eece1
F20101208_AABTGZ mutschlecner_t_Page_082.tif
cee91b1e317fdef04803ec8cf73dd619
987c064525c3bd9727767fc10facf551af8f1709
F20101208_AABTHO mutschlecner_t_Page_102.tif
2e19236ea71172a9ad032dd26e89a1ca
143fa0025d5c1aea85ac4e213a1506573b3fcd6e
F20101208_AABTID mutschlecner_t_Page_123.tif
0045ac0ce066f29f30c3586033c4ce6c
ee8d255d999b73b25aeddf18f541dd29926ca6dd
F20101208_AABTHP mutschlecner_t_Page_103.tif
a394ee62479e585339cebec37b9ae4b6
cc4eb687c667ac48fb5a6e78ceeaca28f13edcb4
F20101208_AABTIE mutschlecner_t_Page_124.tif
1cc5d3f8ec40866a647a675106ec9aac
c683d045f59a52224ce1eb63ca0a2b77a2f79637
F20101208_AABTHQ mutschlecner_t_Page_104.tif
0ec2e3dd084c2b71a92a7036630d290a
17fdcdb38859b89719781e7a0d7efa99fb3efbbb
F20101208_AABTIF mutschlecner_t_Page_127.tif
db100c168360d127c7df066d34192c16
c38a0da3be0febff0ac1b5044f6a0bb0c9ab0985
F20101208_AABTHR mutschlecner_t_Page_105.tif
3b6e921d8a6e38bc5f034a921e61ca37
51ab453290c5cae39278e3a03790ff721182d334
F20101208_AABTIG mutschlecner_t_Page_128.tif
242f661d872fd7b9706d5c8f252b2ddf
37c7c199a5f559a9c2b359c5f8d98d6099a4d932
F20101208_AABTHS mutschlecner_t_Page_106.tif
0aad2701ef8a892b153ac6625ba7b176
ce2220f72286e5cf503bb922e304b3576a065471
F20101208_AABTIH mutschlecner_t_Page_129.tif
cb92388a54dedcc707950feb31528659
a400cb397d15b7dcb89545fafd9eedae84e78282
F20101208_AABTHT mutschlecner_t_Page_108.tif
8038cc86491e25df3c2f50b6926e2a3d
41d3a70a1cfdebd44440b7e783c9efaa55e6f60b
F20101208_AABTII mutschlecner_t_Page_130.tif
050d13a0a3d994dae459e0c7861b379f
4cf2c74863c1d1b81815fa9990d14932d509a298
F20101208_AABTHU mutschlecner_t_Page_110.tif
233059b374da4f155ee69aa53a40d9c0
05121ecd8b54d56c54c634b7b490eed70de52af9
F20101208_AABTIJ mutschlecner_t_Page_131.tif
1733b8e10e3e09981807dca625361bec
3b1ed5287e57796a287a1c398b6a57a6fd505f0d
F20101208_AABTHV mutschlecner_t_Page_111.tif
c07312b821d207401d093f23346a74e6
c48eedae0f4ad1bba06ad231139a9416097d8e82
F20101208_AABTIK mutschlecner_t_Page_132.tif
b40bdf40a787fca7833f93c1372148fb
9e181e8015db564f359abc3d466d334ab4a27d39
F20101208_AABTHW mutschlecner_t_Page_112.tif
90cae4da719daf00c08973d1cc548cad
27596989a89a0b2bb3d060f66e80c41c7e842742
F20101208_AABTIL mutschlecner_t_Page_133.tif
e1d86ac2c00feadac3be136d4f3e2d5e
bef6840a469636382ca30963399eaf7564f77a86
F20101208_AABTHX mutschlecner_t_Page_115.tif
12319f353f3d14bdbb4bac0b6b3c9fdc
2eefd355836cccfe47571b2ed324f52bcad9a60b
F20101208_AABTJA mutschlecner_t_Page_149.tif
e2698c727df7130162470364f7213a0c
1ba3f304004bd4f8d1f09d2982d4be45351b3e3f
F20101208_AABTIM mutschlecner_t_Page_134.tif
0d995a68ae1001863239b4719f49c6fc
73e45c1c73990e47b1be4304a89db00ca257ee62
F20101208_AABTHY mutschlecner_t_Page_117.tif
12e0da33c6b22836c1821807ac3c76ea
06fb34d4e64e0dcffc909100fecc1972a22294a1
F20101208_AABTJB mutschlecner_t_Page_150.tif
58c375b579b8f258fbcd8158cd63d534
c4b0eeda4b276573954da83898632e5786aed8c6
F20101208_AABTIN mutschlecner_t_Page_135.tif
caa146e5a9d94a3aad268a3b644aa42e
3d9833c1539f2d0b6cd55540aba713b9f0dbdd26
F20101208_AABTHZ mutschlecner_t_Page_118.tif
3dbb16df75ec8783cd9569f89f18f047
5ff10d210f39d7b6550b0b33e92a2880061f8103
F20101208_AABTJC mutschlecner_t_Page_151.tif
42b21bf063e1fb945c535e1ff03969a2
b6be4d76308f471f6ebb37cae45eb7b4066a68ca
F20101208_AABTIO mutschlecner_t_Page_136.tif
981f57665b619908e891fadc8c7031f0
d63f830766b9eb61427dbc31f6b76499352a49ee
F20101208_AABTJD mutschlecner_t_Page_152.tif
db29fff27645c70ae5594067fbfcf37e
fde8ff13e0212e1e87c074b761cc48c24ebb6e34
F20101208_AABTIP mutschlecner_t_Page_137.tif
c39a86751446a26ad6e6234451aba48b
ed64d93b6b53d469baf4ee2c1226cdb94cc4aba0
F20101208_AABTIQ mutschlecner_t_Page_138.tif
b378fcd45a98f2f57482c833266276b6
f02f666e7ff4176c31e80736122d0830e68e2805
F20101208_AABTJE mutschlecner_t_Page_153.tif
dbaf8d7a0f1bf68fd69bbdd2fde74267
bd0bc6e8615200ae94d164a3846b3ebbc174ac59
F20101208_AABTIR mutschlecner_t_Page_139.tif
10fef6e096b3818480123ea660e8602a
ef7414e174ee7aff293ab86686d242d1c42e759a
F20101208_AABTJF mutschlecner_t_Page_154.tif
4d1a27251f4766d781352ccd1afd66ce
c4e4200243b71a0b251af3ffa7b3dbba49ff42a6
F20101208_AABTIS mutschlecner_t_Page_140.tif
3357d5dd2a53aad30a0ac3c6aaf8e39f
060a9d0c85fece8814d072f36bf803c5b5bcf014
F20101208_AABTJG mutschlecner_t_Page_155.tif
e405aec257b13e73f62c5e9d3f781a4a
f6265dc9fb6ebf09d5da4c23f34d10dc3bb8b5e0
F20101208_AABTIT mutschlecner_t_Page_141.tif
f300e5429ad42d1bed8d63a65f203c08
899caaa02b109948b4f2a882c47c8761c375a917
F20101208_AABTJH mutschlecner_t_Page_157.tif
638f3f621f895801f58ad526cf618f5d
dcfc042aea955bc4e1b354f311dfcb663aa24f48
F20101208_AABTIU mutschlecner_t_Page_142.tif
87b2de7034b2877518e539b668001c07
82ccb038bc3e000d410332dc6fb40ecbb9d75a74
F20101208_AABTJI mutschlecner_t_Page_158.tif
d0c9e8ac0e76c27f18f0b4c625d24740
3bdddcfe00b6dbc9a3dc745eb498c3d750099002
F20101208_AABTIV mutschlecner_t_Page_143.tif
c3d1fdd47ad14b9bdc81709ec90682e5
edaedefb447f340d5d0f22ec504fe9f9295614c6
F20101208_AABTJJ mutschlecner_t_Page_159.tif
9f0ecc3d100ece63f32a9572b51c5ea5
81d927884942ec15ae3080578cc0d2bebdc73232
F20101208_AABTIW mutschlecner_t_Page_145.tif
8566bac92df63805e21e79fd37c0ac3f
db2b5ad7216b657eaf3e2819dd1e330882a8210f
F20101208_AABTJK mutschlecner_t_Page_160.tif
478fc6d7ba90af5b2234727c391ee127
48a926ffdd45f7130606c08941822d5182fdb9c1
F20101208_AABTIX mutschlecner_t_Page_146.tif
ad00ff116663b871adfff23a0997c579
417b174ab7760c20672902389f49c32f5dcad0f6
63120 F20101208_AABTKA mutschlecner_t_Page_017.pro
d4b5bb7ce53d28541cbb38d06dd50448
b57ce669834d442ab0dd389563198bd353382470
F20101208_AABTJL mutschlecner_t_Page_161.tif
bf647bab708049829c365fbdb3dfcc14
a00ab1f96c6bb4d7c7f0bd75bc0bd4ab6e6c59ad
F20101208_AABTIY mutschlecner_t_Page_147.tif
6077595ea5bc237fdefb274a8c6c758c
37b7801da614c21ed855d3bd7e0146ae7a070560
55533 F20101208_AABTKB mutschlecner_t_Page_018.pro
9ac105fff328d7b89d4b3596899763f5
ec9d41f774993c04bbb8f4021a717c0e7dd6717c
F20101208_AABTJM mutschlecner_t_Page_162.tif
a61e912f3374265fa9a90ccf235d9482
df8b686af23ae19e3b31096ae4ff0b91f522b189
F20101208_AABTIZ mutschlecner_t_Page_148.tif
96dda6d79afe2219c184f8c1c317a853
70fbf39653c3b215d43b2d4203351d7728a0ab9b
53970 F20101208_AABTKC mutschlecner_t_Page_019.pro
f82a6c44b7c0fcc3e30ba0f4efd22e40
cb040423a2f60afb43a6c94112f897c9ec5416a3
F20101208_AABTJN mutschlecner_t_Page_163.tif
7b6239436a9af69d11a0e424d9ea52bb
2ff1632a097884db956ea110371022fc126b3279
49905 F20101208_AABTKD mutschlecner_t_Page_021.pro
5867cef0c92cf71ff18279a9b5cf7291
84b5642e689c9c5bcd74b91e0560295a3dab0a8d
1075 F20101208_AABTJO mutschlecner_t_Page_002.pro
5af974dd38b2dee9358d29ebd6eb1b17
104b6501a7c4ff7312f4cde424bcd7b26ef7026e
56170 F20101208_AABTKE mutschlecner_t_Page_022.pro








CONSTRUCTION, VALIDATION, AND ADMINISTRATION OF A DIAGNOSTIC TEST
OF CELLO TECHNIQUE FOR UNDERGRADUATE CELLISTS













By

TIMOTHY M. MUTSCHLECNER


A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL
OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT
OF THE REQUIREMENTS FOR THE DEGREE OF
DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2007































© 2007 Timothy M. Mutschlecner





























The most perfect technique is that which is not noticed at all.
Pablo Casals









ACKNOWLEDGMENTS

This work is dedicated to my dear wife, Sarah, who has shown unwavering support and

encouragement to me in my studies. In every way she made possible the fulfillment of this goal

which would have been unimaginable without her. To my children Audrey, Megan, and Eleanor I

owe a debt of gratitude for their patient understanding. My parents, Alice and Paul, offered

continued reassurance that I was up to this task, and their support has been much appreciated. Dr. Donald and

Cecelia Caton, my parents-in-law, spent many hours editing this manuscript, and I am very

grateful for their skill and encouragement. The professional editing expertise of Gail J. Ellyson

was invaluable.

The transition from music performance to academic scholarship has not always been

easy. Dr. Timothy S. Brophy's expertise in the field of music assessment and his enthusiasm for

the subject were truly the inspiration for what grew from a class paper into this dissertation. As

Chair of my committee he has provided the necessary guidance and direction leading to the

completion of this work. I consider myself very fortunate to have worked under Dr. Brophy's

mentorship.

As members of my supervisory committee, Dr. Art Jennings, Dr. Charles Hoffer, and Dr.

Joel Houston have generously offered their insight in refining this research. I thank them for

their service on my committee and for their support. Gratitude is also extended to Dr. Camille

Smith and Dr. David Wilson for serving as initial members of my committee.

A study of this magnitude would have been impossible without the commitment from

colleagues: Dr. Wesley Baldwin, Dr. Ross Harbough, Dr. Christopher Hutton, Dr. Robert

Jesselson, Dr. Kenneth Law, and Dr. Greg Sauer. Their willingness to allow their students to

participate in my research made this study possible. The support and insight from these master









teachers was invaluable. Dr. Elizabeth Cantrell and Dr. Christopher Haritatos, as independent

judges, spent many hours viewing video-taped recordings of student performances. Their care in

this critical aspect of my study was much appreciated.

Dr. Tanya Carey, one of the great living cello pedagogues, provided valuable

suggestions for the design of this test. Thanks go as well to the many Suzuki cello teachers who

participated in this research. Their willingness to share ideas and provide suggestions on ways to

improve my test was heartening.

Finally, thanks go to the 30 students who agreed to participate in this research. The

effort they made to prepare and play their best is gratefully acknowledged. Future cellists are in

their debt for being pioneers in the field of assessment in string performance.









TABLE OF CONTENTS

ACKNOWLEDGMENTS ................................................................................................................4

LIST OF TABLES ...........................................................................................................................9

DEFINITION OF TERMS ............................................................................................................10

ABSTRACT

CHAPTER

1   INTRODUCTION ...................................................................................................................13

        Purpose of the Study .........................................................................................................14
        Research Questions ...........................................................................................................14
        Delimitations .....................................................................................................................14
        Significance of the Study ..................................................................................................14

2   REVIEW OF LITERATURE ..................................................................................................16

        Introduction .......................................................................................................................16
        Philosophical Rationales ...................................................................................................16
            Bennett Reimer ............................................................................................................19
            David Elliott .................................................................................................................21
            Comparing and Contrasting the Philosophic Viewpoints of Reimer and Elliott .........22
        Theoretical Discussion ......................................................................................................23
            Assessment in Music: Theories and Definitions ..........................................................23
            Constructivism and Process/Product Orientation ........................................................26
            Definitions ....................................................................................................................27
        Research ............................................................................................................................29
        The Measurement of Solo Instrumental Performance ......................................................29
            John Goodrich Watkins ...............................................................................................29
            Robert Lee Kidd ...........................................................................................................31
            Janet Mills ....................................................................................................................32
        The Use of Factor Analysis in Performance Measurement ..............................................34
            Harold F. Abeles ..........................................................................................................34
            Martin J. Bergee ...........................................................................................................36
        The Development of a Criteria-Specific Rating Scale ......................................................37
        The Measurement of String Performance .........................................................................39
            Stephen E. Farnum .......................................................................................................39
            Stephen F. Zdzinski and Gail V. Barnes ......................................................................41
        Summary: Implications for the Present Study ..................................................................42

3   METHODOLOGY ..................................................................................................................45

        Setting and Participants ....................................................................................................45
        Data Collection .................................................................................................................45
            The Written and Playing Test ......................................................................................46
            The Student Self-Assessment Profile ..........................................................................47
        Rationale for the Assessment Methodology .....................................................................47
        Interjudge Reliability ........................................................................................................49
        Data Analysis ....................................................................................................................49
        Content Validity ................................................................................................................50

4   RESULTS ................................................................................................................................51

        Data Analysis ....................................................................................................................51
        Participants ........................................................................................................................52
        Part One: The Written Test ...............................................................................................52
            Scoring the Written Test ..............................................................................................52
            Results from the Written Test ......................................................................................53
            Regression Analysis of Written Test Items .................................................................55
        Part Two: The Playing Test ..............................................................................................56
            Scoring the Playing Test ..............................................................................................56
            Results from the Playing Test ......................................................................................56
            Comparison of Left Hand Technique and Bowing Stroke Scores ...............................57
            Comparison of Playing Test Scores and Teacher-Ranking .........................................57
            Interjudge Reliability of the Playing Test ....................................................................58
        Part Three: The Student Self-Assessment Profile ............................................................58
            Repertoire Previously Studied .....................................................................................58
            How Interested Are You in Each of These Areas of Performance:
                Solo, Chamber, and Orchestral? ..............................................................................59
            Other Areas of Performance Interest? .........................................................................59
            What Are Your Personal Goals for Study on the Cello? .............................................59
            What Areas of Cello Technique Do You Feel You Need the Most Work On? ..........60
            Summarize Your Goals in Music and What You Need to Accomplish These Goals .60
        Summary of Results ..........................................................................................................61

5   DISCUSSION AND CONCLUSIONS ...................................................................................75

        Overview of the Study ......................................................................................................75
        Review of the Results .......................................................................................................75
        Observations from the Results of Administering the Diagnostic Test of
            Cello Technique ...........................................................................................................76
            The Written Test ..........................................................................................................76
            The Playing Test ..........................................................................................................78
            The Student Self-Assessment Profile ..........................................................................81
        Discussion of Research Questions ....................................................................................84
            To What Extent Can a Test of Cello Playing Measure a Student's Technique? .........84
            To What Extent Can a Criteria-Specific Rating Scale Provide Indications of
                Specific Strengths and Weaknesses in a Student's Playing? ...................................85
            Can a Written Test Demonstrate a Student's Understanding of Fingerboard
                Geography, and the Ability to Apply Music Theory to the Cello? .........................86
            Observations on the Playing Test from Participating Teachers ..................................88
        Comparative Findings .......................................................................................................89
            The Farnum String Scale .............................................................................................89
            Zdzinski and Barnes .....................................................................................................90
        Conclusions .......................................................................................................................91

APPENDIX

A   PILOT STUDY .......................................................................................................................93

B   VALIDITY STUDY ...............................................................................................................97

C   VALIDITY STUDY EVALUATION FORM ......................................................................100

D   INFORMED CONSENT LETTER ......................................................................................102

E   THE WRITTEN TEST ..........................................................................................................103

F   THE WRITTEN TEST EVALUATION FORM ..................................................................112

G   THE PLAYING TEST ..........................................................................................................113

H   THE PLAYING TEST EVALUATION FORM ..................................................................144

I   REPERTOIRE USED IN THE PLAYING TEST .................................................................149

J   THE STUDENT SELF-ASSESSMENT PROFILE ..............................................................152

K   DESCRIPTIVE STATISTICS FOR RAW DATA ..............................................................154

LIST OF REFERENCES ............................................................................................................159

BIOGRAPHICAL SKETCH ......................................................................................................162



















LIST OF TABLES


Table                                                                                                          Page

4-1   Summary of regression analysis for year in school as a predictor of written, playing,
          and total test scores (N = 30) ......................................................................................62

4-2   Years of study, frequency, and test scores .........................................................................63

4-3   Summary of regression analysis for piano experience as a predictor of written,
          playing, and total test scores (N = 30) .........................................................................64

4-4   Item difficulty, discrimination, and point biserial correlation for the written test .............65

4-5   Mean scores of playing test items in rank order ................................................................68

4-6   Comparison of teacher-ranking to playing test ranking .....................................................69

4-7   Comparison of researcher's and independent judges' scoring of student
          performances of the playing test ..................................................................................70

4-8   Numbers of students expressing interest in solo, chamber, and orchestral
          performance (N = 29) ...................................................................................................71

4-9   Personal goals for studying the cello .................................................................................72

4-10  Student perception of priorities for technical study ..........................................................73

4-11  Goals in music and means of accomplishing them ...........................................................74

K-1   Raw scores of the written test items, composite means, and standard deviations ..........154

K-2   Raw score, percent score, frequency distribution, z score, and percentile rank of
          written test scores ......................................................................................................155

K-3   Raw scores of the playing test items, composite means, and standard deviations .........156









DEFINITION OF TERMS


* Fingerboard geography-the knowledge of pitch location and the understanding of the
spatial relationships of pitches to each other
* Horizontal intervals-intervals formed across two or more strings
* Vertical intervals-intervals formed by the distance between two pitches on a single
string
* Visualization-the ability to conceptualize the fingerboard and the names and locations
of pitches while performing or away from the instrument
* Technique
1) the artistic execution of the skills required for performing a specific aspect of
string playing, such as vibrato or staccato bowing
2) the ability to transfer knowledge and performance skills previously learned to
new musical material
* Target Note-a note within a playing position used to find the correct place on the
fingerboard when shifting









Abstract of Dissertation Presented to the Graduate School
of the University of Florida in Partial Fulfillment of the
Requirements for the Degree of Doctor of Philosophy

CONSTRUCTION, VALIDATION, AND ADMINISTRATION OF A DIAGNOSTIC TEST
OF CELLO TECHNIQUE FOR UNDERGRADUATE CELLISTS

By

Timothy M. Mutschlecner

August 2007

Chair: Timothy S. Brophy
Major: Music Education

The purpose of this study was to construct, validate, and administer a diagnostic test of

cello technique for use with undergraduate cellists. The test consisted of three parts: (1) A

written test, which assessed a student's understanding of fingerboard geography, intervals, pitch

location, and note reading, (2) A playing test, which measured a student's technique through the

use of excerpts from the standard repertoire for cello, and (3) A self-assessment form, through

which students could describe their experience, areas of interest, and goals for study. A criteria-

specific rating scale with descriptive statements for each technique was designed to be used with

the playing test.

The written test, playing test, and self-assessment were pilot-tested with five

undergraduate students at a university in the southeast. A validation study was conducted to

determine to what extent teachers felt this test measured a student's technique. Nine cello

teachers on the college and preparatory level were asked to evaluate the test.

The test was administered to 30 undergraduate cellists at universities located in the

southeastern region of the United States. Strong interitem consistency was found for the written

test (r KR20 = .95). A high internal consistency of items from the playing test was found (α =









.92). Interjudge reliability of the playing test was high, as measured by comparing the

independent evaluations of two judges with the researcher's evaluations using Pearson's r (Judge

A r = .92; Judge B r = .95). Other conclusions drawn from the study include: (1) Piano

experience has a significant positive effect on the results of the playing test (R² = .15); (2) The

playing test is a good predictor of teacher-rankings of their student in terms of technique; (3)

Year in school, degree program, or years of playing experience were not significant indicators of

students' playing ability as measured by this test.

Participating teachers described this test as a valuable tool for evaluating students and

charting their course of study. They found it to be an efficient means to identify a student's

strengths and weaknesses in cello technique.









CHAPTER 1
INTRODUCTION

Diagnosing a student's playing is a primary function of every music teacher's daily

routine. Boyle and Rodocy (1987) note that "applied music teachers focus instruction on the

basis of their diagnostic evaluations of a performer's strengths and weaknesses. In short,

diagnostic evaluation is a critical and ever present part of any good music program" (p. 11).

Without denying the role and value of traditional means of gathering information subjectively in

the teaching studio, educators agree that "evaluative decisions are better when they have a strong

information base, that is a base including both subjective and objective information" (Boyle and

Rodocy, p. 2). A diagnostic test of cello technique, designed for use at the college-level, could

supplement existing methods of evaluation and provide a greater degree of objectivity in

assessing a student's needs.

The successful teacher has the ability to rapidly determine strengths and weaknesses of

a new student's technique and prescribe exercises, pieces, or new ways of thinking about the

instrument to correct errors in playing. However, deficiencies of technique or understanding

often show up in a student's playing while working on an assigned piece from the standard

repertoire. When this occurs, teachers must then backtrack and correct the deficiencies with

etudes or exercises, or jettison the work for a simpler piece--a demoralizing experience for the

student. Determining the playing level and technical needs of each new student is an immediate

priority. Within a few weeks of a college student's entry into a studio, the focus of lessons often

becomes preparation for a degree recital or jury exam. The opportunity to study technique on a

broader scale than what is merely required to prepare an upcoming program can quickly

diminish. A diagnostic test, administered to assess technique, could be a valuable tool in this

process.









Purpose of the Study

The purpose of this study was to design, validate, and administer a diagnostic test of cello

technique for use with undergraduate college-level students.

Research Questions

1. To what extent can a test of cello playing measure a student's technique?

2. To what extent can a criteria-specific rating scale provide indications
of specific strengths and weaknesses in a student's playing?

3. Can a written test demonstrate a student's understanding of fingerboard
geography, and the ability to apply music theory to the cello?

To answer these questions, a diagnostic test of cello technique was administered to thirty

college-level students currently studying the cello. The test results were analyzed using a rating

scale designed for this study (see Chapter 3). Interjudge reliability of the test was measured by

comparing independent evaluations of two judges who viewed video-recordings of five students

taking the test.
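
To make this computation concrete, the following minimal sketch (written in Python, with entirely hypothetical scores rather than data from this study) shows how interjudge reliability can be estimated as a Pearson correlation between one judge's ratings and the researcher's ratings of the same performances.

    from scipy.stats import pearsonr

    # Hypothetical playing-test totals for the same five students,
    # scored independently by the researcher and by Judge A.
    researcher = [42, 37, 51, 29, 46]
    judge_a = [40, 35, 53, 30, 44]

    r, p = pearsonr(researcher, judge_a)
    print(f"Interjudge reliability (Pearson's r) = {r:.2f}")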

Delimitations

This study was not concerned with the following:

* Instruments other than the cello
* Creating an assessment instrument for ranking students, determining a letter grade, or
determining chair placement in ensembles
* Creating a playing test to be used in auditions
* Determining the subject's sight-reading ability
* The measurement of musical aptitude
* The measurement of a student's musicality or expressivity

Significance of the Study

A review of literature indicates that this is the first attempt to systematically measure the

diverse elements of cello technique. The five items used by Zdzinski and Barnes (2002) in their

String Performance Rating Scale (Interpretation/Musical Effect, Articulation/Tone, Intonation,

Rhythm/Tempo, and Vibrato) do not attempt to examine a broad array of technical skills, but

rather provide a general assessment of a student's performance. This present study appears to be

the first to evaluate specific aspects of cello technique.

The results of this study can inform the teaching of strings, particularly the cello, at the

college-level. For example, teachers may find it useful to have a diagnostic tool to evaluate the

technical level of new students. Results from such a test may support or bring into question

conclusions commonly made by teachers based primarily on audition results and/or the student's

performance in initial lessons. Similarly, the test could expose areas of deficiencies in technique

and provide the teacher with indications regarding the etudes, exercises, or solo materials most

appropriate for study. An assessment of the student's overall playing level can assist the teacher

in choosing repertoire that is neither too easy nor too difficult.

Often errors in cello playing can be traced to a student's lack of clarity about the location

and relationship of pitches on the fingerboard. The written test measures this understanding of

so-called 'fingerboard geography,' along with awareness of intervals, fingering skill, and the

ability to read the three clefs used in cello music. It can quickly reveal whether a student is

deficient in any of these areas, and clarification of such gaps can bring instant

results that no amount of practice can achieve.

The approach and design of this study could be used to create similar diagnostic tests for

violin, viola, and bass. Though there are aspects of technique that are unique to each of the

instruments in the string family, much of what is explored in this study would be transferable.

Future studies could also include a version designed for high school students.









CHAPTER 2
REVIEW OF LITERATURE

Introduction

Literature dealing with assessment of musical performance tends to fall into two

categories: summative assessments focus on the value of a finished project; formative

assessments focus on data gathered during the process of reaching a goal or outcome (Colwell,

2006). A studio teacher's ongoing process of diagnosis, correction, and reevaluation is an

example of formative assessment in music. A student recital or jury exemplifies summative

assessment in music performance. The diagnostic test of cello technique designed for this study

is a formative assessment, in that it measures a student's performance ability at a certain point on

a continuum that leads to mastery.

This literature review is divided into three parts. Part One examines the philosophical

foundation for this study. Part Two explores assessment theory and provides the theoretical

bases for this research. Part Three reviews research in assessment with particular emphasis on

performance.

Part One: Philosophical Rationales

A philosophic rationale is the bedrock upon which any scholarly inquiry is made. Reimer

(2003) succinctly describes its importance:

The "Why" questions-the questions addressed by philosophy-are the starting point for
all conceptualizations of education, whether in music, other subjects, or education as a
whole. Answers to these questions-questions of value-provide the purposes of
education, purposes dependent on what people in a culture regard to be so important that
education must focus on them (p. 242).

These questions must be asked not only of a given educational curriculum but also of the

means chosen for evaluation of material taught. Simply asking ourselves, "How do we

determine what we know?" brings our educational materials and pedagogy into greater focus.









Subjective as well as objective information shape our systems of evaluation. As Boyle

and Radocy (1987) observe, subjective information tends to vary from observer to observer and

its value in informing decision making is limited. Objective information, by definition, is

relatively unaffected by personal feelings, opinions, or biases. Musical evaluation should not be

limited to gathering only objective data but should include subjective observations as well.

Although certain aspects of musical performance can be measured with scientific precision, such

as vibrato width or decibel levels, the complex multi-faceted nature of music makes the

reliability of any measure less than perfect. This observation need not discourage music

educators, but rather help them recognize the need for stronger objective criteria for evaluation.

A music educator's personal philosophy of assessment is not tangential to their work, but

an essential base from which to define and direct teaching. Brophy (2000) explains the need for

a philosophy of assessment:

A personal assessment philosophy is an essential element in the development of a general
teaching philosophy. Exploring one's reasons for being a music teacher should
inevitably reveal personal reasons and motivations for believing that assessment is
important, including why it is important. The depth of one's commitment to music
education as a profession is also a fairly reliable predictor of one's commitment to
assessment as an important aspect of the music program (p. 3).

Deciding what is important for students to learn and why it is important determines how one

will assess what students know. Attitudes toward assessment directly influence the content and

quality of teaching. Inevitably, a teacher's philosophy of assessment will be most influenced by

how he or she was taught and evaluated as a student. This may help explain the range of

attitudes noted by Colwell (2006):

Evidence from learning psychology reveals that assessment properly conducted makes a
major difference in student learning and when incorrectly used, a corresponding negative
effect. The current hype, however, has not produced much action in the United States,
Canada, or Great Britain. To many music educators, assessment is so much a part of
instruction-especially in achieving goals in performance-that they do not believe more









is needed. Other music educators believe that any assessment is inappropriate as either
too quantitative or too mechanical (p. 210).

That some applied music teachers believe that they have no need for methods to assess

technique beyond their own listening skill is understandable. Most have spent their lives refining

evaluative skills: first, of their own playing, and then that of their students. These teachers may

feel it insulting to suggest that a test is better than they are at diagnosing a student's strengths and

weaknesses. However, these same teachers would not think twice about having a diagnostic test

of their car's electrical system if it were acting strangely. If a diagnostic test of cello technique

could be shown to give a reasonably accurate and rapid assessment of a student's playing level

and particular needs, skeptical teachers might come to appreciate the test's pragmatic value.

Aristotle in his Politics stated what is implied by every music school faculty roster: "It

is difficult, if not impossible, for those who do not perform to be good judges of the performance

of others" (p. 331). These philosophic roots may help to explain why teachers of applied music

are almost always expected to be expert performers. Skills of critical listening required of a

teacher must be refined and molded in the furnace of performance; these listening skills are the

essential abilities that a music teacher cannot do without. Because music performance involves

competence in the cognitive, affective, and psychomotor domains of learning, authentic

assessment must extend beyond single-criterion, bi-level tests of the type appropriate for math or

spelling. No single test can measure all factors that go into a performance; at best a single test

may evaluate only a few aspects of a student's playing.

Two contemporary philosophical views on the role of evaluation in music are those of

Bennett Reimer (1989/2003), and David Elliott (1995). Though these scholars share many beliefs

about the role and value of universal music education, they represent two poles of thought









regarding the best way to achieve a musically fluent society. Their differences are philosophic

and concern a definition of the very nature of music.

Bennett Reimer

Reimer makes an important claim when discussing evaluation in music. After raising the

question: "By what criteria can those who partake of the work of musicians evaluate that work,"

he asserts, "...the same criteria applied to their work by musicians all over the word are the

criteria that can be applied to evaluating the results of their work' (pp. 266-267). For example, if

certain technical skills are required for a musically satisfying performance, these same skills can

and should be criteria for evaluation. Reimer's use of the term craft comes close to what

musicians mean when they speak of technique:

Craft, the internalization within the body of the ways and means to make the sounds the
music calls on to be made, is a foundational criterion for successful musicianship. This is
the case whether the musician is a first grader "being a musician," a seasoned virtuoso, or
anything in between. It is the case whatever the music, of whatever style or type, from
whatever culture or time (p. 266).

What is universal is the craft of music making, in all its varieties. However, the expression of

that craft is very distinct: "But crucially what counts as craft is particular to the particular music

being evaluated" (p. 266). Reimer's argument seems to support the validity of designing

assessment measures that are instrument, and even genre, specific.

Bennett Reimer notes: "... everything the music educator does in his job is carrying out

in practice his beliefs about his subject" (Reimer, 1970, p. 7). It is important that the

pedagogical approach a teacher uses reinforces his or her philosophical belief about why we do

what we do in music. If we believe, as Reimer does, that we are fundamentally teachers of

aesthetics through the medium of music, then every aspect of our work should support and

defend this view rather than detract from it.









Instrumental technique is a means to an end, not the end itself. Certainly the virtuosic

pyrotechnics required for some pieces blur this distinction, but by and large most teachers

would be quick to acknowledge that complete absorption with the mechanics of playing is a

recipe for burn-out and loss of the joy of music-making. Cello teacher Fritz Magg observed that

'Calisthenics' literally comes from two Greek words: kalos, which means beautiful, and sthenos,

which means strength (Magg, 1978, p. 62). Accepting the principle that the development of

'strength' is a requisite for expression of 'the beautiful' serves as a rationale for designing a test

to assess technique.

Reimer believes that past and present attempts of assessment have two crucial flaws

(2003). First, they are not tailored to a specific musical activity, making the false assumption

that what is tested for is applicable to any and all musical involvements. Reimer states, "The

task for the evaluation community... is to develop methodologies and mechanisms for identifying

and assessing the particular discrimination and connections required for each of the musical

roles their culture deems important" (p. 232). Just as Gardner (1983) brought to our attention the

need to define distinct kinds of intelligence, Reimer cautions that we should be wary of assuming

direct transfer of musical intelligence from role to role.

The second weakness of music testing according to Reimer is its almost exclusive

concentration on measuring the ability to discriminate, thereby neglecting to examine the

necessary connections among isolated aspects of musical intelligence (2003). The question of

how meanings are created through connections has been largely ignored, he suggests. This may

be partially attributed to heavy dependence on objective measurement in music research.

Qualitative studies may be better suited for this purpose. Reimer notes that many recent studies

in cognitive science may be applicable to musical evaluation.









David Elliott

Elliott (1995) makes a clear distinction between evaluation and assessment. He notes,

"The assessment of student achievement gathers information that can benefit students directly in

the form of constructive feedback". He sees evaluation as "being primarily concerned with

grading, ranking, and other summary procedures for purposes of student promotion and

curriculum evaluation" (p. 264). For Elliott, however, achieving the goals of music education

depends on assessment. He describes the primary function of assessment as providing accurate

feedback to students regarding the quality of their growing musicianship. "Standards and

traditions" are the criteria by which students are measured in determining how well they are

meeting musical challenges. Elliott leaves it to the reader to define what these standards and

traditions are and more specifically what means are used to determine their attainment.

Elliott's concept of assessment is one of supporting and advancing achievement over

time, noting "the quality and development of a learner's musical thinking is something that

emerges gradually" (p. 264). Elliott is concerned with the inadequacy of an assessment which

focuses on the results on a student's individual thinking at a single moment in time. Real

assessment of a student's development occurs when he or she is observed making music

surrounded by "musical peers, goals, and standards that serve to guide and support the student's

thinking" (p. 264).

Regarding evaluation, Elliott is unequivocal: "...there is no justification for using

standardized tests in music" (p. 265). He sees conventional methods of evaluation as

inappropriate in music because they rely on linguistic thinking. Like Gardner, Elliott insists that

an assessment, if it is to be intelligence-fair, must be aimed directly at the student's artistic

thinking-in-action.









To summarize, Elliott sees assessment as a process-oriented approach to teaching, using

constructive feedback embedded into the daily acts of student music making. Music is something

that people do; music assessment must then occur in the context of music making.

Comparing and Contrasting the Philosophic Viewpoints of Reimer and Elliott

The crux of the difference in music philosophies of Reimer and Elliott revolves around

the role of performance. Elliott sees all aspects of music revolving around the central act of

performing. As stated by Elliott, "Fundamentally, music is something that people do" (Elliott,

p. 39, italics in original). Reimer notes that processes (music making) produce products (integral

musical works) and that, "performance is not sufficient for doing all that music education is

required to do, contrary to what Elliott insists" (Reimer, p. 51). Reimer sees performance as only

one of several ways musical knowledge is acquired, as opposed to being the essential mode of

musical learning. Elliott defines assessment of student achievement as a means of gathering

information that can be used for constructive feedback. He also values it as a means to provide

useful data to teachers, parents, and the surrounding educational community (p. 264).

However, Elliott is uncomfortable with any use of testing that simply focuses on a

student's thinking at one moment in time. One can imagine him acknowledging the value of a

diagnostic performance test, but only if it were part of a continuum of evaluations. Elliott's

insistence on the central role of performance prevents him from recognizing the value in a

critique of a musician's abilities at a given moment in time. Reimer sees the act of performing

composed music and improvisation as one requiring constant evaluation. Because he is willing to

acknowledge musical products (form) separately from the act of creating or regenerating, he asks

a more incisive question: "By what criteria can those who partake of the work of musicians

evaluate that work?" (p. 265). Considering the myriad styles, types and uses of music, Reimer









concludes that criteria for judging music must be distinctive to each form of music and therefore

incomparable to one another (p. 266). Reimer softens his stance by providing examples of

universal criteria: that is, criteria applicable to diverse musical forms. He does insist, however,

that they must be applied distinctively in each case:

Assessment of musical intelligence, then, needs to be role-specific. The
task for the evaluation community (those whose intelligence centers on issues of
evaluation) is to develop methodologies and mechanisms for identifying and assessing
the particular discrimination and connections required for each of the musical roles their
culture deems important. As evaluation turns from the general to the specific, as I
believe it urgently needs to do, we are likely to both significantly increase our
understandings about the diversities of musical intelligence and dramatically improve
our contribution to helping individuals identify and develop areas of more and less
musical capacity (p. 232).

Reimer accepts the view that there is a general aspect of musical intelligence, but

suggests that it takes its reality from its varied roles. This allows him to see evaluation in music

as a legitimate aspect of musicianship, part of the doing of music that Elliott insists on. His

philosophic position supports creating new measures of musical performance, especially as they

bring unique musical intelligence to light and aid in making connections across diverse forms of

music making.

Part Two: Theoretical Discussion

Assessment in Music: Theories and Definitions

Every era has a movement or event that seems to represent the dynamic exchange

between the arts and the society of that time. Creation of the National Standards for Art

Education is one such event. The Goals 2000: Educate America Act defined the arts as being part

of the core curriculum in the United States in 1994. That same year witnessed the publication of

Dance, Music, Theatre, Visual Arts: What Every Young American Should Know and Be Able to Do

in the Arts (MENC, 1994). It is significant that among the nine content standards, number seven









was: Evaluating music and music performances. Bennett Reimer, one of the seven music

educators on the task force appointed to write the document, discusses the central role of

evaluation in music:

Performing composed music and improvising require constant evaluation, both during the
act and retrospectively. Listening to what one is doing as one is doing it, and shaping the
sounds according to how one judges their effectiveness (and effectiveness), is the primary
doing-responding synthesis occurring within the act of creating performed sounds
(Reimer, 2003, p. 265).

Central to success is the ability to assess one's work. This assessment includes all of the content

standards, including singing, performing on instruments, improvising, and composing.

Evaluation is the core skill that is required for self-reflection in music. When a student is

capable of self evaluation, to some extent teachers have completed their most important task.

Reimer sees the National Standards as the embodiment of an aesthetic ideal, not merely a

tool to give the arts more legislative clout:

The aesthetic educational agenda was given tangible and specific formulation in the
national content standards, and I suspect that the influence of the standards will continue
for a long time, especially since their potential for broadening and deepening the content
of instruction in music education has barely begun to be realized (p. 14).

Reimer and the other members of the task force were given an opportunity to integrate a
philosophy into the national standards that values music education. With this statement they
articulated a philosophy defending the scholastic validity of the arts:

The Standards say that the arts have "academic" standing. They say there is such a thing
as achievement, that knowledge and skills matter, and that mere willing participation is
not the same thing as education. They affirm that discipline and rigor are the road to
achievement-if not always on a numerical scale, then by informed critical judgment
(MENC, 1994, p. 15).

Such statements are necessary in a culture that perniciously sees the arts as extracurricular

activities and not part of the core educational experience of every child.

Reimer has provided a philosophical foundation for assessment in the arts. Others, like

Lehman (2000), observe that, "Our attention to this topic is very uneven. It is probably fair to









say that in most instances evaluation is treated in an incidental manner and is not emphasized in

a systematic and rigorous way" (Lehman, pp. 5-6). As the standards movement grows, fueled by

greater interest in achievement testing in the arts, it is likely that this attitude will change.

Lehman describes how he sees the emerging role of music assessment:

I believe that the standards movement has set the stage for an assessment movement, and
I believe that assessment may become the defining issue in music education for the next
decade. Developing standards and defining clear objectives that flow naturally from
standards make assessment possible where it was often not possible before. But
standards do more than make assessment possible. They make it necessary. Standards
have brought assessment to the center of the stage and have made it a high-priority, high-
visibility issue. Standards and assessment inescapably go hand in hand. We cannot have
standards without assessment (p. 8).

Furthermore, we cannot have assessment without tests that are designed to measure all

kinds of music making, whether it be in bands, orchestras, choirs, or jazz ensembles. Included in

this list should be assessment of individual performance. New ways of more objectively

determining achievement in individual performance are greatly needed.

The need for assessment measures capable of assessing the multiple intelligences present

in the arts has been articulated:

Although some aspects of learning in the arts can be measured adequately by paper-and-
pencil techniques or demonstrations, many skills and abilities can be properly assessed
only by using subtle, complex, and nuanced methods and criteria that require a
sophisticated understanding. Assessment measures should incorporate these subtleties,
while at the same time making use of a broad range of performance tasks (Reimer, p. 15).

When Reimer observes that assessment in the arts is a complex task with subtle shades of

meaning, he is alluding to the ill-structured quality of many of the subject content domains in

music. Spiro, Vispoel, Schmitz, Samarapungavan, and Boerger (1987) define ill-structured

domains as content areas where "there are no rules or principles of sufficient generality to cover

most of the cases, nor defining characteristics for determining the actions appropriate for a given

case" (p. 184, as quoted Brophy, p. 7). Criteria for judgment in performance, therefore, must be









tailored to the idiosyncrasies of the particular instrument, its role as a solo or ensemble member,

the age and/or playing level of student, and the purpose of assessment.

Constructivism and Process/Product Orientation

Brophy defines the constructivist view of knowledge as those situations in which students

draw upon previous experience to understand new situations (2000, p. 10). This occurs when

teachers assess something specific like cello technique. Students are asked to transfer knowledge

and psychomotor skills from one context (previous playing experience) to another (performing

new or unfamiliar excerpts). Constructivist theory coincides with one of the definitions of

technique used in this research: the ability to transfer knowledge and performance skills

previously learned to new musical material.

Process-orientation tends to be aligned with a constructivist approach. Inquiry into new

areas of knowledge and understanding does not necessarily have a predetermined outcome.

Learning occurs during the process of exploration. Methods of evaluation in music and

elsewhere have tended to be product-oriented. The need to objectively quantify what has been

learned is an ongoing problem in the arts.

The desire to evaluate student achievement in relation to the attainment of pre-specified

objectives led to the creation of criterion-referenced or objective-referenced tests. These tests

evaluate achievement in relation to specific criteria rather than through comparing one student to

another (Boyle and Radocy, pp. 9-10). These tests, however, have been criticized for measuring

verbal intelligence rather than authentic music making (Elliott, pp. 75-76). It is possible,

however, for tests to be designed that measure components of both the process (technique) and

product (complete musical statement) of making music. Diagnostic tests that evaluate students

as they progress through increasing challenges may give the teacher insight regarding the









students' cognitive and psychomotor abilities. Thus, a diagnostic test in music can be designed

to evaluate both process and product.

Definitions

To understand the theoretical rationale behind the evaluation of music ability, terminology

must be clear. The term test refers to any systematic procedure for observing a person's

behavior relevant to a specific task or series of tasks. Measurement is a system designed to

quantify the extent to which a person achieves the task being tested. In music, testing usually

involves some form of a scoring system or rating scale. Evaluation means making judgments or

decisions regarding the level of quality of a music behavior or of some other endeavor (Boyle,

1992). The ideal evaluation model has a strong objective data component but also encompasses

subjective, enlightened judgments from experienced music teachers (Boyle, p. 247). Boyle

and Radocy claim that evaluative decisions are best made when, "decision makers (a) have a

strong relevant information base, including both subjective and objective information, (b)

consider affective and, where appropriate, aesthetic reactions of (or to) the individual, group, or

endeavor being evaluated, and (c) be made with the primary goal of improving the quality of the

learner's educational experiences" (1987, p. 8). True evaluation must provide information that

enhances the educational experience and does not simply provide data for the purpose of

assigning grades, for determining who is allowed to play, or what the student's chair placement

will be.

A diagnostic test is one which focuses on the present and is used to classify students

according to their strengths and weaknesses relative to given skills or knowledge (Boyle and

Radocy, p. 10). Such a test can be used to (a) group students for instruction or (b) provide

individualized instruction that corrects errors or challenges the learner. The diagnostic test of









cello technique created for this study is designed to serve the latter purpose. It falls into the

category of a narrow content focus test, which is defined as intensive in nature (Katz, 1973).

This type of test is appropriate for judging an individual's strengths and weaknesses. It allows

for intra-individual comparisons, such as ability levels of differing skills. Intensive tests provide

the basis for remedial instruction, as well as providing indications of the means of improving areas

of weakness.

The purpose of a test largely determines what type of test needs to be chosen or

constructed for assessment purposes. If a test's primary purpose is to discriminate among

individuals, then the test is norm-referenced (Boyle and Radocy, p. 75). An individual

performance is judged in comparison to the performances of his or her peers. This type of test is

appropriate for making comparisons among individuals, groups or institutions.

"Criterion-referenced tests describe student achievement in terms of what a student can

do and may be evaluated against a criterion or absolute standard of performance" (Boyle, p.

253). Such a test is ideally suited to individual performance; the challenge for this test is how to

establish the criteria to be used as a standard. If a performance evaluation uses excerpts

accurately revealing a student's ability in demonstrating specific tasks, then that test has good

content validity; the test materials coincide with the skills being tested.

The focus of performance assessment may be global, i.e. a judgment of its totality, or

specific, i.e. a judgment of only particular aspects of performance. A diagnostic test would be

expected to use criteria that reveal specific aspects of performance, although the evaluation could

still include global statements about overall playing ability. The use of global and specific

approaches is explored in the review of literature at the end of this chapter.









Part Three: Research

The field of testing in string instrument performance is remarkably uncultivated.

However, there is a growing body of literature dealing with performance assessment in general,

and this writing has many implications for the problem addressed in this study. Examination of

this literature will begin with a survey of research in solo instrumental performance, noting the

specific aspects of performance measured and the approaches used. An exploration of the use of

factor analysis as a means of achieving high reliability and criterion-related validity will follow.

This section will close with a review of the research in measurement of string performance.

The Measurement of Solo Instrumental Music Performance

John Goodrich Watkins

The earliest known research in the area of solo instrumental performance was carried out

by Watkins (1942) for his doctoral dissertation at Teachers College, Columbia University.

Watkins constructed an objectively scored cornet rating scale. For this he composed 68 melodic

exercises based on selected cornet methods. Four equivalent forms of the test were designed,

each containing sixteen melodies of increasing difficulty. The measure was established as the

scoring unit and was considered to be played incorrectly if any errors of pitch, time, change of

tempo, expression, slur, rests, holds and pauses, or repeats occurred. After administering the

four preliminary test forms to 105 students, he used item analysis to construct two final forms of

the test. Equivalent forms and test-retest reliability coefficients were high (above .90).

Following this research, Watkins developed the Watkins-Farnum Performance Scale

(WFPS) (1954) for wind instruments and snare drum. This scale, along with the subsequently

constructed Farnum String Scale (Farnum, 1969), constitutes the only readily available

performance measure. As with the Watkins cornet study, this test, administered individually,









requires the performance of a series of passages of increasing difficulty. The student plays with

the aid of a metronome, continuing through the exercises until he or she scores zero in two

consecutive exercises. Again, the scoring unit is the measure, and the examiner is given a

detailed explanation of what constitutes an error. Two equivalent forms were constructed and

153 instrumentalists were tested. Correlations between Form A and Form B of the test have

ranged from .84 to .94. Criterion-related validity based on rank-order correlations ranged

from .68 for drum to .94 for cornet and trumpet.
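
The scoring logic described above lends itself to a brief illustration. The sketch below (Python, with invented error counts, not the published scale's materials) applies the measure-as-scoring-unit rule and the stopping rule of two consecutive zero-scored exercises.

    # Each exercise is a list of per-measure error counts. A measure earns one
    # point only if it contains no errors; testing stops once two consecutive
    # exercises score zero.
    def score_performance(exercises):
        total = 0
        consecutive_zeros = 0
        for measure_errors in exercises:
            exercise_score = sum(1 for errors in measure_errors if errors == 0)
            total += exercise_score
            consecutive_zeros = consecutive_zeros + 1 if exercise_score == 0 else 0
            if consecutive_zeros == 2:  # stopping rule
                break
        return total

    # Hypothetical data: four-measure exercises of increasing difficulty
    print(score_performance([[0, 0, 0, 1], [0, 2, 1, 0], [1, 2, 3, 2], [2, 3, 2, 1]]))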

Concerns have been raised about how well-suited the examples are for particular

instruments (Boyle and Radocy 1987). Some dynamic markings appear artificial and no helpful

fingerings are provided for technical passages. There is no attempt to measure tone quality,

intonation, or musical interpretation. The latter is an inherently subjective judgment but

nevertheless a critical part of an assessment of musical performance. As a result, the test's

content validity has been questioned (Zdzinski and Barnes, 2002).

The WFPS contains highly specific directions for scoring aspects of playing that

teachers can all agree upon. As a result, it continues to be used by default, as no other measure

provides a similar level of objectivity. A number of investigators have used the WFPS as a

primary measurement tool for their research. Boyle (1970), in an experimental study with junior

high wind players, demonstrated that students who practiced reading rhythms by clapping and

tapping the beat showed significantly greater improvement as measured by the WFPS. More

recently Gromko (2004) investigated relationships among music sight reading as measured by

the WFPS and tonal and rhythmic audiation (AMMA, Gordon, 1989), visual field articulation

(Schematizing Test, Holzman, 1954), spatial orientation and visualization (Kit of Factor-

Referenced Cognitive Tests, Ekstrom et al., 1976), and academic achievement in math concepts









and reading comprehension (Iowa Tests of Educational Development, Hoover, Dunbar, Frisbie,

Oberley, Bray, Naylor, Lewis, Ordman, and Qualls, 2003). Using a regression analysis,

Gromko determined the smallest combination of variables that predicted music sight-reading ability, as

measured by the WFPS. The results were consistent with earlier research, suggesting that music

reading draws on a variety of cognitive skills including visual perception of patterns rather than

individual notes.
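
As a simplified illustration of this kind of analysis, the sketch below (Python, with simulated data and invented variable names, not Gromko's dataset) compares the variance in a sight-reading score explained by successively larger sets of cognitive predictors.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Simulated scores for 98 hypothetical players.
    rng = np.random.default_rng(0)
    n = 98
    audiation = rng.normal(size=n)
    visual_field = rng.normal(size=n)
    reading_comp = rng.normal(size=n)
    sight_reading = 1.5 * audiation + 0.8 * visual_field + rng.normal(scale=1.0, size=n)

    models = [("audiation only", [audiation]),
              ("audiation + visual field", [audiation, visual_field]),
              ("all three predictors", [audiation, visual_field, reading_comp])]
    for label, predictors in models:
        X = np.column_stack(predictors)
        r2 = LinearRegression().fit(X, sight_reading).score(X, sight_reading)
        print(f"{label}: R-squared = {r2:.2f}")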

The WFPS has its greatest validity as a test for sight reading. Sight reading is a composite

of a variety of skills, some highly specialized. Using only this test to rank students on

musicianship, technique or aptitude would be inappropriate, however. This test design reveals a

certain degree of artificiality; the use of the measure as a scoring unit and the choice to ignore

pauses between measures are somewhat contrived. Nevertheless, Watkins and Farnum

succeeded in developing the most reliable and objective performance testing instrument in their

day.

Robert Lee Kidd

Kidd (1975) conducted research for his dissertation concerning the construction and

validation of a scale of trombone performance skills at the elementary and junior high school

levels. His study exemplifies a trend toward more instrument-specific research. Kidd focused

on the following questions:

* What performance skills are necessary to perform selected and graded solo trombone
literature of Grades I and II?
* What excerpts of this body of literature provide good examples of these trombone
performance skills?
* To what extent is the scale a valid instrument for measuring the performance skills of
solo trombonists at the elementary and junior high school level?
* To what extent is the scale a reliable instrument?









Solos from the selective music lists of the National Interscholastic Music Activities

Commission of the MENC were content analyzed, and 50 performance skills were identified

coinciding with range, slide technique and articulation. Each skill was measured by four

excerpts and administered to 30 junior high school trombonists. These performances were taped

and evaluated by three judges. Results from this preliminary form of the measurement were

analyzed, providing two excerpts per skill area. Equivalent forms of the measure were created,

each using one of the two excerpts selected. This final version was administered to 50 high

school students. Interjudge reliability coefficients were .92 for form A and .91 for form B.

Equivalent forms reliability was found to be .98. Validity coefficients ranged from .77 to 1.0 for

both forms. Zdzinski (1991, p.49) notes that the use of a paired-comparison approach rather than

the use of teacher rankings may have affected validity coefficients.

Kidd concluded that the Scale of Trombone Performance Skills would be useful to

instrumental music educators in their appraisal of the following areas of student progress:

guidance, motivation, improvement of instruction and program, student selection, maintenance of

standards, and research. Kidd recognized that the time requirement (thirty-six minutes for

administration, twenty-one minutes for judging, and nine minutes for scoring) could make this

version of the scale impractical in a public school situation and acknowledged that some

modifications in the administration and scoring procedures could facilitate the extent of the

scale's use (pp. 93-94).

Janet Mills

Mills (1987) conducted an investigation to determine to what extent it was possible to

explain current assessment methods for solo music performances. In a pilot study, she chose six

instrumental music students, aged 15 years or above, who were capable of performing grade-









eight music from a British graded music list. Videotapes were made of their performances and

these were scored by 11 judges. Judges were asked to write a comment about each performance

and give it a mark out of 30 based on the scale of the Associated Board of the Royal Schools of

Music. Two adjudicating groups were formed consisting of: 1) Music teachers and music

specialist students, and 2) Nonspecialists with experience of musical performance. After the

judging occurred, judges were interviewed about the evaluative criteria. From these interviews,

the following 12 statements or constructs were generated:

* The performer was Nervous/Confident
* The performer Did not enjoy/Did enjoy playing
* The performer Hardly knew/Was familiar with the piece
* The performer Did not make sense/Made sense of the piece as a whole
* The performer's use of dynamics was Inappropriate/Appropriate
* The performer's use of tempi was Inappropriate/Appropriate
* The performer's use of phrasing was Inappropriate/Appropriate
* The performer's technical problems were Distracting/Hardly noticeable
* The performance was Hesitant/Fluent
* The performance was Insensitive/Sensitive
* The performance was Muddy/Clean
* I found this performance Dull/Interesting

In the main part of her study, phase two, Mills taped ten performances, again dividing her

29 judges into the groupings previously mentioned. Judging was done using both the original

30-point overall rating (with comments) and the newly created criteria. Inter-item

correlations and correlations among marks on the 30-point scale were all positive. Correlations

between overall marks and individual items were all negative. Because of the small sample size,

no data on significance could be provided. Nevertheless, this study demonstrates a well designed

method for examining criterion-related validity of newly created evaluative statements with an

existing performance measurement.









The Use of Factor Analysis in Performance Measurement

The tests discussed so far, and others like them, have a fundamental problem with

reliability; the measures employed were typically subjective judgments based on uneven and

unspecified observations. It became increasingly clear to researchers that greater attention

needed to be focused on systematically objectifying the methods used in musical evaluation.

The use of rating scales to replace or substantiate judges' general impressions is an approach that

has been explored by several researchers. Factor analysis of descriptive statements generated for

assessment became an important technique for improving content validity and interjudge

reliability.

Factor analysis comprises a number of techniques that can be used to study the

underlying relationships between large numbers of variables. Common factor analysis reveals

the factors that are based on the common or shared variance of the variables (Asmus and

Radocy, 2006). All methods of factor analysis seek to define a smaller set of derived variables

from a larger collection of data. When applied to performance evaluation, factor analysis can

help to determine common evaluative criteria systematically. Potential benefits include

increased content validity and greater interjudge reliability. The groundbreaking work of Abeles

in the use of factor analysis to develop a highly reliable and valid performance scale for clarinet

led other researchers to use factor analysis in designing their scales. The following studies are

examples of the application of factor analysis to performance measurement.
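To make the procedure concrete, the sketch below (Python, using simulated judges' item ratings rather than data from any of the studies reviewed) shows how a large pool of rated items can be reduced to a small number of underlying factors, whose loadings can then guide item selection.

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    # Simulate 100 performances rated on 30 items, driven by two latent
    # factors (e.g., a tone-related factor and a rhythm-related factor).
    rng = np.random.default_rng(1)
    latent = rng.normal(size=(100, 2))
    loadings = rng.normal(size=(2, 30))
    ratings = latent @ loadings + rng.normal(scale=0.5, size=(100, 30))

    fa = FactorAnalysis(n_components=2).fit(ratings)
    # Items with a high loading on one factor and low loadings elsewhere are
    # the candidates retained for that factor's subscale.
    print(np.round(fa.components_.T[:5], 2))  # loadings of the first five items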

Harold F. Abeles

Abeles' (1973) research in the development and validation of a clarinet performance

adjudication scale grew from a desire to replace a judge's general impressions with more

systematic procedures. He turned to rating scales because they would allow adjudicators to base









their decisions on a common set of evaluative dimensions rather than their own subjective

criticisms.

In the first phase of the study, 94 statements were generated through content analyses of

essays describing clarinet performance. These statements were also formulated through a list of

adjectives gathered from several studies which described music performance. Statements were

paired with seven a priori categories: tone, intonation, interpretation, technique, rhythm, tempo,

and general effect. The statements were then transformed to items phrased both positively and

negatively; items that could be used by instrumental music teachers to rate actual clarinet

performances. Examples from this item pool are: 1. The attacks and releases were clean. 2. The

clarinetist played with a natural tone. 3. The clarinetist played flat in the low register. The items

were randomly ordered and paired with a five point Likert scale, ranging from "highly agree" to

"highly disagree."

Factor analysis was performed on the evaluation of 100 clarinet performances using this

scale. Six factors were identified: interpretation, intonation, rhythm continuity, tempo,

articulation, and tone, with five descriptive statements to be judged for each factor. The final

form of the Clarinet Performance Rating Scale (CPRS) was comprised of items chosen on the

basis of having high factor loadings on the factor they were selected to measure and low factor

loadings on other factors. The thirty statements chosen were grouped by factors and paired with a

five-point Likert scale. Ten taped performances were randomly selected and rated using the

CPRS by graduate instrumental music education students. For the purpose of determining

interjudge reliability, judges were divided into groups of 9, 11 and 12 judges. Item ratings from

these judges were again factor analyzed to determine structure stability.









Abeles found that the six-factor structure produced from the factor analysis was

essentially the same as the a priori theoretical structure. This suggested good construct validity.

He concluded that this structure would be appropriate for classifying music performance in

general, as none of the factors seemed to reflect idiosyncratic clarinet characteristics. On the

other hand, Zdzinski (2002) found that the factors identified to assess stringed instrument, wind

instrument and vocal performance are distinct and related to unique technical challenges posed

by each performance area.

The interjudge reliability estimates for the CPRS were consistently high (.90). Individual

factor reliabilities ranged from .58 to .98, with all factors but tone and intonation above .70.

Criterion-related validity based on correlations between CPRS total scores and judges' ratings

were .993 for group one, .985 for group two, and .978 for group three. Predictive validity (<.80)

was demonstrated between the CPRS and global performance ratings.

Martin J. Bergee

The development of a rating scale for tuba and euphonium (ETPRS) was the focus of a

doctoral dissertation by Bergee (1987). Using methods similar to Abeles, Bergee paired

descriptive statements from a literature, adjudication sheets and essays with a Likert scale to

evaluate tuba and euphonium performances. Judges' initial responses led to identification of five

factors. A 30-item scale was then constructed based on high factor loadings. Three sets of ten

performances were evaluated by three panels of judges (N= 10) using the rating scale. These

results were again factor analyzed, resulting in a four-factor structure measuring the items:

interpretation/musical effect, tone quality/intonation, technique, and rhythm/tempo.

Interestingly, factor analysis produced slightly different results than in Abeles' Clarinet

Performance Adjudication Scale. Technique was unique to this measure, while articulation was









unique to the Abeles measure. Abeles' measure also isolated tone quality and intonation as

independent items. The idiomatic qualities of specific instruments or families of instruments

may result in the use of unique factors in performance measurement.

Interjudge reliability for the ETPRS was found to be between .94 and .98, and individual

factor reliabilities ranged from .89 to .99. Criterion-related validity was determined by

correlating ETPRS scores with global ratings based on magnitude estimation (.50 to .99).

ETPRS scores were also correlated with a VMENC-constructed wind instrument adjudication

ballot resulting in validity estimates of .82 to .99.

The Development of a Criteria-Specific Rating Scale

T. Clark Saunders & John M. Holahan

Saunders and Holahan (1997) investigated the suitability of criterion-specific rating

scales in the selection of high school students for participation in an honors ensemble. Criteria-

specific rating scales differ from traditionally used measurement tools in that they include

written descriptors of specific levels of performance capability. Judges are asked to indicate

which of several written criteria most closely describes the perceived level of performance

ability. They are not required to express their like or dislike of a performance or decide if the

performance meets an indeterminate standard.

In this study, criterion-specific rating scales were used by 36 judges in evaluating all 926

students seeking selection to the Connecticut All-State Band. These students were between

grades 9-12 and enrolled in public and private high schools throughout the state of Connecticut.

Only students who performed with woodwind and brass instruments were examined in this

study, because the judges were able to use the same evaluation form. The 36 adult judges

recruited in this study were comprised of elementary, secondary, and college-level instrumental









music teachers from Connecticut. All had a minimum of a bachelor's degree in music education

and teacher's certification.

Three aspects of student performances were examined: solo evaluation, scales, and sight

reading. The following specific dimensions of instrumental performance were assessed:

* Solo Evaluation: Tone, Intonation, Technique/Articulation, Melodic Accuracy, Rhythmic
Accuracy, Tempo, and Interpretation
* Scales: Technique, Note Accuracy, and Musicianship
* Sight-Reading: Tone, Note Accuracy, Rhythmic, Technique/Articulation, and
Interpretation

For each performance dimension, a five-point criteria-specific rating scale was constructed

using either "continuous" (sequentially more demanding performance criteria) or "additive"

(nonsequential performance criteria). Each of the criteria were chosen to describe a specific

level of music skill, content, and technical achievement. The Woodwind/Brass Solo evaluation

was comprised of 11 continuous rating scales and four additive rating scales. The overall level

of performance achievement for each student was derived from the sum of the scores for each of

the performance dimensions.
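As a concrete illustration, the sketch below (Python, with invented descriptors and scores rather than the Connecticut materials) shows the shape of a "continuous" five-point criteria-specific scale for one dimension and the summing of dimension scores into an overall achievement level.

    # A "continuous" five-point scale: each level carries a written descriptor
    # of performance capability (the descriptors here are invented).
    TONE_CRITERIA = {
        5: "Tone is characteristic, centered, and controlled in all registers.",
        4: "Tone is characteristic and controlled in most registers.",
        3: "Tone is acceptable but inconsistent across registers.",
        2: "Tone is frequently unfocused or forced.",
        1: "Tone is uncharacteristic of the instrument.",
    }

    def overall_achievement(dimension_scores):
        """Overall achievement is the sum of the scores on each dimension."""
        return sum(dimension_scores.values())

    ratings = {"Tone": 4, "Intonation": 3, "Technique/Articulation": 4,
               "Melodic Accuracy": 5, "Rhythmic Accuracy": 4, "Tempo": 3,
               "Interpretation": 4}
    print(overall_achievement(ratings))  # prints 27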

The observed means and standard deviations indicated that judges found substantial

variation in the performances in each dimension and for each instrument. Despite the relative

homogeneity of the student sample, judges demonstrated a high level of variability. Students

were provided specific information about levels of performance strengths and weaknesses. The

median alpha reliability among the 16 instruments was .915, suggesting that there was a

sufficient level of internal consistency among judges. The correlations between each

performance dimension and the total score ranged from .54 to .75, with a median correlation of .73.

These correlations suggest that each scale dimension contributed substantial reliable variance to

the total score. Saunders and Holahan concluded that the pattern of correlations provided indirect









evidence of the validity of the criteria-specific rating scales for diagnosing the strengths and

weaknesses of individual performances. The researchers noted that because three kinds of

performances (prepared piece, scales, and sight-reading) were measured, factor analysis would

provide insight into the interdependence of performance dimensions across these types of

playing. Factor analysis would indicate the constructs that guide adjudicators in the evaluation

process as well.
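The two statistics reported above can be illustrated briefly. The sketch below (Python, with hypothetical ratings) computes coefficient alpha across the scored dimensions and each dimension's correlation with the total score.

    import numpy as np

    # Hypothetical ratings: rows are students, columns are scored dimensions.
    scores = np.array([[4, 5, 4, 3, 4],
                       [2, 3, 2, 2, 3],
                       [5, 5, 4, 5, 4],
                       [3, 2, 3, 3, 2],
                       [4, 4, 5, 4, 5]])

    k = scores.shape[1]
    total = scores.sum(axis=1)
    item_variances = scores.var(axis=0, ddof=1)
    alpha = (k / (k - 1)) * (1 - item_variances.sum() / total.var(ddof=1))

    dimension_total_r = [np.corrcoef(scores[:, j], total)[0, 1] for j in range(k)]
    print(f"coefficient alpha = {alpha:.2f}")
    print("dimension-total correlations:", np.round(dimension_total_r, 2))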

Saunders and Holahan's findings have implications for the present study. Their data

provide indirect evidence that criteria-specific rating scales have useful diagnostic validity.

Through such scales, students are given a diagnostic description of detailed aspects of their

performance capability, something that Likert-type rating scales and traditional rating forms

cannot provide. Such scales help adjudicators listen for specific aspects of a performance rather

than having them make a value judgment about the overall merits of a performance.

The Measurement of String Performance

Stephen E. Farnum

Because of the success obtained and reported with the Watkins-Farnum Performance

Scale, and its practical value as a sight-reading test for use in determining seating placement and

periodic measurement, it was suggested that a similar scale be developed for string instruments

(Warren, 1980). As a result, the Farnum String Scale: A Performance Scale for All String

Instruments (1969) was published. Both tests require the student to play a series of musical

examples that increase in difficulty. No reliability or validity information is provided in the

Farnum String Scale (FSS). The test manual describes four preliminary studies used to arrive at

a sufficient range of item difficulty. Initially Farnum simply attempted to transpose the oboe test

from the WFPS, but he found that there was an inadequate spread of difficulty. New exercises









were written, resulting in a final form of 14 exercises that are designed to evenly increase in

difficulty level.

Like the WFPS, the Farnum String Scale uses scoring based on measure-by-measure

performance errors. The performance errors that can be taken into account are as follows:

* Pitch Errors (A tone added or omitted or played on a wrong pitch)
* Time Errors (Any note not given its correct time value)
* Change of Time Errors (A marked increase or decrease in tempo)
* Expression Errors (Failure to observe any expression marks)
* Bowing Errors
* Rests (Ignoring a rest or failure to give a rest its correct value)
* Holds and Pauses (Pauses between notes within the measure are to be counted as errors)
* Repeats (Failure to observe repeat signs)

The Farnum String Scale manual does not indicate how to use test results, except for the

title page which states: "A Standard Achievement Test for Year to Year Progress Records,

Tryouts, Seating Placement, and Sight Reading" (1969). Grading charts are included as part of

the individual sheets.

Despite the extensive revision process, criticism has been leveled at this test by some,

suggesting that the bowings were not well thought out (Warren, 1980). In examining the

exercises written, the following problems are found: 1. bowings that require excessive retakes, 2.

bowings that are awkward, i.e. non-idiomatic, and 3. bowings that are ambiguous, or not clearly

marked. Clarity in bowing is a concern because bowing errors often lead to other errors,

especially in rhythm. In several of the exercises, arbitrary bowing decisions have to be made

when sight-reading. Since bowing is one of the tested items, students should not be required to

devise bowing solutions that are not clearly marked. Bowing ambiguity represents a flaw in the

test validity.

Boyle and Radocy observe that, "despite the criticisms that may be leveled against the

WFPS and the FSS, the tests do attain a certain amount of objectivity by providing highly









specific directions for scoring performance aspects about which most experienced teachers could

agree regarding correctness" (p. 176). These tests established a precedent for providing explicit

detail as to what constitutes an error in performance.

Stephen F. Zdzinski & Gail V. Barnes

Zdzinski and Barnes demonstrated that it was possible to achieve high reliability and

criteria-related validity in assessing string instrument performances. In their 2002 study, they

initially generated 90 suitable statements gathered from essays, statements, and previously

constructed rating scales. These statements were sorted into a priori categories that were

determined by previous research. As with the Abeles study, a Likert scale was paired with these

items. Fifty judges were used to assess one hundred recorded string performances at the middle

school through high school level. Results from the initial item pool were factor-analyzed using a

varimax rotation. Five factors to assess string performance were identified:

interpretation/musical effect, articulation/tone, intonation, rhythm/tempo, and vibrato. These

were found to be somewhat different from those of Abeles (1973) and Bergee (1987) in their scale

construction studies of woodwind and brass performance. This is not surprising, considering the

unique challenges of string instrument and woodwind instrument technique. String instrument

vibrato had items that were idiomatic for the instrument. Likewise, articulation and tone quality

are largely controlled by the right (bowing) side in string performance and were loaded onto a

single factor, as contrasted with wind instrument assessment scales. The authors found that

factors identified to assess string instrument, wind instrument, and vocal performance are

distinct, and related to unique technical challenges specific to the instrument/voice (Zdzinski,

p.253).









Twenty-eight items were selected for subscales of the String Performance Rating Scale

(SPRS) based on factor loadings. The reliability of the overall SPRS was consistently very high.

Reliability varied from .873 to .936 for each judging panel using Hoyt's analysis of variance

procedure. In two studies conducted to establish criterion-related validity, zero-order correlations

ranged from .605 to .766 between the SPRS and two other rating scales.

The researchers concluded that string performance measurement may be improved

through the use of more specific criteria, similar to those used in their study (Zdzinski, p. 254).

Such tools may aid the educator/researcher by providing highly specific factors to listen and

watch for when analyzing student performances.

Summary: Implications for the Present Study

Studies carried out in the measurement of instrumental music performance have

increased in reliability, validity, and specificity since the first standardized test for band

instruments, the Watkins-Farnum Performance Scale of 1954. Surprisingly, along with the

Farnum String Scale, this is still the only readily available published performance measure. One

can conjecture that the use of teacher-made tests accounts for this, but the more plausible

explanation is music teachers' distrust of any test that would claim to be capable of measuring a

subject as complex and multifaceted as music performance.

The use of descriptive statements that were found through factor analysis to have

commonly accepted meanings has been a significant development in increasing content validity

in performance measurement. As researchers applied the techniques pioneered by Abeles

(1973), they discovered that factors identified for one instrument or group of instruments did not

necessarily transfer directly to another instrumental medium. Statements about tonal production









on a clarinet may not have the same high factor loadings on a string instrument where tone

production is controlled primarily by bowing technique (Zdzinski, 2002).

Through factor analysis the reliability of the new measures improved. However, with

additional research came more questions. In the Abeles (1973) and Zdzinski (2002) studies, only

the audio portions of performances were analyzed by judges. The reasons these researchers

chose not to include visual input are not addressed in their studies, but the fact that they chose to

record results using audio only may have contributed to the higher reliability found in these

studies. Gillespie (1997) compared ratings of violin and viola vibrato performance in audio-only

and audiovisual presentations. Thirty-three inexperienced players and 28 experienced players

were videotaped while performing vibrato. A panel of experts rated the videotaped performances

and then six months later rated the audio-only portion of the performances on five vibrato

factors: width, speed, evenness, pitch stability, and overall sound. While the experienced

players' vibrato was rated higher regardless of what mode of presentation, results revealed

significantly higher audiovisual ratings for pitch stability, evenness, and overall sound for

inexperienced players and for pitch stability for experienced players. The implications are that

visual impressions may cause adjudicators to be less critical of the actual sound produced.

Gillespie notes: "The visual stimuli give viewers additional information about a performance that

can either be helpful or distracting, causing them to rate the performance differently than if they

had simply heard it." He adds, "If the members of the panel see an appropriate motion for

producing vibrato, they may rate the vibrato higher, regardless if the pitch drifts slightly"

(Gillespie, p. 218). At the very least, the study points out the need for the strictest possible

consistency in the content-format given to the judges to assess. If assessment is made from an









audiovisual source or a viewed live performance, the possible effects of visual influence on the

ratings need to be considered.

Concerns about content validity were uppermost in mind when choosing the excerpts for

the Diagnostic Test of Cello Technique. In the following chapter the development and validation

of these materials is discussed, as well as the measurement used to quantify the data from the

written and playing portions of the test.









CHAPTER 3
METHODOLOGY

The purpose of this study was to construct, validate and administer a diagnostic test of

cello technique for use with college-level students. This test is criterion-referenced and includes

both quantitative and qualitative measurements. This study was implemented in the following

stages: (a) development of an initial testing instrument, (b) administration of a pilot test, (c)

administration of a validity study, (d) administration of the final test, and (e) data analyses

procedures for the final test, including an interjudge reliability measurement. This chapter

describes the following methodological elements of the study: setting and participants,

instrumentation, data collection, data analysis, and validity and reliability procedures.

Setting and Participants

Approval for conducting this study was obtained first from the Institutional Review

Board (IRB) of the University of Florida. A copy of the informed consent letter is included in

Appendix D. The testing occurred at the respective schools of the participants, using studio or

classroom space during times reserved for this study.

College-level students (n = 30) were recruited for this study from three private and three

public universities in the southeastern region of the United States. While this sample does

not represent all regions of the United States, its variability was considered adequate for this test,

which was not concerned with regional variations in cello students, if such variations exist. The

participants selected were undergraduate cello students, both majoring and minoring in music.

This subject pool consisted of music performance majors (n = 16), music minors (n = 1), double

majors (n = 3), music therapy majors (n = 2), music education majors (n = 6), and music/pre-

med. students (n = 2). Using subjects from a diversity of academic backgrounds reflects the assumption that









this test has value as a diagnostic tool for students studying music through a wide variety of

degree programs, not just those majoring in performance.

A letter of introduction that explained the purpose of the study was mailed to the cello

faculty of the six schools. Upon receiving approval from the faculty cello teacher, the letter of

consent along with the Playing Test (Appendix G) was provided for each participant. One copy

of the consent form was signed and returned from each participating student. Following this,

times were arranged for each student to take the Written and Playing Test. Each student received

a copy of the Playing Test a minimum of two weeks before the test date. Included with the

Playing Test was a cover letter instructing the students to prepare all excerpts to the best of their

ability. Attention was directed toward the metronome markings provided for each of the

excerpts. Students were instructed to perform these excerpts at the tempos indicated, but not at

the expense of pitch and rhythmic accuracy.

Data Collection

The Written and Playing Test

Each participant met individually with the primary investigator for forty-five minutes.

The first thirty minutes of testing time was used for the Playing Test. Before beginning to

perform the Playing Test, students were asked to check their tuning with the pitch A-440

provided for them. Students were also asked to take a moment to visually review each excerpt

prior to performing it. Students were asked to attempt to play all the excerpts, even if some

seemed too difficult for them.

The primary investigator listened to and judged the individual student's skill level for

each performance. For each aspect of technique assessed, a five-point criteria-specific rating

scale was constructed. The Playing Test evaluation form (Appendix H) used both "continuous"









(sequentially more demanding performance criteria) and "additive" (nonsequential performance

criteria) rating scales. When a technique was measured using a continuous rating scale, the number next to the

written criterion that corresponded to the perceived level of skill was circled. When using the

additive rating scale, the primary investigator marked the box beside each of the written criteria

that described one aspect of the performance demonstrating mastery of the skill. Both the

continuous and the additive rating scale have a score range of 2-10 points, as two points were

awarded for each level of achievement or each performance competency. It was theoretically

possible for a student to score 0 on an item using an additive scale if their performance matched

none of the descriptors. Seven continuous rating scales and ten additive rating scales constituted

the Playing Test evaluation form. The overall level of performance achievement for each student

was calculated as the sum of the scores for each area of technique.
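To illustrate the arithmetic of this scoring scheme, the sketch below (written in Python; the marks shown are hypothetical and not drawn from the study data) tallies one student's evaluation form into an overall score.

# A hypothetical tally of the Playing Test evaluation form described above.
# Seven items use a continuous scale (one level circled, levels 1-5 worth 2-10
# points); ten items use an additive scale (0-5 descriptors checked, 2 points each).

def score_continuous(level):
    # A circled level of 1-5 corresponds to 2, 4, 6, 8, or 10 points.
    if not 1 <= level <= 5:
        raise ValueError("level must be 1-5")
    return 2 * level

def score_additive(checked):
    # Each checked descriptor (up to five) is worth 2 points; a score of 0 is possible.
    if not 0 <= checked <= 5:
        raise ValueError("checked must be 0-5")
    return 2 * checked

continuous_levels = [4, 3, 5, 4, 2, 3, 4]            # seven continuous items
additive_checks   = [5, 4, 3, 5, 2, 4, 3, 5, 4, 2]   # ten additive items

total = sum(map(score_continuous, continuous_levels)) + \
        sum(map(score_additive, additive_checks))
print(total)   # overall achievement score; the maximum possible is 170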

The Student Self-Assessment Profile

The last fifteen minutes was devoted to the completion of the Written Test (Appendix E)

and the Student Self-Assessment Profile (Appendix J). To maintain the highest control in

administering the test, the primary investigator remained in the room while the Written Test was

taken, verifying that neither a piano nor cello was referred to in completing the test. The Written

Test evaluation form is provided in Appendix F.

Rationale for the Assessment Methodology

Saunders and Holahan (1997) have observed that traditional rating instruments used by

adjudicators to determine a level of quality and character (e.g., outstanding, good, average,

below average, or poor) provide little diagnostic feedback. Such rating systems, including

commonly used Likert scales, cause adjudicators to fall back on their own subjective opinions

without providing a means to interpret the results of the examination in new ways. Furthermore,









due to their design, these rating scales are incapable of providing much in the way of interpretive

response. As Saunders and Holahan observe, "knowing the relative degree to which a judge

agrees or disagrees that, 'rhythms were accurate,' however, does not provide a specific indication

of performance capability. It is an evaluation of a judge's magnitude of agreement in reference

to a nonspecific and indeterminate performance standard and not a precise indication of

particular performance attainment" (p. 260).

Criteria-specific rating scales are capable of providing greater levels of diagnostic

feedback because they contain written descriptors of specific levels of performance capability. A

five-point criteria-specific rating scale was developed for this study to allow for greater

diagnostic input from judges. Aspects of left hand and bowing technique were evaluated using

both continuous (sequentially more exacting criteria) and additive (nonsequential performance

criteria). Both continuous and additive scales require a judge to choose which of the several

written criteria most closely describe a student's performance. The additive scale was chosen

when a particular technique (such as playing scalar passages) has a number of nonsequential

features to be evaluated, such as evenness, good bow distribution, clean string crossings, and

smooth connections of positions.

Along with the five-point criteria specific rating scale, the Playing Test evaluation form

(Appendix H) provided judges with an option of writing additional observations or comments

about each technique evaluated. While these data are not quantifiable for measurement

purposes, recording the judge's immediate reactions in their own words to a student's

performance may capture an insight into some aspect of performance that the written criteria

overlook. Because the primary purpose of this test is diagnostic, allowing room for

commentary is important.









Interjudge Reliability

Two adjudicators were recruited to determine interjudge reliability of the Playing

Test. Both judges were professional cellists who teach at the college-level. To decrease

selection bias as a threat to external validity, the adjudicators were chosen from two different

geographical regions and teaching institutions. An introductory DVD was provided, explaining

how to use the Playing Test evaluation form in assessing student performances.

Each judge viewed and listened to DVDs of five separate student performances of the

Playing Test, and rated the performances using the Playing Test evaluation form (Appendix H).

Judges were asked to return the results by a specified date, using a self-addressed stamped

envelope provided. The combined judges' evaluations of ten individual students were correlated

with the primary investigator's evaluations of these same students.

Data Analyses

Data analyses included item analysis for both the Written and the Playing Test. The

distribution of total scores was described using means and standard deviations. Item difficulty,

expressed as the proportion of students who answered an item correctly, was determined.

Item discrimination analysis was conducted using the point biserial correlation to reveal the

strength and direction of the relationship between success on a particular item and success on the

total test. Qualitative data from the Observations/Comments portion of the Playing Test were

examined and compared with individual scores.
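As an illustration of these two item statistics, the sketch below (Python, with hypothetical response data, not the study's data) computes item difficulty as the proportion of correct answers and the point biserial correlation between each item and the total score.

# Illustrative sketch: item difficulty and point biserial correlations for
# dichotomously scored items (1 = correct, 0 = incorrect); data are hypothetical.
import numpy as np
from scipy.stats import pointbiserialr

responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 1, 1, 0],
])  # rows = students, columns = items

difficulty = responses.mean(axis=0)   # proportion answering each item correctly
totals = responses.sum(axis=1)        # total test score per student

for item in range(responses.shape[1]):
    r_pb, p = pointbiserialr(responses[:, item], totals)
    print(f"item {item + 1}: difficulty = {difficulty[item]:.2f}, r_pb = {r_pb:.2f}")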

The content of the Student Self-Assessment Profile was evaluated and correlated to the

data from other sections of the test. Relationships were studied between the student's scores on

the Written and Playing Test and: a) year in college, b) major/minor distinction c) years of study,

d) piano experience, e) extent and content of repertoire, f) degree of interest in performance









areas, g) personal goals for studying the cello, h) expressed area of technique needing

improvement, and i) short term and long term goals in music.

Content Validity

The techniques that were assessed in this study are believed to be essential aspects of left-

hand and bowing techniques for a college-level student. The choice of categories for left-hand

and bowing technique was based on the frequency these techniques are found in the repertoire

for cello, as well as the discussion of them in the following sources: The Ivan Galamian Scale

System for Violoncello, arranged and edited by H. J. Jensen; The four Great Families of Bowings,

by H. J. Jensen (Unpublished Paper); Cello Playing of Today, by M. Eisenberg; Cello Exercises:

A Comprehensive Survey of Essential Cello Technique, by F. Magg; and Dictionary of Bowing

and Pizzicato Terms, by J. Berman, B. Jackson, and K. Sarch.

A validation study was conducted to determine to what extent teachers felt this test

measured a student's technique (Mutschlecner, 2005). Cello teachers (N= 9) on the college and

college preparatory level agreed to participate in this validity study by reading all sections of the

diagnostic test and then responding to questions in an evaluation form. The results of this study

are provided in Appendix B.









CHAPTER 4
RESULTS

This chapter describes the procedures used to analyze the data collected and presents the

results of these analyses. Data from the Written Test, the Playing Test, and the Student Self-

Assessment were collected from 30 participants in accordance with the procedures outlined in

Chapter 3. The dependent variables of this study were the Written and Playing Test scores.

Independent variables were (a) year in school, (b) major/minor distinction, (c) years of cello

study, and (d) piano experience.

Data Analysis

Descriptive data for the scores were tabulated and disaggregated by independent variable.

Data were explored using t-tests, regressions, and correlations. Regressions were used to

determine the effect of the independent variables on the obtained test scores. The independent

variables of major/minor distinction, year in school, and piano experience are categorical, and

dummy codes were used to represent these variables in the regression analyses. Item difficulty,

item discrimination, and point biserial correlations were calculated for the Written Test.

Cronbach's Alpha (α) was used to estimate the reliability of individual items on the Playing Test.

The Spearman rank-order correlation was used as a measure of the Playing Test's validity.

Interjudge reliability was calculated using Pearson's r.
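To make the dummy-coding step concrete, the following sketch (Python, using the pandas and statsmodels libraries; the data values and column names are hypothetical, not drawn from this study) shows how a categorical predictor such as year in school can be represented with 0/1 indicators and entered into an ordinary least squares regression.

# Hypothetical sketch of dummy-coding a categorical predictor and regressing
# a test score on it; values and labels are illustrative only.
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "year": ["freshman", "freshman", "sophomore", "sophomore",
             "junior", "junior", "senior", "senior"],
    "playing_score": [104, 110, 128, 122, 140, 150, 135, 131],
})

# One 0/1 indicator per class level, dropping one level as the reference group
dummies = pd.get_dummies(df["year"], drop_first=True, dtype=float)
X = sm.add_constant(dummies)                  # add the intercept term
model = sm.OLS(df["playing_score"], X).fit()  # ordinary least squares regression

print(model.params)     # B (unstandardized coefficients)
print(model.bse)        # SE B
print(model.tvalues)    # t statistics
print(model.rsquared)   # R-squared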

Questions on the Written Test were dichotomous, and scoring the tests yielded

continuous total-score data. The Playing Test performances were evaluated using the criteria-specific rating

scale that was revised following the pilot test (see Appendix A for the Pilot Study report). Two

external reliability researchers viewed and evaluated videotapes of 33% (N= 10) of the Playing

Tests. These data were then correlated with the primary investigator's scores of these same

student performances as a measure of interjudge reliability. The participants' cello teachers rank









ordered their students by level of technical skill based on their assessment of the students'

playing technique. These rankings were correlated to those based on the Playing Test results as a

measure of validity. The data analysis was designed to explore the following research questions:

1. To what extent can a test of cello playing measure a student's technique?

2. To what extent can a criteria-specific rating scale provide indications
of specific strengths and weaknesses in a student's playing?

3. Can a written test demonstrate a student's understanding of fingerboard
geography, and the ability to apply music theory to the cello?

Participants

Written and Playing Test scores, and student answers to questions in the Student Self-

Assessment Profile were obtained (N= 30). Participants were undergraduate music majors and

minors studying cello at three private and three public universities (N= 6) in the southeastern

region of the United States.

Part One: The Written Test

Scoring the Written Test

The Evaluation Form used to tabulate the scores for the Written Test is provided in

Appendix F. Items on the Written Test were assigned points using the following system:

(1) Fingerboard Geography: 11 points. (44 pitch locations to identify were divided by 4)

(2) Interval Identification: 8 points.

(3) Pitch Location and Fingering: 32 points. (a single point was assigned for correctly

identifying both pitch and fingering)

(4) Single Position Fingering: 32 points.

(5) Bass, Treble, and Tenor Clef Note Identification: 12 points.

The total possible score for the combined sections of the Written Test was 95 points.
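As a quick check of this allocation, the section maxima sum to the stated total; the minimal sketch below simply encodes the point values listed above.

# Point allocation for the Written Test sections, as listed above.
section_points = {
    "Fingerboard Geography": 11,
    "Interval Identification": 8,
    "Pitch Location and Fingering": 32,
    "Single Position Fingering": 32,
    "Bass, Treble, and Tenor Clef Note Identification": 12,
}
assert sum(section_points.values()) == 95  # total possible Written Test score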









Results from the Written Test

Table K-1 (Appendix K) presents the raw scores of the Written Test items and the

composite means and standard deviations. Reliability of the Written Test was obtained using the

Kuder-Richardson formula, revealing the internal consistency of test items: rKR20 = .95. This

result indicates that despite the narrow range of scores, the Written Test has strong interitem

consistency.
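For reference, the Kuder-Richardson formula 20 in its standard form (not reproduced from this document) is

r_{KR20} = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} p_i q_i}{\sigma_X^{2}}\right),

where k is the number of items, p_i is the proportion of students answering item i correctly, q_i = 1 - p_i, and \sigma_X^{2} is the variance of the total scores.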

Table 4-1 presents the data from a regression analysis for year in school (freshmen,

sophomore, junior, and senior) and the Written, Playing, and combined Test scores. Freshmen

classification emerged as a significant predictor (p < .05) for the Playing Test and combined test

scores. The R-squared value of .28 indicates that freshmen classification accounted for 28% of

the variance in the Playing Test Scores. For the combined Written and Playing Test scores, the

R-squared value of .265 indicates that freshmen classification accounted for 27% of the variance.

With the exception of these findings, year in school does not seem to bear a relationship to

technical level, as measured by the Written and Playing Test.

Exploring the relationship of test scores and student's degree program was complicated,

as there was a mixture of music performance majors, double majors, music education majors,

music therapy majors, and music minors. One school did not allow freshmen to declare music

performance as a major until their sophomore year, insisting they enter the studios initially as

music education majors. If one classified double majors in the music performance category, then

there were 21 music performance majors and nine students in the "other" category. A regression

analysis was conducted with major/minor distinction as a predictor of the written, playing and

total scores. No effect of major or minor distinction was found for the Written Test (R2= .001).

Results were nearly significant for the Playing Test (p = .08) and not significant for the









combined Written and Playing Tests (p = .15). A student's choice to major in cello does not

appear to be an indication of his or her technical level according to this test.

The 30 cellists participating in this research had studied the cello between five and

sixteen years (Table 4-2). A regression was conducted with years of cello study as a predictor of

the scores. For the Written Test (B = .037, SE B = .069, t = .53) and the Playing Test (B = .044,

SE B = .024, t = 1.82), years of cello playing was not found to be a significant predictor (p = .60;

p = .08). A lack of relationship between years of cello playing and scores may reflect the wide

range of students' innate ability and developmental rate. The relatively small sample size also

means that outliers may have skewed the results. Efficient use of practice time is an acquired skill; it

is possible for students with fewer years of experience to surpass those that, while having played

longer, are ineffective in their practice.

Though no data on actual numbers of years of piano experience were collected, exactly

one-half of the participants reported having piano experience, and one-half reported having no

piano experience (n = 15 each). A t-test of the means for Written and Playing Test scores was

conducted based on the participants' self-reported piano experience. Both tests were significant.

Students reporting piano experience scored significantly higher on the Written Test (M = 91.93,

SD = 3.08), t(30) = 115.55, p = .000, than those without piano experience (M = 78.47, SD =

12.71), t(30) = 23.92, p = .000. Students reporting piano experience also scored significantly

higher on the Playing Test (M = 129.73, SD = 20.63), t(30) = 24.35, p = .000, than those without

piano experience (M = 116.93, SD = 28.28), t(30) = 16.01, p = .000.
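For illustration only, a comparison of group means of this kind can be sketched as follows (Python, with hypothetical score lists; this is not the study's data and is not presented as the exact procedure behind the statistics reported above).

# Illustrative independent-samples t-test comparing mean scores of two groups
# defined by self-reported piano experience; values are hypothetical.
from scipy.stats import ttest_ind

scores_with_piano    = [152, 136, 156, 144, 134, 128, 141, 139]
scores_without_piano = [114, 108, 126, 101, 142, 117, 109, 120]

t_stat, p_value = ttest_ind(scores_with_piano, scores_without_piano)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")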

Because significant differences were found in these groups based on reported piano

experience, a regression was conducted with piano experience as a predictor of the scores. For

the Written Test (B = -2.00, SE B = 4.21, t = -.48), piano experience was not found to be a









significant predictor. In the Playing Test (B = -19.20, SE B = 8.63, t = -2.23), piano experience

emerged as a significant predictor (p < .05). The R2 value of .15 indicates that piano experience

accounted for 15% of the variance in the Playing Test scores. Results are shown in Table 4-3.

Regression Analysis of Written Test Items

In the Interval Identification section of the Written Test, the mean score for those with

piano experience was 7.07 out of 8 possible points as compared with a mean of 5.73 for those

without experience. Through regression analysis piano experience was shown to be a significant

predictor (p = .002) of the Interval Identification scores (B = 1.56, SE B = .41, t = 3.81). The R2

value of .528 indicates that piano experience accounted for 53% of the variance in the Interval

Identification scores. This is a highly significant figure. Students with piano experience clearly

are better at thinking intervallically on the cello.

For the Pitch Location and Fingering section of the test, the means were 31.13 out of 32

possible points for those with piano experience compared with 22.26 for those without.

Regression analysis revealed that piano experience was nearly significant as a predictor of

these scores (p = .061). Piano experience again emerged as a significant predictor (p = .002) of

the Single-Position Fingering scores (B = 1.80, SE B = .47, t = 3.83). The R2 value of .53

indicates that piano experience accounted for 53% of the variance in the Single-Position

Fingering scores. This section required students to look at notes vertically through a series of

arpeggios and arrive at a fingering, something that pianists are frequently required to do.

Item difficulty, item discrimination, and point biserial correlations were calculated for the

Written Test. Results are presented in Table 4-4. The Interval Identification section had the

highest average difficulty level (.80) of any section of the Written Test. Items on the Bass,

Treble, and Tenor Clef Note Identification section were found to be the least difficult. Item 23









(rpb = 0.80) and item 31 (rpb = 0.82) of the Pitch Location and Fingering Section had the two

highest correlations to the total test score. The range of difficulty level (1.0-.80) indicates that

the Written Test is not at an appropriate level of difficulty for undergraduate cellists.

Using Pearson's r, a low positive correlation was obtained between student scores on the

Written and Playing Test (r2 = .16), suggesting little relationship between scores on these

tests. The cognitive knowledge required to do well on the Written Test may be

distinct from the psychomotor ability needed to demonstrate the techniques found in the Playing

Test.

Part Two: The Playing Test

Scoring the Playing Test

A discussion of the criteria-specific rating scale used to score the Playing Test is found in

Chapter Three. Ten techniques were evaluated using an additive rating scale that ranged from

0 to 10 points per item. Seven techniques were evaluated using a continuous rating scale with a

range of 2 to 10 points possible. A zero score resulted from none of the criteria being

demonstrated for an additive item. The total possible score for the combined sections of the

Playing Test was 170.

Results from the Playing Test

Reliability was estimated by using Cronbach's Alpha to find the relationship between

individual items on the Playing Test. The results (α = .92) indicate high internal consistency of

test items: this suggests that the means of assessing each technique are well-matched.
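For reference, coefficient alpha in its standard form (not specific to this study's data) is

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^{2}}{\sigma_X^{2}}\right),

where k is the number of items, \sigma_i^{2} is the variance of item i, and \sigma_X^{2} is the variance of the total scores.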

Table K-3 (Appendix K) presents the raw scores of the Playing Test items and the

composite means and standard deviations. Table 4-5 lists these items from highest to lowest

based on their mean scores. These data reveal that students scored highest on the détaché bowing









stroke (M = 8.46), and lowest on pizzicato (M = 6.06). Discussion of the significance of these

mean scores is found in Chapter Five.

Comparison of Left Hand Technique and Bowing Stroke Scores

The total mean scores were calculated for the two sections of the Playing Test: Left Hand

Technique (M = 7.21) and Bowing Strokes (M = 7.31). Students performed at a very similar level

on both sections, and performance was uniform: higher-scoring students did well on both sections

and lower-scoring students did less well on both.

Comparison of Playing Test Scores and Teacher-Ranking

To determine the predictive validity of the Playing Test, teachers from the six music

schools participating in this research were asked to rank their students from lowest to highest in

terms of their level of technique. Five of the six teachers responded to this request. These

rankings were compared to the rank-order based on the Playing Test scores. The results are

shown in Table 4-6.

Two teachers (Schools A and B) ranked their students in exactly the same order as the

Playing Test ranking (rs = 1.0). Using the Spearman rank-order correlation, the correlations for

the other three schools that responded were positive and strong (rs = 0.65, 0.84, and 0.76,

respectively). Results indicate that students' performance on the Playing Test closely corresponds to

the level of their technique as perceived by their teachers. The Playing Test is criterion-

referenced and not designed to be used as a norm-referenced test. However, the strong positive

correlations between the teachers' rank-ordering of their students and the rank order of the scores on

the Playing Test suggest that this measure is a valid means of determining undergraduate cello

students' technical ability.
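As an illustration of this correlation procedure, the sketch below (Python, with hypothetical rankings for a single school) computes the Spearman rank-order correlation between a teacher's ranking and the ranking implied by Playing Test scores.

# Illustrative Spearman rank-order correlation between a teacher's ranking of
# five students and the ranking derived from their Playing Test scores.
from scipy.stats import spearmanr

teacher_rank      = [1, 2, 3, 4, 5]   # teacher's rank order, lowest to highest
playing_test_rank = [1, 3, 2, 4, 5]   # rank order based on Playing Test scores

rho, p_value = spearmanr(teacher_rank, playing_test_rank)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")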









Interjudge Reliability of the Playing Test

Two judges were recruited to evaluate five different student performances of the playing

test as described in Chapter Three. Interjudge reliabilities were calculated using Pearson's r.

Correlations were as follows: Judge A and the primary investigator (r = 0.92); Judge B and the

primary investigator (r = 0.95). These results are presented in Table 4-7. The students observed

by Judges A and B represented 33% of the total number of students participating. These data,

with their high correlations, appear to confirm the effectiveness of the criteria-specific

rating scale used in this study as a means of recording information about specific strengths and

weaknesses in a student's playing.

Part Three: The Student Self-Assessment Profile

The Student Self-Assessment Profile (SSAP) was created as another means to gather

diagnostic information about students. Many teachers have developed questionnaires to better

understand the performance background of their students. The self-assessment used in this study

served this function, as well as providing additional information about areas of performance

interest and personal goals. In addition, the SSAP allows students to comment on what aspects

of technique they feel they need to improve. Twenty-nine of the thirty students participating in

this study completed the Student Self-Assessment Profile. The following subheadings represent

sections of the Student Self-Assessment Profile.

Repertoire Previously Studied

Students listed many of the standard method and etude collections for the cello:

Cossmann, Dotzauer, Duport, Feuillard, Franchomme, Piatti, Popper, Ševčík, Starker, and Suzuki.

Pieces from the standard literature for cello were listed. For a teacher, such information shows









the extent and breadth of a new student's experience and may indicate appropriate directions for

further study.

How interested are you in each of these areas of performance: Solo, Chamber, and
Orchestral?

Table 4-8 lists students' responses to this question. Eighty-three percent of the students

stated they either agreed or strongly agreed to having interest in solo and orchestral performance,

and ninety-three percent expressed the same for chamber music. Noting responses to this section

could be a means for teachers to initiate discussion with students about their plan of study. If a

student's greatest interest was in playing chamber music, his teacher might help to facilitate this

desire. Knowing that a student's primary goal was to win an orchestral audition would dictate in

part the choice of repertoire studied.

Other areas of performance interest?

Students listed the following areas of performing interest: jazz (n = 2), conducting (n =

1), piano accompanying (n = 1), choir (n = 1), improvisation (n = 1), bluegrass (n = 1), praise

bands (n = 1), and contemporary performance (n = 2). Teachers provided with this information

might choose to direct students to nontraditional sources of study, such as improvisation

methods, learning to read chord charts, or playing by ear.

What are your personal goals for studying the cello?

Responses to this question are provided in Table 4-9. Five out of the twenty-nine

students (17%) listed "teaching privately" as a goal for study. The second most frequently

mentioned goal was "orchestral performance" (10%). If this study was conducted with the

highest ranking music conservatories in the United States, the researcher suspects that "solo

performance" might be frequently mentioned as well.









What areas of cello technique do you feel you need the most work on?

Answers to this question are presented in Table 4-10. Bow stroke was mentioned by ten

students as needing the greatest attention. Nine students discussed the need to work on

relaxation as they played, specifically referring to left and right hand, shoulder, and back tension.

Many of the techniques assessed in the Playing Test were alluded to, such as spiccato bowing or

thumb position. The specificity of many of the areas of technique mentioned may have been due

to the students filling out the SSAP after having taken the Playing Test. The difficulty students

had with playing certain passages likely caused them to list these techniques as ones to work on. This

appears to be anecdotal evidence that the playing test can cause students to be more self-aware.

Summarize your goals in music and what you need to do to accomplish these goals.

In answering this question, students described their broad musical objectives, often

discussing career goals. The goals in music were to be written for six month, one, two, four, and

ten-year intervals, but not all students completed each sub-category. Table 4-11 presents the

responses to this section in the students' own words. Many of the goals implied an understanding

between the teacher and the student, such as a two-year goal of memorizing a full concerto.

Acquiring advanced degrees was a goal for two of the students. One student's six-month goal

was to "practice more musically than technically." Without agreement between the teacher and

student on such a goal, conflicts could arise: what if the teacher felt the next six months were

best spent drilling technique?

One student's four-year goal was "To get past the preliminaries in an orchestra

audition." The Student Self-Assessment Profile would help to assure that the teacher was privy

to this information. One music major's long-term goal was to "play recreationally, not as a

career." This belies the assumption that every music major is planning on a career in music.









Access to this kind of information could prevent misunderstandings developing between a

teacher and student that result from conflicting goals.

Summary of Results

The following summarizes the results obtained in these analyses:

1. The Written Test was found to be too easy for most undergraduate cellists. Lower

scores in the Interval Identification section indicate that some students have

difficulty applying their understanding of intervals to the cello.

2. Strong interitem consistency was found for the Playing Test, indicating high

reliability for this section of the test.

3. Year in school was a significant predictor of Playing Test scores and combined

scores for freshmen students.

4. Music performance majors' scores did not differ significantly from scores earned by

students in other degree programs.

5. The number of years a student had played the cello was not found to be a significant

predictor of the Written or Playing Test scores.

6. Piano experience was found to be a significant predictor of Playing Test scores, and

scores on two sections of the Written Test.

7. Playing Test scores were a significant predictor of how teachers would rank their

students in terms of level of technique.

8. The criteria-specific rating scale developed for this study appears to be a highly

reliable measurement tool based on interjudge reliability.










Table 4-1. Summary of Regression Analysis for Year in School as a Predictor of Written,
Playing, and Total Test Scores (N= 30)


Score           B       SE B      t

Written Test
  Freshmen      .0069   .0079     .88
  Sophomore     .0060   .0074     .82
  Junior        .0085   .0060   -1.40
  Senior        .0031   .0067   -0.47

Playing Test
  Freshmen      .010    .003     3.30*
  Sophomore     .0058   .0032   -1.83
  Junior        .0032   .0028   -1.16
  Senior        .0014   .0030   -0.46

Total Score
  Freshman      .009    .0027    3.18*
  Sophomore     .0038   .0029   -1.32
  Junior        .0040   .0024   -1.67
  Senior        .0009   .0027   -0.34

Note. Written Test Scores: R2 = .027 Freshmen; R2 = .023 Sophomore; R2 = .065 Junior; R2 = .008 Senior.
Playing Test Scores: R2 = .280 Freshmen; R2 = .107 Sophomore; R2 = .046 Junior; R2 = .008 Senior.
Total Test Scores: R2 = .265 Freshmen; R2 = .058 Sophomore; R2 = .091 Junior; R2 = .004 Senior.
* p < .05










Table 4-2. Years of Study, Frequency, and Test Scores


Years of Study: 5, 6, 7, 8, 9, 9.5, 10, 11, 11.5, 12, 13, 16

Written Test Mean Scores: 83, 75.5, 93, 95, 93.5, 87.71, 68, 81.31, 93, 87

Playing Test Mean Scores: 114, 108, 101.5, 126, 142, 140, 141.1, 140, 108.7, 156, 104









Table 4-3. Summary of Regression Analysis for Piano Experience as a Predictor of Written,
Playing, and Total Test Scores (N= 30)

Test Section B SE B

Written Test Scores -2.00 4.21 -.48

Playing Test Scores -19.20 8.63 -2.23*

Total Combined Score -21.20 11.74 -1.81

Note. R2 = .008 for Written Test Scores; R2 = .15 for Playing Test Scores; R2 = .10 for Total Test Scores.
* p <.05










Table 4-4. Item Difficulty, Discrimination, and Point Bi-Serial Correlation for the Written Test


Category


Item
Number


Fingerboard Geography


Interval Identification


Item
Difficulty

.97


Item
Discrimination


Point Bi-Serial
Correlation


0.15
0.26
0.37
0.06
0.08
0.06
0.49
0.40


Pitch Location
And Fingering


.25 0.63
.25 0.63
.25 0.52
.25 0.49
.38 0.41
.25 0.47
.38 0.57
.50 0.50
.13 0.78
.25 0.63
.38 0.75
.38 0.40
.38 0.63
.50 0.63
.13 0.33
.38 0.63
.50 0.70
.50 0.70
.38 0.71
.38 0.70
.38 0.75
.38 0.70
.38 0.80
.38 0.65
.38 0.71
(Table continued on next page)










Table 4-4. (continued)


Category


Item
Number


Pitch Location
And Fingering


Item
Difficulty


.80
.83
.80
.83
.83
.83
.80


Item
Discrimination


Point Bi-Serial
Correlation


0.71
0.73
0.73
0.74
0.74
0.82
0.76


Single Position Fingering


0.07
0.07
0.07
0.07
N/A
N/A
N/A
N/A
0.43
0.43
0.16
0.15
0.23
0.16
0.36
0.36
0.06
0.32
0.40
0.35
0.23
0.23
0.23
0.23
0.23
0.12


(Table continued on next page)










Table 4-4. (concluded)


Item
Number


Item Item
Difficulty Discrimination


Single Position Fingering


Point Biserial
Correlation


0.31
0.18
0.23
0.23
0.36
0.36


Bass, Treble, and Tenor Clef Note Identification


0.0
.13
.13
0.0
0.0
.13
.13
0.0
.13
.13
0.0
.13


N/A
0.46
0.43
N/A
N/A
0.02
0.0
-0.04
0.08
0.27
-0.05
0.01


Category


Note. Point Biserial Correlations were not found for the Fingerboard Geography items as 97% of the
students had perfect scores on this section.










Table 4-5. Mean Scores of Playing Test Items in Rank Order

Item Rank Order Mean Score



Detached 1 8.47
Slurred Legato 2 8.23
Arpeggios 3 8.13
Staccato 4 7.93
Vibrato 5 7.93
Portato 6 7.67
Position Changes 7 7.67
Scales 8 7.60
Arp. Chords 9 7.20
Sautillé 10 7.13
Thumb Position 11 7.00
Broken Thirds 12 6.80
Martelé 13 6.67
Double Stops 14 6.40
Spiccato 15 6.30
Intonation 16 6.20
Pizzicato 17 6.00

Note. Ratings ranged from 2 through 10.










Table 4-6. Comparison of Teacher-Ranking to Playing Test-Ranking


School      Spearman Correlation of Teacher Ranking to Playing Test Ranking

School A    1.0
School B    1.0
School C    0.65
School D    0.84
School E    0.76










Table 4-7. Comparison of Researcher's and Independent Judges' Scoring of Student
Performances of the Playing Test


Students rated by Judge A (n = 5)
  Primary Investigator: 152, 136, 156, 144, 134 (M = 144.4, SD = 9.63)
  Judge A: 162, 142, 158, 142, 136 (M = 148, SD = 11.31)
  r = 0.92

Students rated by Judge B (n = 5)
  Primary Investigator: M = 138.8, SD = 16.09
  Judge B: 138, 152, 104, 98, 134 (M = 125.2, SD = 8.67)
  r = 0.95










Table 4-8. Numbers of Students Expressing Interest in Solo, Chamber, and Orchestral
Performance (N = 29)

Category Strongly Agree Agree Disagree Strongly Disagree

Solo 10 14 5 0
Chamber Music 20 7 2 0
Orchestral 16 8 4 1

Note. Students could indicate interest in multiple categories, resulting in totals exceeding the number of
students completing the form (N = 29).









Table 4-9. Personal Goals for Studying the Cello


Specified Goal Frequency Mentioned (N= 29)

Teaching privately 5
Orchestral performance 3
Chamber music performance 2
Expand repertoire 2
Lifelong hobby, personal enjoyment 2
College-level teaching 1
Obtain advanced degrees with the goal of college teaching 1
Improve concentration 1
Become a fluid improviser 1
Work as a studio musician 1
Ability to convey interpretation of music to others 1









Table 4-10. Student Perception of Priorities for Technical Study


Technique Frequency
Mentioned


Bow Stroke 10
Relaxation, including right and left hand, shoulders and back 9
Vibrato 4
Vibrato in upper positions 2
Thumb position 3
Musicality 3
Sound production/tone 2
Double stops 2
Sautillé 2
Sight-reading 1
Reading in different clefs 1
Rhythm 1
Coordination between right and left hand 1
Proper employment of left hand position and
whole arm movement 1
Extensions 1
Shifting 1
Spiccato 1









Table 4-11. Goals in Music and Means of Accomplishing Them


Six Months:

Catch up to my peers.
To shift easily.
Work strictly on technique, not worrying about pieces or recitals.
Practice more musically than technically.
Have lessons with other teachers.
Improve jazz vocabulary.

One Year:

Keep my scholarships.
To have perfect intonation.
Become an effective music educator (lifelong).
Resolve all tension issues; slow, loose practice-making it a habit.
Increase in difficulty of music.
Work on awareness of bowing choices.
Practice.

Two Years:

To be able to support myself solely through playing and teaching.
I hope to memorize a full concerto and feel comfortable performing.
Much practice; memorization and performance practice will be needed.
Graduate, and find a graduate school with a fabulous teacher.

Four Years:

To get past the prelims in an orchestral audition.
To graduate, get a job as a music therapist, and join the community of a professional
orchestra.
Play recreationally, not as a career.

Ten Years:

To be a guest artist at a major music festival.
Be teaching at a university with a Ph.D. in music.
Be employed in a high school as a music teacher, but still make time to perform and
possibly give private lessons.
Able to teach other cellists.
Gigging professionally.
Be a financially stable musician.









CHAPTER 5
DISCUSSION AND CONCLUSIONS

This chapter presents a discussion of the results of administering the Diagnostic Test of

Cello Technique. Following a review of the purposes and procedures of this study, the findings

of this study are addressed in light of (a) the research questions posed, (b) a comparison of

results with similar studies, and (c) implications for string education. This chapter closes with

conclusions and recommended directions for future research.

Overview of the Study

The purpose of this study was to design, validate, and administer a diagnostic test of cello

technique for use with college-level students. Written and playing tests were designed, pilot

tested, and a validity study was undertaken. Thirty students from six different universities in the

southeastern United States were recruited to participate in this research. Each student completed

a written test, playing test, and a self-assessment profile. A criterion-based rating scale was

developed to evaluate the Playing Test performances. Two university-level teachers were

recruited to judge ten video-taped performances of students taking the Playing Test. Evaluations

from those judges were correlated with the primary researcher's to determine interjudge

reliability.

Review of Results

The independent variables in this study were (a) year in school, (b) major/minor

distinction, (c) years of cello study, and (d) piano experience. Freshmen classification emerged

as a significant predictor of Playing Test scores (p = .003) and total scores (p = .004). No effect

of major/minor distinction was found for the Written Test (R2= .001). Results were nearly

significant for the Playing Test (R2 = .104) and not significant for the combined Written and

Playing Tests (R2= .072). Years of cello study were not significant predictors of test results.









Piano experience was shown to have a significant effect on the Playing Test scores (p = .034).

Students with piano experience scored 14% higher on the Written Test and 7% higher on the

Playing Test than those without piano experience. The reliability of the Playing Test was high, as

shown by coefficient alpha (α = 0.92). Correlation coefficients obtained between the primary

researcher and the two reliability researchers were positive and strong (Judge A, r = 0.92; Judge

B, r = 0.95), suggesting that the criteria-specific rating scale designed for this study was

effective.

Observations from the Results of Administering the Diagnostic Test of Cello Technique

The Written Test

Future versions of the Written Test designed for college students should eliminate the

Fingerboard Geography section, as only one student made errors in filling out this section. This

section should be retained in a high school version of the test, since not all

students at that level are likely to be clear about the location of pitches on the fingerboard.

The Interval Identification section as a whole had the highest average difficulty level of

any section of the Written Test based on item analysis. In this section, item six (a major sixth

across two strings) had the highest difficulty level of any item on the test (.70). This item,

however, did not discriminate well between high-scoring and low-scoring students (.38). On this

item students most likely erred by not keeping in mind that on the cello, the interval of two notes

lying directly across from each other on adjacent strings is always a perfect fifth. Adding a

whole step to a perfect fifth results in the interval of a major sixth. This is an example of

something universally known by undergraduate cello students but not necessarily visualized by

them on the fingerboard. This suggests that some students were either unclear about interval

designations or that they do not think intervallically when playing the cello. It is the researcher's









opinion that an awareness of intervals while playing represents a higher order of thinking than

simply playing note-by-note. Additional research is needed to determine to what extent

intervallic thinking while playing the cello is a distinguishing characteristic of advanced

performers.

In the Interval Identification section of the Written Test, the mean score for those with

piano experience was 7.07 out of 8 possible points as compared with a mean of 5.73 for those

without experience. Piano experience was found to be a significant predictor for this item (p =

.002). Students who play piano are able to identify intervals more easily on a representation of a

cello fingerboard than those without piano experience. In the Single-Position Fingering section

piano experience again was found to be a significant predictor of a student's score (p = .002).

This suggests that students with piano experience may think more clearly about vertical pitch

relationships. String instrument teachers would likely concur, observing that their students who

play piano tend to: 1) be better sight readers, 2) have a clearer sense of pitch and intervals, and 3)

have better rhythmic accuracy. Additional evidence of the positive effect of piano experience on

cello performance would be gained through studies that compared students' length of time

studying both instruments to their performance on the Playing Test.

The Single Position Fingering section may be unclear in its directions. Several students

thought they were being asked for fingering that would allow the notes to be played as a

simultaneous chord, which was not possible for some items. The final section (Note

Identification in Three Clefs) had several very low point biserial correlations (0.07). Errors in

this section were almost certainly caused by carelessness and did not reflect a student's ability in

note reading. The single exception was a student who missed all the tenor clef items but got all

the other note identification items right. Complete fluency in note reading is an essential









prerequisite for sight-reading ability. As a result, this section should be included in future

versions of this test.

The Written Test needs to be revised for undergraduate students in terms of difficulty

level. A greater range of scores would likely result if the present version of the test was

administered to high school students. In future versions, using actual passages from the cello

repertoire to evaluate a student's understanding of intervals, fingering, and fingerboard

geography would be in keeping with the testing philosophy of using situated cognition.

The Playing Test

Left Hand Technique (nine items) and Basic Bowing Strokes (eight items) were evenly

dispersed within the range of lowest to highest mean scores (Table 4-5). The choice in this study

to divide technique into left hand techniques and bowing techniques does not reflect how

integrated these two areas are in reality. This study's design did not isolate bow techniques from the

musical context in which they are found. If such a study was conducted, it might reveal that

some students excel in bowing techniques and others in left hand technique. These two areas of

technique are so intermeshed that it would be difficult to isolate them. Bowing serves literally to

amplify what the left hand does. Development of bowing skill, through practice on open strings

without using the left hand, is limited, and is usually, though not always, confined to initial

lessons.

The Playing Test's mean scores revealed that students scored highest on the detached

bowing stroke (M= 8.46), followed by legato bowing (M= 8.33), and arpeggios (M= 8.13).

Detached bowing is the most commonly used bow stroke, and legato playing is nearly as ubiquitous.

One might have expected to find Scales, Broken Thirds, and Arpeggios grouped together in the

same difficulty category. These three areas of technique are considered the core left hand









techniques: indeed, most music is composed of fragments of scales, arpeggios, broken thirds, and

sequential passages. The excerpts used in the Playing Test to evaluate scales may have been

more challenging to perform than the arpeggios; this may partially explain why scales were not

among the easier items. Another explanation may be that scales are the first item on the test.

Initial nervousness or stage fright may have affected this item more than subsequent ones. The

researcher noted that most students seemed to become free of nervousness shortly after

commencing the test, but these initial jitters may have had a deleterious effect on their

performance of the first item.

In the Pilot Study (Appendix A) broken thirds were the fourth most difficult item. It was

conjectured that broken thirds are under-assigned by teachers, and as a result, not practiced

much. In this present study broken thirds again were found to be difficult for students to

demonstrate. The ability to play (and especially to sight read) broken thirds requires two skills:

1) The capacity to quickly discern if a written third is major or minor, and 2) having an accurate

sense of interval distances on a given string. The correlation of students' scores on broken thirds

to their total Playing Test scores was strong (r = .81), suggesting that students' ability to perform

well in this area may be a good indicator of their overall level of technique.

The difficulty of demonstrating a given technique through excerpts varies. Spiccato

bowing, the third lowest score (M= 6.3), requires a succession of separately bowed notes played

rapidly enough that the bow bounces off the string almost of its own accord. This is not a

technique that is easily demonstrated unless the player is very familiar with the notes. Sautillé

bowing, another bounced-bow stroke (M = 7.13), appears to be slightly easier than spiccato.

Though sautillé bowing requires a faster bow speed than spiccato, the repetition of pitches means

the speed of fingering changes is actually slower in these passages, making them easier to play.









The relatively low score for martelé bowing is likely due to a lack of understanding as to

what constitutes this bow stroke. The two excerpts used for this item were moderately easy to

play. A large number of students, however, did not demonstrate the heavy, accented

articulation and stopping of the bow on the string that characterize this stroke. While many

method books include a description of martele bowing, students are unlikely to have a clear

grasp of how to execute this bowing unless it is demonstrated by a teacher.

The item with the lowest score was pizzicato (M = 6.06). The excerpts chosen featured

three separate techniques: (a) arpeggiated chords using the thumb (Elgar), (b) notes with a strong

vibrant tone (Brahms), (c) clear ringing sound in the upper register (Kabalevsky). These

excerpts were not easy to sight read for students who were ill-prepared. This was the final

section in a series of excerpts requiring great concentration; mental and/or physical fatigue may

have been a factor. It is also possible that the study of pizzicato is neglected in lessons.

Intonation received the second lowest score (M = 6.20). Judge B assigned the only perfect

score given to a student. It is axiomatic that string players must be constantly vigilant about

playing in tune. Not allowing students to become tolerant of playing out-of-tune is one of the

essential roles of the teacher. Pablo Casals' words on this subject are timeless:

'Intonation', Casals told a student, 'is a question of conscience. You hear when a
note is false the same way you feel when you do something wrong in life. We
must not continue to do the wrong thing' (Blume, 1977, p. 102).

Five students (15%) mentioned intonation when asked, 'What areas of cello technique do you

feel you need the most work on' (see Chapter 4, p. 63). From this study it appears the Playing

Test may help make students more aware of the importance of work on intonation.









The Student Self-Assessment Profile

The premise for designing the Student Self-Assessment Profile is that better information

about a student's background, interests, and goals for study can result in more effective teaching.

Its value as a diagnostic tool is in revealing a student's years of study, previous repertoire studied,

and playing experience. The emphasis on identifying personal goals for studying the cello as

well as overall goals in music opens a window into a student's self awareness. Communication

of these goals to a teacher can affect the course of study. Allowing students' goals to influence

their education may result in their feeling more invested in the learning process. The outcome

may be more effective, goal-directed practice. Students are more likely to be motivated by goals

that they perceive as being self-initiated. Awareness of these goals is not necessarily derived by

conventional teaching methods; it comes from a dialogue between the teacher and student. The

Student Self-Assessment Profile can act as a catalyst for such a dialogue.

The personal goal for studying the cello most often mentioned was "teaching privately"

(Table 4-9). When a teacher knows that a student wants to teach the cello as a vocation, his role

becomes more of a mentor, exemplifying for the student the art of teaching. A greater role for

discussion during the lesson may ensue as the need for various approaches to problems becomes

apparent. Perhaps the most important thing a teacher can provide a student aspiring to teach is to

help them become reflective about their own playing; asking themselves why they do something

a certain way. Questions that ask why rather than how take precedence. Two students mentioned

college-level teaching as one of their personal goals. Providing student-teaching opportunities

for these students as well as opportunities to observe experienced teachers at work would be

invaluable.









Goals such as orchestral or chamber music performance could have a direct effect on the

program of study if the teacher agreed that these objectives were appropriate and attainable. A

student who has expressed a sincere goal of playing professionally in a major orchestra deserves

to know both the playing standards required and the fierce competition involved. A serious

attempt to address some of the personal goals mentioned here would challenge even the most

veteran of teachers. How do you help a student improve concentration? Become a fluid

improviser? Convey their interpretation of music to others? Addressing these goals as a teacher

means taking risks, varying one's approach, and being flexible.

Over one third of the students who filled out the Student Self-Assessment Profile listed

"bow stroke" as a priority for technical study (Table 4-10). They are in good company; string

musicians agree that true artistry lies in a players' mastery of the bow. Musical issues such as

phrasing, dynamics, and timing are the bow's domain. A cellist's approach to bowing largely

determines their tone, and articulation. These qualities, along with vibrato, are the distinguishing

unique characteristics of an individual cellists' sound.

After "bow stroke" the most commonly noted area of technique addressed was relaxation

or lowering body tension. This is an aspect of technique that musicians have in common with

athletes. Gordon Epperson summarized the observations of many teachers:

What is the chief impediment to beauty of sound, secure intonation, and technical
dexterity? I should answer, without hesitation, excess tension. Sometimes
tension alone is blamed; but surely, we can't make a move without some
degree of tension. It's the excess we must watch out for. (Epperson, 2004, p. 8).

Excessive tension may not always be readily apparent; teachers may not realize students are

struggling with this area unless the issue is raised. Students who mention excessive tension

while playing as a major concern should be directed to a specialist in Alexander Technique.









Work on vibrato, either in general or in upper positions, was mentioned by six students. Though it may sound like an oxymoron, an effortless-sounding vibrato is very difficult to produce. Dorothy DeLay, of the Juilliard School of Music, assigned the first hour of practice to be spent on articulation, shifting, and vibrato exercises for the left hand, and various bow strokes for the right (Sand, 2000). Students who express a desire to develop their vibrato should be guided with appropriate exercises, etudes, and solos.

Other areas of technique are far more easily addressed. A student who mentions sight-reading or reading in different clefs can easily be directed to materials for study. Applying oneself to the exercises in Rhythmic Training, by Robert Starer, will benefit any student who feels deficient in rhythm (Starer, 1969). There are materials to address virtually every technical need, as long as the need is made apparent to the teacher.

The final question of the SSAP asks, "Summarize your goals in music and what you need

to do to accomplish these goals." The words with underlined emphasis were added based on

input from the Validity Study (Appendix B). This phrase is meant to suggest a student's

personal responsibility to follow-through with their stated goals. Table 4-11 is a transcription of

student responses to this question in their own words.

Six-month goals are short term and reflect a student's semester-long objectives. "Work strictly on technique, not worrying about pieces or recitals," is one example. Some one-year goals seem naive: "To have perfect intonation." Goals are the driving forces behind one's outward acts; playing with perfect intonation may not be attainable, but that does not mean it is not a valid aspiration. One student showed an understanding of the need to make some aspects of playing virtually automatic through repetition: "Resolve all tension issues: slow loose practice-making it a habit." Music and athletics have in common the need for drilling desired actions.









As Aristotle noted, "We are what we repeatedly do. Excellence then, is not an act, but a habit"

(Aristotle, trans. 1967).

Goal setting is most effective when it is measurable, as with a student's two-year goal of memorizing a full concerto. Academic ambitions, such as pursuing graduate studies, are important to share with one's teacher and can dictate a student's course of study. Teachers may occasionally be surprised when reviewing their students' long-term goals: one performance major stated that her goal as a cellist was to play recreationally, not as a career. However, most four-year and ten-year goals were career-oriented. There is value in having students express these goals concretely; through this activity, students visualize doing something they are presently not able to do. Goal setting requires a leap of faith.

Discussion of Research Questions

In this section the original research questions are reexamined in light of the results.

These questions are restated below with discussion following.

To what extent can a test of cello playing measure a student's technique?

The extent to which the Playing Test is able to measure an individual cello student's

technique depends on the way a teacher uses it. If students are strongly encouraged by their

teacher to practice the excerpts and are expected to play from them in the lesson, testing error

resulting from unfamiliarity with the music and sight-reading mistakes can be minimized. The

results can come much closer to a true diagnosis of a student's technical level. The comparison

of teacher-ratings to Playing Test ratings (Table 4-7) revealed a high correlation and tended to

confirm the test's validity. It is possible that, in some cases, ranking differences occurred due to

a teacher's bias based on his or her estimation of a student's potential. As one teacher noted in

discussing a student's rating: "It pains me to make this assessment, as I confirm that (student)









has far underperformed both her stated aspirations and potential the last several years" (personal

correspondence, May 2007). One of the primary purposes of this test was to provide a tool that

allows greater diagnostic objectivity, thereby providing a counterbalance to the subjective

impressions that a teacher receives about each student.

Each technique is represented by several excerpts of increasing difficulty. On those

items using an additive scale, the listener can find a descriptive statement that corresponds to the

performance level a given student has demonstrated. In thirty minutes of concentrated listening

the teacher/evaluator is able to come to definite conclusions about a student's ability to

demonstrate seventeen essential areas of technique. As the Playing Test is made up of excerpts

from the standard repertoire for cellists, the teacher is given insight into what pieces are

appropriate for study.
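To make the scoring mechanics concrete, the following minimal Python sketch totals one student's Playing Test score in the way the evaluation form in Appendix H is structured: check-all-that-apply items award 2 points per descriptor checked, and criteria-specific items award a single rating of 2 to 10 points. The ratings shown are invented for illustration and cover only a subset of the seventeen items; they are not data from this study.

# Minimal sketch with invented ratings (not study data); only a subset of the
# seventeen Playing Test items is shown. Checklist items award 2 points per
# descriptor checked (up to 5 descriptors); criteria-specific items award a
# single rating of 2, 4, 6, 8, or 10 points.
checklist_items = {"Scales": 4, "Arpeggios": 5, "Double Stops": 3}   # descriptors checked
rated_items = {"Broken Thirds": 8, "Position Changes": 6, "Vibrato": 8, "Intonation": 10}

checklist_points = sum(2 * checked for checked in checklist_items.values())
rated_points = sum(rated_items.values())
total = checklist_points + rated_points
maximum = 10 * (len(checklist_items) + len(rated_items))
print(f"Playing Test score: {total}/{maximum} ({100 * total / maximum:.1f}%)")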

To what extent can a criteria-specific rating scale provide indications of specific strengths
and weaknesses in a student's playing?

Interjudge reliability was positive and strong (Judge A r = .92, Judge B r = .95),

suggesting that the criteria-specific rating scale designed for this study was an effective means of

directing the evaluator to listen and watch for specific aspects of technique. A factor analysis of

the descriptive statements generated for the Playing Test evaluation form is recommended.

Statements that were found to have low factor loadings could be replaced, and reliability of this

measure could be increased. One example where improvement might be made is in the criteria

choices provided for Vibrato. There were students who did not really match any of the descriptors provided for this item; their vibrato was neither tense nor too fast but, on the contrary, unfocused and too slow.
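For readers who wish to see how an interjudge reliability coefficient of this kind is obtained, the following minimal Python sketch computes Pearson's r between two judges' ratings. The eight scores shown are invented for illustration and are not the study data.

from math import sqrt

# Minimal sketch with invented ratings (not study data): Pearson's r
# between two judges' Playing Test totals for the same students.
judge_a = [88, 74, 91, 65, 80, 72, 95, 69]
judge_b = [85, 70, 93, 68, 78, 75, 96, 66]

n = len(judge_a)
mean_a = sum(judge_a) / n
mean_b = sum(judge_b) / n
cross = sum((a - mean_a) * (b - mean_b) for a, b in zip(judge_a, judge_b))
ssq_a = sum((a - mean_a) ** 2 for a in judge_a)
ssq_b = sum((b - mean_b) ** 2 for b in judge_b)
r = cross / sqrt(ssq_a * ssq_b)
print(f"Interjudge reliability (Pearson's r): {r:.2f}")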









Can a written test demonstrate a student's understanding of fingerboard geography, and
the ability to apply music theory to the cello?

The answer to this research question is a provisional "yes," noting that the results of such

a test do not necessarily predict how well a student plays. Additional research is needed to

determine to what degree intervallic understanding or fingerboard visualization is part of the

practical core knowledge of an advanced cellist.

While scores on the Written Test ranged from 62% to 100% correct, the difficulty level for all items was found to be low. It is nevertheless encouraging that the Fingerboard Geography section was completed flawlessly by 29 of the 30 students. Any real confusion

here would be a signal that something was seriously lacking in a student's understanding of half

steps, the chromatic scale, or the relationship of strings tuned a fifth apart from each other. The

Written Test may be seen as a kind of barrier examination; if students score below 90%, review

of these content domains is indicated. Item difficulty could be increased by more challenging

interval identification and pitch location items.
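The content domain itself is small and well defined. As an illustration (not part of the test materials), the short Python sketch below generates the chromatic pitch names that the Fingerboard Geography chart asks for, ascending by half steps from the open strings C, G, D, and A; the number of half steps shown is arbitrary, and enharmonic flat spellings are equally correct answers.

# Minimal sketch: chromatic pitch names ascending by half steps from each
# open cello string (C, G, D, A). Sharp spellings are used for brevity;
# enharmonic flat spellings are equally valid answers on the chart.
CHROMATIC = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
OPEN_STRINGS = ["C", "G", "D", "A"]

def pitches_on_string(open_pitch, half_steps=7):
    start = CHROMATIC.index(open_pitch)
    return [CHROMATIC[(start + i) % 12] for i in range(half_steps + 1)]

for string in OPEN_STRINGS:
    print(f"{string} string: " + " ".join(pitches_on_string(string)))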

Perhaps a means to achieve more authentic assessment of fingering skills would be to

have students provide fingerings for passages from actual orchestral, chamber, or solo music

written for the cello. The challenge in this would be the number of "acceptable" choices.

Nevertheless, a teacher might gain more insight about a student's situated cognition, that is, the

thinking process 'at' the cello, by using this approach. Ensuing results could become the basis

for discussion about why one fingering might be better than another.

The point biserial correlations from the Interval Identification section indicate that some

students, who otherwise had high scores, were less successful on this section. However, seven of

the nine students who made perfect scores in this section also were the top scorers of the whole

test. Of the nine students who correctly identified all the intervals, eight had piano experience.









Regression analysis showed piano experience to be a significant predictor of students' scores on the Interval Identification section (p = .002). It is suspected that discussions of intervals rarely

occur in the teaching of string instruments. A student's understanding of intervals derived from

music theory classes may not automatically transfer to the cello fingerboard and cello music.
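For reference, the point biserial correlation mentioned above relates a single dichotomously scored item (correct/incorrect) to the total test score. The minimal Python sketch below shows the computation with invented data rather than the study's scores.

from math import sqrt

# Minimal sketch with invented data (not study scores): point biserial
# correlation between one interval item (1 = correct, 0 = incorrect) and
# students' total Written Test scores.
item = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
totals = [92, 88, 71, 95, 64, 90, 68, 73, 85, 97]

n = len(totals)
mean_all = sum(totals) / n
sd_all = sqrt(sum((t - mean_all) ** 2 for t in totals) / n)  # population SD
correct = [t for i, t in zip(item, totals) if i == 1]
incorrect = [t for i, t in zip(item, totals) if i == 0]
p = len(correct) / n
q = 1 - p
mean_correct = sum(correct) / len(correct)
mean_incorrect = sum(incorrect) / len(incorrect)
r_pb = (mean_correct - mean_incorrect) / sd_all * sqrt(p * q)
print(f"Point biserial correlation for this item: {r_pb:.2f}")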

The use of fingerboard representations to test interval understanding may have favored

visual learners. This test does not extend beyond mere interval identification to the more

important skill of seeing a written interval and being able to imagine how it will sound. This

skill, traditionally tested in sight-singing classes, is very valuable to instrumentalists but is often

underdeveloped. Future versions of the test might include having students play passages based

on a series of intervals rather than given pitches.

Students' Written Test scores do not correlate strongly with their Playing Test scores

(r2 = 0.16). The Written Test may measure a theoretical understanding that, while valuable, does

not directly influence a student's demonstration of the techniques found in the Playing Test. A

comparison of students' scores on the Written Test and a sight-reading test such as the Farnum String Scale (Farnum, 1969) might reveal a higher correlation. The Pitch Location and Fingering section, as well as the Single Position Fingering section, requires students to demonstrate a skill essential to effective sight-reading, namely, devising efficient fingerings.

Additional research is needed to explore to what extent an understanding of fingerboard

geography and music theory, as applied to the cello, affects a student's playing. It can be

hypothesized that there is a cognitive skill set that accompanies the psychomotor skills of string

playing. Better understanding of the kind of situated cognition required to think well on a string

instrument would be valuable to teachers and students.









Observations on the Playing Test from Participating Teachers

After the research for this study was complete, participating teachers were asked to

comment on the value of the test as a diagnostic tool. In one particular case, a teacher had his

students play from the Playing Test during lessons at the beginning of the semester. He

comments on the beneficial aspects of using the excerpts within his studio:

In terms of using the playing test as a studio project, it was helpful in several
ways. First, it was great to have a community project that I could get everyone involved
in working on. Secondly, it was useful to have excerpts that were shorter than any etude
I might assign (I do sometimes assign etude excerpts, however) but focused on a small
sub-set of technical problems. For some students, certain excerpts were a lot harder than
others (though they all struggled on the double-stop section of the Dvorak concerto!)
which meant it was also a process of self-discovery. Finally, in some cases I later
assigned repertoire included in the excerpts, and students were able to build upon the
work they'd already done, learning some of the trickier parts (Personal communication,
May 2nd, 2007).

The reference to self-discovery corroborates evidence gathered through the Student Self-

Assessment Profile (SSAP) that the Playing Test can result in greater student self-awareness of

their playing. The number of comments found in the SSAP referring back to techniques

encountered in the Playing Test suggests that the test can indeed make students more self-aware

of their strengths and weaknesses. That the test could influence the choice of repertoire assigned

to students was also demonstrated. The positive value the test had in uniting the studio in a "community project" was unexpected. If students worked on this common repertoire and played it for each other in cello class, the test could function as a means to connect members of a studio and to help them learn from one another.

The completeness of the Playing Test's content and its capacity to quickly assess a

student's skill level was noted by another teacher:

I found the test to be a very thorough and comprehensive survey of all of the basic issues
in cello technique, using excerpts drawn mostly from the standard repertoire, so that at
least some of them should already be familiar to any cello student. By asking an









intermediate-to advanced level student to play through these excerpts (or even just one or
two excerpts under each technical element), with or without prior preparation, the teacher
should be able to quickly (in about thirty minutes or less) identify the student's strengths
and weaknesses in any of the essential aspects of cello technique. (Personal
communication, May 23rd, 2007)

Another participating teacher confirmed the diagnostic worth of the test and its usefulness

in setting goals:

I feel the diagnostic test designed by Tim Mutschlecner is a valuable tool for
evaluating students and charting their course of study. Students come to a
teacher's studio with such a wide diversity of skill and backgrounds that
any aid in assessing their abilities is welcome. Thank you for your original and
worthwhile test. (Personal communication, May 10th, 2007).

This teacher addresses what the test results have shown; students enter college with a wide range

of experience and preexisting abilities. One of the student participants, a freshman, scored

higher on the Playing Test than five out of six seniors. This exemplifies why the test has

questionable value as a norm-referenced measure. When ranking students, one teacher observed

that comparing students was like comparing "apples and oranges." The playing test provides a

set of criteria that can supplement a teacher's performance standards and expectations.

Comparative Findings

The Farnum String Scale

When discussing the Farnum String Scale (FSS) in Chapter Two, it was observed that the

test requires the student to play a series of musical examples that increase in difficulty. This

approach was adopted in the Diagnostic Test of Cello Technique (DTCT). Unlike the FSS,

musical examples were taken directly from actual music written for the cello. The rationale for

this was that using real music increased the test's capacity for authentic assessment; students

would be playing the actual passages where the techniques in question would be found. The

downside to this was the potential of distracters, aspects of the excerpts that would mask a









student's real ability with a given technique. In some cases, for example the double-stop excerpt

from the Dvorak concerto, other challenges in playing the passage may have adversely affected a

student's ability to demonstrate the technique. However, after administering the test and

receiving positive feedback from students as well as teachers, it is felt that the benefits of using

real music far outweigh the disadvantages. Students liked the fact that they were playing from

standard works for cello and ones that they would quite possibly study someday, if they hadn't

already. This illustrates a weakness of the DTCT if it is used normatively. Unlike the FSS

passages, which would be unfamiliar to all test takers, students approach the DTCT with varying

degrees of familiarity with the excerpts. It would be unfair and ill-advised to use this test as a

means to compare students among themselves or to assign grades. Each student's performance

of the test must be judged solely on the criteria defined in the evaluation form.

One university professor declined to have his students participate in this study because

the bowings and fingerings were not always the ones that he taught. Although he was alone in

this objection, it does demonstrate a dilemma that this kind of test design faces: If the test-maker

provides ample fingerings and bowings, there will be students who have learned these passages

differently and will be thrown off. If few or none are provided, it will create much more work

for the average student to play these excerpts. The best compromise may be to seek bowings and

fingerings that are most commonly used, even while instructing students that they are free to

develop their own choices.

Zdzinski and Barnes

The design of this study owes much to the string performance rating scale of Zdzinski

and Barnes (2002). The success they found in using a criteria-specific rating scale was validated

in this research. High interjudge reliability correlations (Judge A r = .92, Judge B r = .95)









indicate that drawing a judge's attention to specific aspects of the playing is an effective way to

increase consistency in evaluating music performances. Additive rating scales, as used by

Saunders and Holahan (1997), eliminate the use of unspecific numerical ratings such as those commonly used in Likert scales. By requiring a judge to listen for specific evaluative criteria rather than to trust general impressions of a music performance, reliability is increased.

Conclusions

The following conclusions can be drawn from the results of this study.

1. Results from the Interval Identification section of the Written Test indicate

that not all students recognize intervals confidently on the cello.

2. The excerpts used in the Playing Test are a valid and reliable way to measure

an undergraduate cellist's technique.

3. Piano experience improves how well students perform on the Playing Test.

4. The Playing Test is a good predictor of teacher-rankings of their students in

terms of technique.

5. The criteria-specific rating scale used in this study is a reliable

instrument for measuring a student's technique.

6. A student's year in school, degree program, or years of cello study

are not strong indicators of their playing ability.

Recommendations for future research in the area of string instrument teaching and

assessment are:

1. A high school version of this test should be developed for use in diagnostic

evaluation and teaching.









2. This test can be used as a model for violin, viola, and bass

diagnostic tests of technique.

3. Future studies should explore the relationship of theoretical knowledge and

performance ability on the cello.

As testing becomes a major focal point in discussions on improving education, questions regarding the value and purpose of assessment will increasingly be raised.

Diagnostic evaluation, because of its capacity to inform teaching, is an important component of

music education, including applied music. Tools like the Diagnostic Test of Cello Technique

help clarify for both teachers and students what needs to be learned. Along with existing

approaches to evaluation, music educators will continue to seek better objective means to assess

musical behavior.

Normative assessment has limited value in the arts; students come from such diverse

backgrounds and experiences that their work must be judged by established criteria, not from

comparison. The effectiveness of instrumental teaching depends on how clearly performance

objectives are communicated to the student. Well-defined performance criteria result in clear, objective goals. In music, as in life, when the target is clear, it is easier to hit the mark.









APPENDIX A
PILOT STUDY

A pilot study was carried out (Mutschlecner, 2004) which provided indications of ways to

improve an initial form of the Diagnostic Test of Cello Technique. Five undergraduate music

majors studying at a school of music located in the southeastern region of the United States

volunteered to participate in the pilot study. Four out of the five students were cello performance

majors. One was a music education major. The researcher met with each student individually at the school of music in a studio space reserved for this purpose.

The students were first given the Self-Assessment Profile to fill out. Following this, students were given the Written Examination, which took between ten and fifteen minutes

for them to complete. The Written Examination used in the pilot study was shorter than the one

developed for the present study. It included: a fingerboard chart, horizontal and linear (on one

string) interval identification, note identification in three clefs, and single-position fingering

exercises.

In the pilot study students were not given the Playing Examination ahead of time but

were required, essentially, to sight-read the excerpts. However, students were advised that this

was not a sight-reading test per se, but rather a test to assess their ability to demonstrate specific

technical skills and bowings. The students were encouraged to take some time to visually

familiarize themselves with the excerpts, and were told they could repeat an excerpt if they felt

that they could play it better a second time, an option only chosen twice. The students took

between thirty and forty-five minutes to complete the playing portion of the test. The pilot

study's version of the Playing Examination was shorter than that of the present study, measuring fewer categories of left hand and bowing technique and using fewer excerpts for each technique.









Results of the Written Examination showed that the students had no difficulty with the

questions asked. What errors there were amounted to careless mistakes. This suggests that the

Written Examination did not discriminate well for cello students at this level. These results led the researcher to increase the difficulty level of the items used in the present study.

The rating instrument used for the Playing Examination was a five-point Likert scale

which included brief descriptions as to what each performance level represented. Student

performances of the Playing Examination ranged between 74.7% and 93.3% of a perfect score.

The student who had the weakest score was a music education major. Students in general did

slightly better in the Basic Bowing Strokes section of the exam than in the Left Hand Technique

section (91% compared to 86%). This was not surprising: The musical excerpts used to

demonstrate left hand technique were of necessity more difficult, and less easy to sight-read.

The lowest combined score was for the portato bowing. This was defined in the Playing

Examination as:

A series of broad strokes played in one bow with a smooth slightly separated sound
between each note. The bow does not stop as in the slurred staccato. Each note is to be
clearly enunciated with a slight pressure or 'nudge' from the index finger and upper arm.

Despite this extended definition, students were unable to demonstrate this bowing consistently.

The evidence suggested that this stroke is not being taught or discussed to the same extent as

other bowings.

The next three lowest combined scores after portato bowing were for position changes,

string crossings, and broken thirds. Well-performed position changes and string crossings may

be part of the identifying characteristics of an advanced player. The researcher suspects that

broken thirds are not practiced much and not emphasized by teachers, thus explaining the lower









scores in this area. Results from the Playing Examination indicate the need to increase the

difficulty level.

The results of the Student Self-Assessment Profile included the following responses:

How interested are you in each of these areas of performance?

I am interested in solo performance.
1 Strongly agree 1 Agree 3 Disagree 0 Strongly disagree

I am interested in chamber music performance.
4 Strongly agree 1 Agree 0 Disagree 0 Strongly disagree

I am interested in orchestral performance.
3 Strongly agree 2 Agree 0 Disagree 0 Strongly disagree


What was most unexpected was the number of students who chose "disagree" for the

statement: I am interested in solo performance. One would have expected performance majors to

at least agree with this statement, if not strongly agree. They may have been influenced by the

choice of the word "performance" and were thinking about whether they enjoyed the experience of solo playing, which, by connotation, meant auditions, juries, and degree recitals. These students may have been reading "solo" as meaning "solo career" and responded accordingly.

In the Student Self-Assessment Profile students responded to the question, "What are

your personal goals for studying the cello," in a variety of ways such as:

(a) Would like to have a chamber group and coach chamber groups.
(b) To play anything that is set before me-I don't want to have limits in terms of
technique. To be able to convey to the audience what I feel when I play.
(c) Perfect intonation before I graduate, attempt to win the concerto competition.
(d) To get an orchestra gig, have a quartet/quintet, and teach students on the side.
(e) I want to be able to use the cello in all sorts of ways including orchestral,
chamber, rock & roll, and studio recording.

These answers are very specific and focused. A teacher, informed about these goals, could modify instruction to address them. For example, students who have expressed an

interest in teaching would find discussions on how one might teach a particular skill very









valuable. If a student expresses the desire to be able to play, "anything set before me," they

would be likely to respond enthusiastically to a rapid, intense survey of a wide variety of cello

literature. For the student who specifically mentions perfecting intonation as a goal, there are

studies and approaches that would be recommended.

The question, "What areas of technique do you feel you need the most work on?" elicited

even more specific responses such as shifting, general knowledge of higher positions, fluid bow

arm, relaxing while playing, exploring musical phrasing, etc. These responses help give the

teacher a window into the student's self-awareness. They could become excellent starting

points for examining technique and would go far in helping technical study be goal-directed

rather than a mechanical process.

The final section of the Student Self-Assessment Profile had the students summarize their

goals for six months, one, two, four, and ten year periods. Responses showed students had clear

ideas about what they wanted to do after school, such as orchestral auditions or graduate school.

One revision made for the present study was to ask students what they needed to do to

accomplish their goals. A personal commitment to the plan of study is essential for ensuring the student's motivation to accomplish the goals formulated by both teacher and student. For example, if a student seriously wants to compete for an orchestral job, preparation must begin

long before the position opening is announced, through study of orchestral excerpts, a concerto,

and the Suites for Unaccompanied Cello by J.S. Bach. It is incumbent upon the teacher to

discuss these kinds of issues with students who express ambitions to play professionally in an

orchestra.









APPENDIX B
VALIDITY STUDY

A validity study was conducted following the pilot study to determine to what extent teachers felt this test measured a student's technique (Mutschlecner, 2005). Cello teachers (N =

9) on the college and college preparatory level agreed to participate in this validity study by

reading all sections of the diagnostic test and then responding to questions in an evaluation form

(Appendix C).

In answer to the question, "To what extent does this test measure a student's technique,"

responses ranged from "Very extensively" and "Rather completely" to "The written part tests knowledge, not technique." Fifty-six percent of the teachers felt the test measured a student's technique in a significant way. Sixty-seven percent of the respondents suggested that sight-

reading difficulties might mask or obscure an accurate demonstration of a student's technical

ability. As one teacher said, playing the excerpts "... shows if they have worked on this

repertoire. If they are reading it, it shows their reading ability." Two teachers came up with the

same solution: Provide the playing test to students early enough for them to develop familiarity

with the passages which they are asked to play. This would not eliminate the inherent advantage

students would have who had studied the piece from which the excerpt was derived, but it could

mitigate some effects, such as anxiety or poor sight-reading skill, which adversely affect

performance. These suggestions were implemented in the present study.

Criticism of the Written Examination included the concern that, "some fine high school

students ready for college might not know intervals yet." In response to this, a new section of

the Written Examination was developed (Pitch Location and Fingering) that measures a student's

capacity to locate pitches on a fingerboard representation without the use of intervallic









terminology. The Interval Identification and Single Position Fingering sections of the pilot test

were extended to provide greater accuracy in measurement of these skills.

Forty-four percent of respondents agreed that the excerpts chosen for the Playing

Examination were a valid way of determining a student's competence in left hand and bowing

technique. Several teachers suggested the addition of specific excerpts to reveal other aspects of

a student's technique such as pizzicato, and passages with greater variety of double stops

(simultaneously playing on two strings). These suggestions were implemented in the present

study. Part two of the Playing Examination (Basic Bowing Strokes) was expanded to include

Accented Détaché, Flying Spiccato, and Pizzicato.

Reaction to the choice of excerpts used in the Playing Examination included the

suggestion that a better assessment of a student's abilities would be to arrange the material in

progressive order from easiest to hardest and then see at what point the student began to have

difficulty. Ordering and expanding the range of difficulty of the excerpts would provide useful

information about the student's playing level so that repertoire of an appropriate difficulty-level

could be assigned. The present study applied these recommendations by finding additional

excerpts and making them sequentially more demanding. An effort was made to find excerpts in

each category that could be played by undergraduate cellists.

Seventy-eight percent of the teachers responded positively to the Student Self-

Assessment Profile. Comments included, "I really like the Student Self-Assessment page. I

think that it is not just valuable to the teacher but important that the students examine their own

situations as well." One teacher remarked, "It seems the profile would be a useful tool to gauge

the goals and general level of a new student." A teacher proposed having some more open-ended

questions as well, noting that, "There is music beyond solo, chamber and orchestral." As a









result, a line asking for other areas of performance interest was added. The study indicated that

teachers are either using a similar tool in their studios or would consider doing so.

The responses from teachers who participated in the validity study support the premise

that the diagnostic test of cello technique is a legitimate way to gather information about a

student's technical playing ability. The recommendations of these teachers were taken into

account in developing this present test.









APPENDIX C
VALIDATION STUDY EVALUATION FORM

The Diagnostic Test of Cello Technique: Validation Study

Evaluation Form

Instructions: Please read all parts of the test before responding to these questions.


1. To what extent does this test measure a student's technique?





2. What changes to the test construction do you feel would make the test more valid?





3. What changes in content do you feel would make the test more valid?





4. To what extent does the content of the Written Examination (i.e., Fingerboard Geography, Horizontal Intervals, Linear Intervals, Clef Identification, and Single Position Fingering) demonstrate basic essential knowledge of music theory as applied to the cello?




5. Would you consider using the Written Examination as a means of assessing a new student's
knowledge of music theory as applied to the cello?
Why or why not?









6. Are the excerpts chosen for the Playing Examination a valid way of determining a student's
competence in:

a) Left hand technique?



b) Bowing technique?




7. If you feel a particular excerpt is not a good predictor of a student's ability, what alternative
passage do you recommend using?





8. Would you consider using the Playing Examination as a means of assessing a new student's
technique?
Why or Why not?




9. How would you use information gathered from the Student Self-Assessment and Goal Setting
Profile in working with your students?




10. To what extent would you be willing to participate in future Field Testing of this test through
administering it to a portion of the students in your studio?




Please include any additional comments here:










APPENDIX D
INFORMED CONSENT

Tim Mutschlecner
2035 NW 18' Lane
Gainesville, FL 32605
E-mail: stmutschlecner@excite.com
December 1st, 2006

Dear Participant,

I am Timothy Mutschlecner, a doctoral student in Music Education at the University of
Florida, where I am doing a study of a diagnostic test of cello technique. The test
contains three parts. If you choose to participate, I will mail you a Student Self-
Assessment Profile which you will fill out prior to our meeting. Then, during a pre-
arranged time you will take a 10 minute written test. The written test contains sections
on Fingerboard Geography, Interval Identification, Pitch Location and Fingering, Single-
Position Fingering, and Bass, Treble, and Tenor Note Identification. Next, you will take
a 45 minute playing test consisting of short passages demonstrating your ability to play
scales, arpeggios, broken thirds, double stops, position changes, arpeggiated chords,
thumb position, vibrato, intonation, slurred legato, détaché/accentuated détaché, martelé,
portato, staccato, slurred staccato, spiccato, flying spiccato, sautillé, and pizzicato. Your
performance may be videotaped and viewed by other judges.

The combined test will take no longer than one hour to complete. Your choosing to
participate or not to participate in this study will not affect your grade in cello studies.
Test results will be kept confidential and scoring will be reported anonymously. Your
identity will not be revealed in the final manuscript.

There are no anticipated risks, compensation or other direct benefits to you as a
participant in this survey. You are free to withdraw your consent to participate, and may
discontinue your participation in this study at any time without consequence.

If you have any questions about this research protocol, please contact me at (352) 377-
1076; Email: stmutschlecner@excite.com, or my faculty supervisor, Dr. Timothy S.
Brophy, at (352) 392-0223 ext 222; Email: tbrophy@arts.ufl.edu. Questions or concerns
about your rights as a research participant may be directed to the UFIRB office,
University of Florida, box 112250, Gainesville, FL 32611; ph. (352) 392-0433.

Your test results will be anonymously reported in my dissertation for partial fulfillment
of the requirements for the degree of Doctor of Philosophy. Participating in the test
implies your consent and that you have read the statement below:

I have read the procedure described above for the Diagnostic Test of Cello Technique. I
voluntarily agree to participate in the test and I have received a copy of this description.


Printed Name

Signature_ Date
Approved by
University of Florida
Institutional Review Board 02
Protocol # 2006-U-0965
For Use Through 10/25/2007









APPENDIX E
THE WRITTEN TEST

The Diagnostic Test of Cello Technique

Written Test


Timothy M. Mutschlecner



* Fingerboard Geography

* Interval Identification

* Pitch Location and Fingering

* Single-Position Fingering

* Bass, Treble, and Tenor Clef Note Identification





STUDENT'S NAME








Fingerboard Geography
Fill out this representation of the cello fingerboard with the appropriate
pitch name in each circle. Note: The interval between any adjacent
circles on a string is a half step.

[Fingerboard diagram with open strings C, G, D, A and an example answer provided; graphic not reproduced]










Interval Identification


Identify each interval using the following abbreviations: m 2nd, M 2nd, m 3rd, M 3rd, P 4th, Aug. 4th/Dim. 5th, P 5th, m 6th, M 6th, m 7th, M 7th, P 8ve.
(m = minor, M = Major, Aug. = Augmented, Dim. = Diminished, P = Perfect)










Interval Identification (continued)












Pitch Location and Fingering


Write the note name in the outlined circles and provide a fingering that
would allow the notes to be played in a single position. Use the
reference pitch to determine location on the fingerboard. The interval
between any adjacent circles on a string is a half step.

[Fingerboard diagrams (open strings C, G, D, A) with a reference pitch marked on each; graphics not reproduced]








Pitch Location and Fingering (continued)


[Fingerboard diagrams continued; graphics not reproduced]









Single-Position Fingering

Indicate the fingering that would allow the four notes in each example
to be played in a single position. Use no open strings and no extensions
(stretches between 1 and 2).











#1 [notation not reproduced]

#2 [notation not reproduced]

#3 [notation not reproduced]

#4 [notation not reproduced]








Single-Position Fingering (continued)


[Additional examples; notation not reproduced]









Bass, Treble and Tenor Clef Note Identification





Write the correct pitch name under each note.


[Notes in bass, treble, and tenor clef; notation not reproduced]










APPENDIX F
THE WRITTEN TEST EVALUATION FORM


Adjudicator's Code


Student's Name

Grade Level

Degree Program

Audition Day _


Audition Time


Test Section (Total Points / Student's Score)

Fingerboard Geography: 11 points
Interval Identification: 8 points
Pitch Location and Fingering: 32 points
Single Position Fingering: 32 points
Bass, Treble, and Tenor Clef Note Identification: 12 points

Total Possible Score: 95

Total Student's Score and %







APPENDIX G
THE PLAYING TEST

THE DIAGNOSTIC TEST OF CELLO TECHNIQUE

PLAYING TEST

Contents:

1. Left Hand Technique

Scales
Arpeggios
Broken Thirds
Double Stops
Position Changes
Arpeggiated Chords Across Three or Four Strings
Thumb Position
Vibrato
Intonation

2. Basic Bowing Strokes

Slurred Legato
Détaché/Accentuated Détaché
Martelé
Portato
Staccato/Slurred Staccato
Spiccato/Flying Spiccato
Sautillé
Pizzicato









The Diagnostic Test of Cello Technique


For the Student:

The following is a series of excerpts primarily taken from the standard solo repertoire for

cello. The selections are chosen as representative of certain technical skills or bowing styles.

Part one (Left Hand Technique) contains passages which demonstrate: Scales, Arpeggios,

Broken Thirds, Double Stops, Position Changes, Arpeggiated Chords Across Three or Four

Strings, Thumb Position, Vibrato, and Intonation. Part two (Basic Bowing Strokes) features

passages which demonstrate essential bowing styles: Slurred Legato, Détaché/Accentuated Détaché, Martelé, Portato, Staccato/Slurred Staccato, Spiccato/Flying Spiccato, Sautillé, and Pizzicato. Short definitions of terms and explanations of how to execute bowings are provided. Metronome markings are given as examples of typical performance speeds. Make it your goal to perform these excerpts at the tempos indicated, but not at the expense of pitch or rhythmic

accuracy.

This test is designed to help determine a player's level of competency in specific areas of

technique. It is not a measure of sight-reading ability; what the examiner wishes to see and hear

is a demonstration of a particular skill, such as playing in thumb positions or legato bowing.

Focus on demonstrating this aspect of the excerpt to the best of your ability. If what is being

asked for is unclear to you, please ask the examiner for additional clarification.









The Diagnostic Test of Cello Technique


Section Two: Playing Examination

Part One: Left Hand Technique




Scales

[Excerpt from Concerto in G Minor for Two Cellos, RV 531, by A. Vivaldi, 1st movement]
Allegro (MM = 106)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Danse Rustique, Op. 20, No. 5 by W.H. Squire]
Allegro (MM = 98)
[musical notation not reproduced]
© Stainer & Bell Limited

Scales (continued)

[Excerpt from Sonata Op. 69, in A Major by L. van Beethoven, 1st movement]
Allegro, ma non tanto (MM = 65)
[musical notation not reproduced]
© G. Schirmer Inc.

[Excerpt from Concerto in Bb Major by L. Boccherini/Grützmacher, 1st movement]
Allegro moderato (MM = 58)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Élégie, Op. 24 by G. Fauré]
Molto Adagio (MM = 84)
[musical notation not reproduced]
© Alfred Publishing













Arpeggios

[Excerpt from Etude, Op. 120, No. 13, by F. Dotzauer]
Allegro
[musical notation not reproduced]
© Carl Fischer, LLC

[Excerpt from Sonata in G Major, by G.B. Sammartini, 1st movement]
Allegro non troppo (MM = 80)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Fantasy Pieces, Op. 73 by R. Schumann]
Zart und mit Ausdruck
[musical notation not reproduced]
© International Music Company












Broken Thirds

[Excerpt from The Ivan Galamian Scale System for Violoncello, arr. by Hans Jørgen Jensen]
MM = 60
[musical notation not reproduced]
© ECS Publishing

[Excerpt from Concerto in D Major, Op. 101, by J. Haydn, 1st movement]
Allegro moderato (MM = 106)
[musical notation not reproduced]
Haydn/Gendron CONCERTO FOR VIOLONCELLO IN D MAJOR, OP. 101
© 1954 Schott Music
All rights reserved
Used by permission of European American Music Distributors LLC
Sole U.S. and Canadian agent for Schott Music













Double Stops

[Excerpt from Sonata in G Major by G.B. Sammartini, 1st movement]
a tempo (MM = 76)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Suite No. 3: Allemande, by J.S. Bach]
MM = 106
[musical notation not reproduced]
© Bärenreiter Music Corporation













Double Stops (continued)

[Excerpt from Concerto in C Major, Hob. VIIb:1 by J. Haydn, 3rd movement]
Allegro molto (MM = 152-162)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Concerto in B Minor, Op. 104 by A. Dvorak, 2nd movement]
freely (MM = 90-102), Quasi Cadenza
[musical notation not reproduced]
© International Music Company















Position Changes

[Excerpt from Concerto in G Major, Op. 65, No. 4 by G. Golterman, 3rd movement]
MM = 152
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from The Swan, by C. Saint-Saëns]
[musical notation not reproduced]
© Alfred Publishing
















Position Changes (continued)

[Excerpt from Sonata No. 1 in E Minor, Op. 38 by J. Brahms, 1st movement]
Allegro non troppo (MM = 56)
[musical notation not reproduced]
© International Music Company

[Excerpt from An Organized Method of String Playing, by Janos Starker, p. 33]
Played all on the D string except the last 3 measures.
MM = 120
[musical notation not reproduced]
© 1965 Peer International Corporation
















Arpeggiated Chords Across Three or Four Strings

[Excerpt from Concerto in G Major, Op. 65, No. 4 by G. Golterman, 3rd movement]
Più animato (MM = 160)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Sheherazade, Op. 35 by N. Rimsky-Korsakov, 1st movement]
MM = 52
[musical notation not reproduced]
© Edwin F. Kalmus & Company Incorporated















Arpeggiated Chords (continued)

[Excerpt from Concerto in Bb Major by L. Boccherini/Grützmacher, 1st movement]
Allegro moderato (MM = 90)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Concerto in E Minor, Op. 85 by E. Elgar, 4th movement]
Allegro (MM = 112-120)
[musical notation not reproduced]
© Novello and Company Limited















Thumb Position

[Excerpt from Concerto No. 2 in D Major by J. B. Breval, Rondo]
Allegretto (MM = 116)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Concerto in Bb Major, by L. Boccherini/Grützmacher, 1st movement]
Allegro moderato (MM = 62)
[musical notation not reproduced]
© Alfred Publishing














Thumb Position (continued)

[Excerpt from Sonata in A Major, Op. 69 by L. van Beethoven, 3rd movement]
Allegro vivace (MM = 80)
[musical notation not reproduced]
© G. Schirmer Inc.

[Excerpt from Scherzo, Op. 12 by D. van Goens]
Vivace molto e con spirito (MM = 140-160)
[musical notation not reproduced]
© Alfred Publishing













Vibrato

[Excerpt from Chanson Triste, Op. 40, No. 2, by P. I. Tchaikovsky]
Allegro non troppo (MM = 86)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Sonata in G Minor by H. Eccles, 1st movement]
Largo
[musical notation not reproduced]
© Alfred Publishing














Vibrato (continued)

[Excerpt from Élégie, Op. 24 by G. Fauré]
Molto Adagio (MM = 70-74)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Concerto in B Minor, Op. 104 by A. Dvorak, 1st movement]
a tempo
[musical notation not reproduced]
© International Music Company















Intonation

[Excerpt from Arioso (from Cantata 156), by J.S. Bach]
Adagio (MM = 58)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Élégie, Op. 24 by G. Fauré]
a tempo (MM = 70-74)
[musical notation not reproduced]
© Alfred Publishing
















Intonation (continued)

[Excerpt from Suite No. 5 in C Minor, BWV 1011 by J.S. Bach, Sarabande]
Sarabande
[musical notation not reproduced]

[Excerpt from Sonata in D Minor, Op. 40 by D. Shostakovich, 1st movement]
Allegro ma non troppo (MM = 138)
[musical notation not reproduced]
Administered by G. Schirmer Inc.














Part 2: Basic Bowing Strokes

Slurred Legato

Execution: Smoothly connected; groups of notes phrased as smoothly as possible.

[Excerpt from Suite No. 1: Allemande, by J. S. Bach]
Allemande (MM = 54)
[musical notation not reproduced]
© Bärenreiter Music Corporation

[Excerpt from Sonata in A Major, by C. Franck, 1st movement]
MM = 60
[musical notation not reproduced]
© International Music Company















Slurred Legato (continued)

[Excerpt from Concerto in D Minor, by E. Lalo, 2nd movement]
Andante con moto (MM = 40-44); Allegro presto
[musical notation not reproduced]
© International Music Company












Détaché

Execution: An active, yet smooth bow stroke with no visible or audible accent.

[Excerpt from Sonata in E Minor, Op. 1, No. 2, by B. Marcello, 2nd movement]
Allegro (MM = 90)
[musical notation not reproduced]
© Alfred Publishing

Accentuated Détaché

Execution: Bow changes are not concealed, but emphasized with accents. One hears the articulation of the bow changes.

[Excerpt from Sonata in C Major, Op. 102, No. 1 by L. van Beethoven, 4th movement]
Allegro vivace (MM = 110)
[musical notation not reproduced]
© G. Schirmer Inc.













Martelé (Marcato)

Execution: A fast, well-articulated, heavy, separate stroke resembling a sforzando, or pressed accent. The weight is pressed into the string like in the staccato, but is heavier and more accented. The notes are separated with a momentary stop after each note.

[Excerpt from Sonata in E Minor, No. 5, by A. Vivaldi, 2nd movement]
Allegro (MM = 102), détaché then martelé
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Toccata by G. Frescobaldi, arr. by Cassadó]
Allegro giusto (MM = 102-108)
[musical notation not reproduced]
Cassadó TOCCATA
© 1925 by Universal Edition, © renewed
All Rights Reserved
Used by permission of European American Music Distributors LLC
U.S. and Canadian agent for Universal Edition














Portato (Louré)

Execution: The Portato is a series of broad strokes played in one bow with a smooth, slightly separated sound between each note. The bow does not stop as in the slurred staccato. Each note is to be clearly enunciated with a slight pressure or "nudge" from the index finger and upper forearm.

[Excerpt from Sonata in D Minor, by C. Debussy, Prologue]
MM = 52
[musical notation not reproduced]
© 1915 Editions Durand and Cie

[Excerpt from Sonata in E Minor, Op. 38, No. 1, by J. Brahms, 1st movement]
Allegro non troppo (MM = 56), espress. legato
[musical notation not reproduced]
© International Music Company












Staccato

Execution: The staccato is a short, stiff, on-the-string stroke. Press the bow into the string and keep the weight in the string while drawing the bow. Stop after each note, releasing the pressure; repeat for each note.

[Excerpt from Sonata in G Minor, No. 3 by J.S. Bach, 3rd movement]
Allegro (MM = 66)
[musical notation not reproduced]
© 1979 Peer International Corporation

[Excerpt from Sonata in G Minor, by H. Eccles, 2nd movement]
Corrente: Allegro con spirito
[musical notation not reproduced]
© Alfred Publishing















Slurred Staccato

Execution: Two or more staccato notes to a bow. Press into the string, release the pressure after drawing the bow, and quickly draw the next note. The bow stops after each stroke.

[Excerpt from Sonata in E Minor by B. Marcello, 4th movement]
Allegretto (MM = 126)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Sonata in E Minor, Op. 38, No. 1 by J. Brahms, 2nd movement]
Allegretto quasi Minuetto (MM = 134)
[musical notation not reproduced]
© International Music Company
















Spiccato

Execution: A controlled, bounced-bow stroke; the bow begins above the string and is thrown on the string with a swinging follow-through arm motion. The bow describes an inverted arch one or two inches above the string. As the bow bounces up after striking the string it is held by a light but firm grip. Slower spiccato is played near the frog with the entire arm; faster, lighter spiccato is played further out on the bow with the fingers and wrist initiating the movement.

[Excerpt from Sonata in G Minor, Op. 5, No. 2 by L. van Beethoven, 3rd movement]
Rondo: Allegro (MM = 69)
[musical notation not reproduced]
© G. Schirmer Inc.

[Excerpt from Concerto in Bb Major, by L. Boccherini, 3rd movement]
Rondo: Allegro (MM = 144)
[musical notation not reproduced]
© Alfred Publishing















Flying Spiccato (sometimes called Flying Staccato)

Execution: A rapid series of notes played up-bow with a bounced bow stroke. Start from on the string and allow the bow to spring slightly from the string.

[Excerpt from Gavotte in D Major by D. Popper]
Lively (MM = 84)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Allegro Appassionato, Op. 43 by C. Saint-Saëns]
Allegro (MM = 120)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Sonata in D Major by P. Locatelli, 1st movement]
Allegro (MM = 116)
[musical notation not reproduced]
© G. Schirmer Inc.
















Sautillé (sometimes called bounced bow or uncontrolled spiccato)

Execution: The Sautillé is a stroke that uses a rapid, natural rebounding of the bow centered around the balance point. It is played much like a rapid détaché but with primary movement from the wrist and hand, with pressure being applied with the first finger only, using a light bow hold. Allow the bow to bounce off the string of its own momentum.

[Excerpt from Etude, Op. 44, No. 5 by C. Schröder]
Allegro vivace (MM = 120)
[musical notation not reproduced]
© Carl Fischer, LLC

[Excerpt from Hungarian Rhapsody, Op. 68 by D. Popper]
Presto (MM = 132)
[musical notation not reproduced]
© G. Schirmer Inc.

















Sautillé (continued)

[Excerpt from Scherzo, Op. 12 by D. van Goens]
Vivace molto e con spirito (MM = 140-160)
[musical notation not reproduced]
© Alfred Publishing

[Excerpt from Concerto in E Minor, Op. 85, by E. Elgar, 2nd movement]
Allegro molto
[musical notation not reproduced]
© Novello and Company Limited















Pizzicato

Execution: Use of thumb for "strummed" chordal passages in the Elgar excerpt.

[Excerpt from Concerto in E Minor, Op. 85 by E. Elgar, 2nd movement]
Recit., freely (MM = 102-140); Lento; Allegro molto
[musical notation not reproduced]
© Novello and Company Limited

[Excerpt from Sonata in F Major, Op. 99 by J. Brahms, 2nd movement]
Adagio affettuoso (MM = 90-110)
[musical notation not reproduced]
Brahms SONATA FOR PIANO AND VIOLONCELLO, OP. 99, NO. 2
© 1973 by Wiener Urtext Edition
All Rights Reserved
Used by permission of European American Music Distributors LLC
U.S. and Canadian agent for Wiener Urtext Edition













Pizzicato (continued)

[Excerpt from Concerto in G Minor, Op. 49 by D. Kabalevsky, 1st movement]
[musical notation not reproduced]
© International Music Company
Administered by G. Schirmer Inc.









APPENDIX H
THE PLAYING TEST EVALUATION FORM

Student's Name Adjudicator's Code
Grade Level
Degree Program
Audition Day Audition Time

Part One: Left Hand Technique

Scales The student's playing of scales exhibits:
(Check All that Apply, worth 2 points each)
[ ] 95% accurate whole and half steps.
[ ] evenly divided bow distribution.
[ ] steady tempo.
[ ] effortless position changes.
[ ] smooth string crossings.
Observations/Comments:



Arpeggios The student's playing of arpeggios demonstrates:
(Check All that Apply, worth 2 points each)
[ ] mostly accurate intonation.
[ ] smooth connections of positions.
[ ] little audible sliding between notes.
[ ] clean string crossings.
[ ] a steady and consistent tempo.
Observations/Comments:



Broken Thirds The student's playing of broken thirds:
(check One only)
[ ] 10 demonstrates the highest level of competency.
[ ] 8 shows a high degree of experience, with only minor performance flaws.
[ ] 6 indicates a moderate degree of competence or experience.
[ ] 4 is tentative and faltering with some pitch and/or intonation errors.
[ ] 2 is undeveloped and results in many inaccurate pitches and out-of-tune notes.
Observations/Comments:










Double Stops The student's playing of double stops features:
(Check All that Apply, worth 2 points each)
□ consistently good intonation with all intervals.
□ a clear, unscratchy tone.
□ the clean setting and releasing of fingers when playing double stops.
□ even bow-weight distribution on two strings.
□ the ability to vibrate on two strings simultaneously.
Observations/Comments:



Position Changes The student's technique of changing positions:
(Check One only)
□ 10 demonstrates well-prepared, smooth shifting between notes, without interrupting
the melodic line or creating a break between notes.
□ 8 shows smooth shifting and an uninterrupted melodic line, but includes excessive
audible slides.
□ 6 indicates experience with position changes, but includes some sudden, jerky
motions when shifting and several audible slides.
□ 4 indicates some experience with shifting, but position changes are often jerky,
unprepared, or filled with audible slides.
□ 2 exhibits unprepared and inaccurate shifting. Sliding between notes is often
heard, and hand/arm motions are jerky.
Observations/Comments:



Arpeggiated Chords The student's playing of arpeggiated chords exhibits:
(Check All that Apply, worth 2 points each)
□ coordinated action between the left hand and bow arm.
□ even string crossings, with steady rhythm.
□ an ease in preparing chordal fingering patterns.
□ clear tone on all strings.
□ graceful, fluid motion.
Observations/Comments:



Thumb Position The student's playing of thumb position reveals that:
(Check All that Apply, worth 2 points each)
□ the thumb rests on two strings and remains perpendicular to the strings.
□ the fingers stay curved and don't collapse while playing.
□ correct finger spacing is consistently used.
□ there is an ease of changing from string to string.
□ the arm and wrist support the thumb and fingers rather than resting on the side of
the cello.
Observations/Comments:










Vibrato The student's vibrato:
(Check One only)
□ 10 is full, rich, even, and continuous. It is used consistently throughout the
fingerboard.
□ 8 is full and rich, but occasionally interrupted due to fingering/position changes.
□ 6 is mostly utilized, but is irregular in its width or speed and lacks continuity
throughout the fingerboard. Excessive tension is apparent in the vibrato.
□ 4 is demonstrated, but in a tense, irregular way. It is not used consistently by all
fingers in all positions. Vibrato width/speed may be inappropriate.
□ 2 is demonstrated marginally with a tense, uneven application. Vibrato is
inconsistently used and lacks appropriate width/speed.
Observations/Comments:



Intonation The student's intonation:
(Check One only)
□ 10 is accurate throughout on all strings and in all positions.
□ 8 is accurate, demonstrating minimal intonation difficulties, with occasional lack of pitch
correction.
□ 6 is mostly accurate, but includes out of tune notes resulting from half-step
inaccuracies, inaccurate shifting, or incorrect spacing of fingers.
□ 4 exhibits a basic sense of intonation, yet has frequent errors of pitch accuracy and
often doesn't find the pitch center.
□ 2 is not accurate. Student plays out of tune the majority of the time.
Observations/Comments:



Part Two: Basic Bowing Strokes

Slurred Legato The student's legato bow stroke:
(Check One only)
□ 10 is smoothly connected with no perceptible interruption between notes.
□ 8 is smooth, but has some breaks within phrases.
□ 6 includes some disconnected notes and detached bowing.
□ 4 shows breaks within phrases and is often not smoothly connected.
□ 2 exhibits little skill in smooth bowing. Bowing has many interruptions between
notes.
Observations/Comments:










Détaché/Accentuated Détaché The student's détaché bow stroke is:
(Check One only)
□ 10 vigorous and active, played on the string. Accentuated détaché features more
strongly accented attacks.
□ 8 vigorous and active, but occasionally lacking articulation or bow control.
□ 6 moderately active, but lacking articulation or suffering from too much
accentuation.
□ 4 not making sufficient contact with the string, or else producing a scratchy sound.
□ 2 undeveloped, and lacking the control to produce a consistent, vigorous sound.
Observations/Comments:



Martelé The student's playing of martelé bowing features:
(Check All that Apply, worth 2 points each)
□ a fast, sharply accentuated bow stroke.
□ a heavy, separate stroke resembling a sforzando.
□ bow pressure being applied before the bow is set in motion.
□ the bow being stopped after each note.
□ great initial speed and pressure, with a quick reduction of both.
Observations/Comments:



Portato The student's use of portato bowing demonstrates:
(Check All that Apply, worth 2 points each)
□ a slightly separated legato bow stroke.
□ the pressure of the index finger being applied to pulse each note within a slur.
□ an enunciation of each note through a slight change of bow pressure/speed.
□ the bow not stopping between notes.
□ notes being articulated without lifting the bow from the string.
Observations/Comments:



Staccato/Slurred Staccato The student's playing of staccato:
(Check One only)
□ 10 is crisp and well-articulated, with the bow stopping after each note.
□ 8 demonstrates a high level of mastery, with minor flaws in execution.
□ 6 shows a moderate level of attainment.
□ 4 reveals only a limited amount of bow control.
□ 2 does not demonstrate the ability to execute these strokes.
Observations/Comments:










Spiccato/Flying Spiccato The student's playing of spiccato indicates:
(Check All that Apply, worth 2 points each)
□ a bounced-bow stroke with good control of the bow's rebound off the string.
□ good tone production through control of bow pressure and speed.
□ the bow springing lightly from the string.
□ notes being individually activated.
□ even use of bow distribution (Flying Spiccato excerpts).
Observations/Comments:



Sautillé The student's use of sautillé bowing demonstrates:
(Check All that Apply, worth 2 points each)
□ a rapid, natural rebounding of the bow.
□ a primary movement initiated from the wrist and hand, using a light bow hold.
□ the bow's contact with the string centered around the balance point of the bow.
□ a tempo fast enough for the bow to continue to bounce of its own momentum.
□ use of the resilience of the bow stick to allow the bow to spring off the string.
Observations/Comments:



Pizzicato The student's playing of pizzicato illustrates:
(Check All that Apply, worth 2 points each)
□ confidently played arpeggiated chords, using the thumb.
□ strong, vibrant tone (as demonstrated in the Brahms excerpt).
□ clear ringing sound in the upper register (as in the Kabalevsky excerpt).
□ an absence of snapping sounds caused by pulling the string at too steep
an angle.
□ an absence of buzzing or dull, thudding tones due to inadequate setting of the
left-hand fingers.
Observations/Comments:
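
The marks on this form reduce to a single playing test total: each "Check All that Apply" item awards 2 points per checked descriptor (five descriptors, 10 points maximum), and each "Check One only" item contributes the single selected value (10, 8, 6, 4, or 2), so the seventeen items yield a maximum of 170 points. The short Python sketch below is offered only as an illustration of that arithmetic under the item groupings shown above; it is not part of the dissertation's test materials, and the function names are illustrative.

    # Illustrative tally of the playing test evaluation form (not part of the test itself).
    # "Check all" items score 2 points per checked descriptor; "check one" items score
    # the single selected value (10, 8, 6, 4, or 2).
    CHECK_ALL = {"Scales", "Arpeggios", "Double Stops", "Arpeggiated Chords",
                 "Thumb Position", "Martele", "Portato", "Spiccato",
                 "Sautille", "Pizzicato"}
    CHECK_ONE = {"Broken Thirds", "Position Changes", "Vibrato", "Intonation",
                 "Slurred Legato", "Detache", "Staccato"}

    def item_score(item, mark):
        """mark = number of checked descriptors (0-5) for check-all items,
        or the selected value (2, 4, 6, 8, 10) for check-one items."""
        if item in CHECK_ALL:
            return 2 * mark
        if item in CHECK_ONE:
            return mark
        raise ValueError(f"unknown item: {item}")

    def total_score(marks):
        """marks maps item names to the adjudicator's marks; the full form has 17 items."""
        return sum(item_score(item, mark) for item, mark in marks.items())

    # Example: 4 of 5 descriptors checked for Scales (8 points) and a selected 10
    # for Vibrato contribute 18 points toward the 170-point maximum.
    print(total_score({"Scales": 4, "Vibrato": 10}))   # 18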












APPENDIX I
REPERTOIRE USED IN THE PLAYING TEST


Composer                      Piece                                                  Technique/Bow Stroke

Bach, J.S.                    Arioso (from Cantata 156)                              Intonation
                              Sonata in G Minor, No. 3, 3rd mvt.                     Staccato
                              Suite No. 1 in G Major, Allemande                      Slurred Legato
                              Suite No. 3 in C Major, Allemande                      Double Stops
                              Suite No. 5 in C Minor, Sarabande                      Intonation

Boccherini, L./Grutzmacher    Concerto in Bb Major, 1st mvt.                         Scales
                              Concerto in Bb Major, 1st mvt.                         Arpeggiated Chords
                              Concerto in Bb Major, 1st mvt.                         Thumb Position
                              Concerto in Bb Major, 3rd mvt.                         Spiccato

Beethoven, L. van             Sonata in G Minor, Op. 5, No. 2, 3rd mvt.              Spiccato
                              Sonata Op. 69 in A Major, 1st mvt.                     Scales
                              Sonata Op. 69 in A Major, 3rd mvt.                     Thumb Position
                              Sonata in C Major, Op. 102, No. 1, 3rd mvt.            Accentuated Détaché

Brahms, J.                    Sonata No. 1 in E Minor, Op. 38, 1st mvt.              Position Changes
                              Sonata No. 1 in E Minor, Op. 38, 1st mvt.              Portato
                              Sonata No. 1 in E Minor, Op. 38, 2nd mvt.              Slurred Staccato
                              Sonata No. 2 in F Major, Op. 99, 2nd mvt.              Pizzicato

Breval, J. B.                 Concerto No. 2 in D Major, Rondo                       Thumb Position

Debussy, C.                   Sonata in D Minor, Prologue                            Portato

Dotzauer                      Etude Op. 20, No. 13                                   Arpeggios

Dvorak, A.                    Concerto in B Minor, Op. 104, 1st mvt.                 Vibrato
                              Concerto in B Minor, Op. 104, 2nd mvt.                 Double Stops

Eccles, H.                    Sonata in G Minor, 1st mvt.                            Vibrato
                              Sonata in G Minor, 2nd mvt.                            Staccato

Elgar, E.                     Concerto in E Minor, Op. 85, 2nd mvt.                  Pizzicato
                              Concerto in E Minor, Op. 85, 2nd mvt.                  Sautillé
                              Concerto in E Minor, Op. 85, 4th mvt.                  Arpeggiated Chords

Faure, G.                     Élégie, Op. 24                                         Scales
                              Élégie, Op. 24                                         Vibrato
                              Élégie, Op. 24                                         Intonation

Franck, C.                    Sonata in A Major, 1st mvt.                            Slurred Legato

Frescobaldi, G.               Toccata                                                Martelé

Goens, D. van                 Scherzo, Op. 12                                        Sautillé
                              Scherzo, Op. 12                                        Thumb Position

Golterman, G.                 Concerto in G Major, Op. 65, No. 4, 3rd mvt.           Position Changes
                              Concerto in G Major, Op. 65, No. 4, 3rd mvt.           Arpeggiated Chords

Haydn, J.                     Concerto in C Major, Hob. VIIb. 1, 3rd mvt.            Double Stops
                              Concerto in D Major, Op. 101, 1st mvt.                 Broken Thirds

Jensen, H. J.                 The Ivan Galamian Scale System for Violoncello         Broken Thirds

Kabalevsky, D. B.             Concerto in G Minor, Op. 49, 1st mvt.                  Pizzicato

Lalo, E.                      Concerto in D Minor, 2nd mvt.                          Slurred Legato

Locatelli, P.                 Sonata in D Major, 1st mvt.                            Flying Spiccato

Marcello, B.                  Sonata in E Minor, Op. 1, No. 2, 2nd mvt.              Détaché
                              Sonata in E Minor, Op. 1, No. 2, 4th mvt.              Slurred Staccato

Popper, D.                    Gavotte in D Major                                     Flying Spiccato
                              Hungarian Rhapsody, Op. 68                             Sautillé

Rimsky-Korsakov, N.           Scheherazade, Op. 35, 1st mvt.                         Arpeggiated Chords

Saint-Saens, C.               Allegro Appassionato, Op. 43                           Flying Spiccato
                              The Swan                                               Position Changes

Sammartini, G. B.             Sonata in G Major, 1st mvt.                            Arpeggios
                              Sonata in G Major, 1st mvt.                            Double Stops

Schroder, C.                  Etude, Op. 44, No. 5                                   Sautillé

Shostakovich, D.              Sonata in D Minor, Op. 40, 1st mvt.                    Intonation

Squire, W.H.                  Danse Rustique, Op. 20, No. 5                          Scales

Starker, J.                   An Organized Method of String Playing (p. 33)          Position Changes

Schumann, R.                  Fantasy Pieces, Op. 73, 1st mvt.                       Arpeggios

Tchaikovsky, P. I.            Chanson Triste, Op. 40, No. 2                          Vibrato

Vivaldi, A.                   Concerto in G Minor for 2 Cellos, RV 531, 1st mvt.     Scales
                              Sonata in E Minor, No. 5, 2nd mvt.                     Martelé










APPENDIX J
THE STUDENT SELF-ASSESSMENT PROFILE


Name


Status (year/college)


Major


Minor


Years of study on the Cello


Other instrument(s) played


Repertoire previously studied:

Methods/Etudes


Solo Literature




Orchestral Experience:



How interested are you in each of these areas of performance?

I am interested in solo performance.
o Strongly agree o Agree o Disagree o Strongly disagree

I am interested in chamber music performance.
o Strongly agree o Agree o Disagree o Strongly disagree

I am interested in orchestral performance.
o Strongly agree o Agree o Disagree o Strongly disagree

Other areas of performance interest?


What are your personal goals for studying the cello?


What areas of cello technique do you feel you need the most work on?









Summarize your goals in music and what you need to do to accomplish these goals.
6 months:

1 year:

2 years:

4 years:

10 years:










APPENDIX K
DESCRIPTIVE STATISTICS FOR RAW DATA


Table K-1. Raw Scores of the Written Test Items, Composite Means, and Standard Deviations

Student   Fingerboard   Interval   Pitch      Total
          Geography     Id.        Location   Score
1         11            7          30         92
2         11            7          30         87
3         11            7          32         93
4         11            7          31         91
5         11            8          32         95
6         11            7          0          59
7         5             3          16         59
8         11            6          32         93
9         11            6          29         90
10        11            8          29         92
11        11            5          31         91
12        11            4          12         68
13        11            8          22         65
14        11            5          31         90
15        11            7          29         86
16        11            3          2          59
17        11            7          32         88
18        11            8          32         94
19        11            8          32         95
20        11            8          31         92
21        11            3          30         85
22        11            6          14         73
23        11            5          28         86
24        11            6          32         87
25        11            5          27         86
26        11            8          32         93
27        11            8          31         94
28        11            8          32         95
29        11            6          29         81
30        11            8          31         94
M         10.80         6.40       26.70      85.20
SD        1.10          1.63       8.82       11.38

Single-Pos. Fingering: M = 29.80, SD = 4.11; recovered individual scores: 32, 23, 32 [remaining scores illegible in source].
Note Id.: M = 11.57, SD = 0.90; recovered individual scores: 12, 12, 12 [remaining scores illegible in source].










Table K-2. Raw Score, Percent Score, Frequency Distribution, Z Score, and Percentile Rank of
Written Test Scores



Raw Score   Percent Score   Frequency   Z Score   Percentile Rank


59 62.00 2 -2.30 1.67
62 66.00 1 -2.04 8.33
65 68.00 1 -1.78 11.67
68 72.00 1 -1.51 15.00
73 77.00 1 -1.07 18.33
81 86.00 1 -0.37 21.67
85 89.00 1 -0.02 25.00
86 91.00 3 .07 28.33
87 92.00 2 .16 38.33
88 92.00 1 .25 45.00
90 95.00 2 .42 48.33
91 96.00 2 .51 55.00
92 97.00 3 .60 61.67
93 98.00 3 .69 71.67
94 99.00 3 .77 81.67
95 100.00 3 .86 91.67
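
For readers checking the arithmetic, the z scores above follow directly from the written test composite mean and standard deviation in Table K-1 (M = 85.20, SD = 11.38), and the listed percentile ranks are consistent with taking (cumulative frequency below the score + 0.5) divided by N and multiplied by 100, with N = 30. The brief Python sketch below reproduces the table's values; it is offered only as a reading aid and is not part of the original analysis.

    # Reading aid for Table K-2: z = (raw - M) / SD, with M and SD from Table K-1;
    # percentile rank = (cumulative frequency below the score + 0.5) / N * 100.
    M, SD, N = 85.20, 11.38, 30
    score_freq = [(59, 2), (62, 1), (65, 1), (68, 1), (73, 1), (81, 1), (85, 1), (86, 3),
                  (87, 2), (88, 1), (90, 2), (91, 2), (92, 3), (93, 3), (94, 3), (95, 3)]

    below = 0
    for raw, freq in score_freq:
        z = (raw - M) / SD
        percentile = (below + 0.5) / N * 100
        print(f"{raw:>3}  z = {z:6.2f}  percentile rank = {percentile:6.2f}")
        below += freq
    # The first row prints 59, z = -2.30, percentile rank = 1.67, and the last row
    # prints 95, z = 0.86, percentile rank = 91.67, matching the table above.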










Table K-3. Raw Scores of the Playing Test Items, Composite Means, and Standard Deviations

Student   Scales   Arpeggios   Broken Thirds   Double Stops   Position Changes   Arpeggiated Chords


1 10 10 8 10 10 10
2 10 10 8 8 6 6
3 10 10 8 8 10 10
4 10 8 10 8 8 10
5 8 8 6 6 8 10
6 8 10 10 8 8 8
7 10 10 8 8 10 8
8 8 10 8 6 8 8
9 8 10 8 4 8 6
10 8 10 8 8 10 8
11 6 8 8 8 10 4
12 8 10 8 6 10 10
13 8 8 6 8 6 8
14 6 6 6 4 6 6
15 6 6 6 4 6 8
16 6 4 4 4 6 6
17 6 6 6 8 6 2
18 8 8 4 6 4 4
19 8 8 8 8 8 10
20 6 10 6 4 8 8
21 4 6 6 6 10 8
22 6 6 4 4 6 4
23 0 2 2 2 6 2
24 6 6 4 4 4 4
25 8 10 6 4 6 2
26 8 8 6 8 8 8
27 8 8 8 8 10 10
28 10 10 8 6 8 10
29 10 8 8 8 8 10
30 10 10 8 8 8 8
M 7.6 8.13 6.8 6.4 7.67 7.2
SD 2.19 2.10 1.86 1.99 1.83 2.66
(Table K-3 continues on next page)










Table K-3. (continued)


Student   Thumb Position   Vibrato   Intonation   Slurred Legato   Détaché   Martelé

1 8 10 8 8 8 8
2 6 8 8 8 8 10
3 10 10 6 10 10 10
4 10 8 8 10 8 4
5 10 10 8 10 8 10
6 8 8 6 10 8 8
7 6 10 6 10 10 10
8 4 8 6 10 8 6
9 8 4 6 4 10 6
10 6 10 8 10 10 10
11 6 8 6 8 10 8
12 6 10 8 8 8 8
13 8 10 6 8 10 2
14 6 4 4 8 8 4
15 6 8 4 6 6 8
16 4 6 2 8 8 2
17 6 4 4 8 8 2
18 4 6 8 8 8 4
19 6 8 8 8 10 8
20 10 8 6 9 10 10
21 2 8 6 6 2 2
22 8 6 6 4 8 6
23 6 8 4 8 8 4
24 8 10 4 8 10 4
25 6 4 4 4 8 4
26 8 8 6 10 6 6
27 8 10 8 8 10 10
28 10 8 8 10 10 8
29 10 8 6 10 10 8
30 6 10 8 10 8 10
M 7.0 7.93 6.2 8.23 8.47 6.67
SD 2.08 2.00 1.69 1.85 1.72 2.84
(Table K-3 continues on next page)










Table K-3. (concluded)

Student   Portato   Staccato   Spiccato   Sautillé   Pizzicato   Total Score

1 10 8 8 10 8 152
2 8 10 8 4 10 136
3 8 8 10 10 8 156
4 8 8 6 10 10 144
5 8 8 8 4 4 134
6 10 10 8 8 10 146
7 10 10 10 8 8 152
8 10 10 8 6 4 128
9 4 6 8 10 6 116
10 10 10 8 10 8 152
11 6 4 4 4 6 114
12 10 8 8 8 6 140
13 10 6 2 8 6 120
14 10 8 2 4 4 96
15 10 8 8 10 6 116
16 4 6 2 2 2 76
17 10 8 6 8 4 102
18 4 10 6 6 2 100
19 8 10 8 10 10 142
20 10 10 3 8 8 134
21 6 6 2 4 2 86
22 0 6 2 2 4 82
23 8 8 4 0 4 76
24 4 8 8 8 4 104
25 0 6 6 10 4 92
26 10 6 4 10 4 124
27 10 8 10 10 8 152
28 8 8 8 4 6 140
29 10 8 6 10 10 148
30 6 8 8 8 6 140
M 7.67 7.93 6.3 7.13 6.07 123.33
SD 2.97 1.62 2.61 3.00 2.55 25.18
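
Each Total Score in Table K-3 is the sum of that student's 17 item scores across the three parts of the table, and the M and SD rows appear to be the column means and sample standard deviations over the 30 students (the Scales column, for example, reproduces M = 7.6 and SD = 2.19). The minimal Python sketch below checks this arithmetic for student 1 and the Scales column; it is offered only as a reading aid and is not part of the original analysis.

    # Illustrative check of Table K-3: a student's total is the sum of 17 item scores.
    from statistics import mean, stdev

    student_1 = [10, 10, 8, 10, 10, 10,   # Scales through Arpeggiated Chords
                 8, 10, 8, 8, 8, 8,       # Thumb Position through Martele
                 10, 8, 8, 10, 8]         # Portato through Pizzicato
    print(sum(student_1))                 # 152, matching the Total Score column

    # Column statistics reproduce the composite M and SD rows (Scales shown here).
    scales = [10, 10, 10, 10, 8, 8, 10, 8, 8, 8, 6, 8, 8, 6, 6,
              6, 6, 8, 8, 6, 4, 6, 0, 6, 8, 8, 8, 10, 10, 10]
    print(round(mean(scales), 2), round(stdev(scales), 2))   # 7.6 2.19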









LIST OF REFERENCES


Abeles, H.F. (1973). Development and validation of a clarinet performance
adjudication scale. Journal of Research in Music Education, 21, 246-255.

Aristotle. (1943). Politics (B. Jowett, Trans.) (1340b24). New York: Random House.

Aristotle, Nicomachean Ethics, Bk. 2 (1103a26-1103b2), as paraphrased by Durant, W.
(1967). The Story of Philosophy. New York: Simon and Schuster.

Asmus, E.P. & Radocy, R.E. (2006). Quantitative analysis. In R. Colwell (Ed.), MENC
handbook of research methodologies (pp. 95-175). New York: Oxford University
Press.

Bergee, M. J. (1987). An application of the facet-factorial approach to scale
construction in the development of a rating scale for euphonium and tuba music
performance. Doctoral dissertation, University of Kansas.

Berman, J., Jackson, B. & Sarch, K. (1999). Dictionary of bowing and pizzicato terms.
Bloomington, IN: Tichenor Publishing.

Blum, D. (1997). Casals and the art of interpretation. Berkeley and Los Angeles, CA:
University of California Press.

Boyle, J. (1970). The effect of prescribed rhythmical movements on the ability to read
music at sight. Journal of Research in Music Education, 18, 307-308.

Boyle, J. (1992). Evaluation of music ability. In D. Boyle (Ed.), Handbook of research
on music teaching and learning (pp. 247-265). New York: Schirmer Books.

Boyle, J. & Radocy, R.E. (1987). Measurement and evaluation of musical experiences.
New York: Schirmer Books.

Brophy, T. S. (2000). Assessing the developing child musician: A guide for general
music teachers. Chicago: GIA Publications.

Colwell, R. (2006). Assessment's potential in music education. In R. Colwell (Ed.),
MENC handbook of research methodologies (pp. 199-269). New York: Oxford
University Press.

Colwell, R. & Goolsby, T. (1992). The teaching of instrumental music. Englewood
Cliffs, NJ: Prentice Hall.

Eisenberg, M. (1966). Cello playing of today. London: Lavender Publications.









Ekstrom, R., French, J., Harman, H., & Dermen, D. (1976). Kit of factor-referenced
cognitive tests. Princeton: Educational Testing Service.

Elliott, D. J. (1995). Music matters: A new philosophy of music education. New York:
Oxford University Press.

Epperson, G. (2004). The Art of string teaching. Fairfax, VA: American String Teachers
Association with National School Orchestra Association.

Farnum, S. E. (1969). The Farnum string scale. Winona, MN: Hal Leonard.

Gardner, H. (1983). Frames of mind: The theory of multiple intelligences. New York:
Basic Books.

Gillespie, R. (1997). Rating of violin and viola vibrato performance in audio-only and
audiovisual presentations. Journal of Research in Music Education, 45, 212-220.

Gromko, J. E. (2004). Predictors of music sight-reading ability in high school wind
players. Journal of Research in Music Education, 52, 6-15.

Hoover, H., Dunbar, S., Frisbie, D., Oberley, K., Bray, G., Naylor, R., Lewis, J., Ordman,
V., & Qualls, A. (2003). The Iowa tests. Itasca, IL: Riverside.

Jensen, H. J. (1994). The Ivan Galamian scale system for violoncello. Boston, MA: ECS
Publishing.

Jensen, H.J. (1985). The four great families of bowings. Unpublished manuscript,
Northwestern University.

Katz, M. (1973). Selecting an achievement test: Principles and procedures. Princeton:
Educational Testing Services.

Kidd, R.L. (1975). The construction and validation of a scale of trombone performance
skills. Doctoral dissertation, University of Illinois at Urbana-Champaign.

Lehman, P.B. (2000). The power of the national standards for music education. In B.
Reimer (Ed.), Performing with understanding: The challenge of the national
standards for music education (pp. 3-9). Reston, VA: MENC.

Magg, F. (1978). Cello exercises: A comprehensive survey of essential cello technique.
Hillsdale, NY: Mobart Music.

Mooney, R. (1997). Position pieces. Miami, FL: Summy-Birchard Music.

Mutschlecner, T. (2004). The Mutschlecner diagnostic test of cello technique: Pilot
study. Unpublished manuscript, University of Florida.










Mutschlecner, T. (2005). Development and validation of a diagnostic test of cello
technique. Unpublished manuscript, University of Florida.

Reimer, B. (1989). A philosophy of music education. (2nd ed.) Englewood Cliffs,
NJ: Prentice Hall.

Reimer, B. (2003). A philosophy of music education: Advancing the vision. (3rd ed.)
Upper Saddle River, NJ: Pearson Education.

Renwick, J. M. & McPherson, G. E. (2002). Interest and choice: Student-selected
repertoire and its effect on practicing behavior. British Journal of Music
Education, 19(2), 173-188.

Sand, B. L. (2000). Teaching genius: Dorothy DeLay and the making of a musician.
Portland, OR: Amadeus Press.

Saunders, T. C. & Holahan, J. M. (1997). Criteria-specific rating scales in the evaluation
of high-school instrumental performance. Journal of Research in Music
Education, 45, 259-272.

Spiro, R. J., Vispoel, W. P., Schmitz, J. G., Samarapungavan, A., & Boeger, A. E.
(1987). Knowledge acquisition for application: Cognitive flexibility and transfer
in complex content domains. In B. K. Britton & S. M. Glynn (Eds.), Executive
control processes in reading (pp. 177-199). Hillsdale, NJ: Lawrence Erlbaum
Associates.

Starer, R. (1969). Rhythmic training. New York: MCA Music Publishing.

Starker, J. (1965). An organized method of string playing: Violoncello exercises for the
left hand. New York: Peer Southern Concert Music.

Watkins, J., & Farnum, S. (1954). The Watkins-Farnum performance scale.
Milwaukee, WI: Hal Leonard.

Warren, G. E. (1980). Measurement and evaluation of musical behavior. In D. Hodges
(Ed.), Handbook of music psychology (pp. 291-392). Lawrence, KS: National
Association for Music Therapy.

Zdzinski, S.F. (1991). Measurement of solo instrumental music performance:
A review of literature. Bulletin of the Council for Research in Music Education,
no. 109, 47-58.

Zdzinski, S. F., & Barnes, G. V. (2002). Development and validation of a string
performance rating scale. Journal of Research in Music Education, 50,
245-255.









BIOGRAPHICAL SKETCH

Timothy Miles Mutschlecner was born on November 17, 1960 in Ann Arbor, Michigan.

A middle child with an older brother and younger sister, he grew up mostly in Bloomington,

Indiana, but finished high school in Los Alamos, New Mexico, graduating in 1979. He earned

his Bachelor of Music degree from Indiana University in 1983, where he studied cello with Fritz Magg.

In 1992 Tim graduated from the Cleveland Institute of Music with a Master's degree in

Performance and Suzuki Pedagogy. He taught in the Preparatory Department at The Cleveland

Institute of Music from 1992 to 1995 before accepting the position of director of the cello

program at the Suzuki School of Music in Johnson City, Tennessee.

In Tennessee, Tim taught a large cello studio and played in two regional orchestras. He

taught students through Milligan College and East Tennessee State University as well. Along

with giving recitals and playing in the Meadowlark Trio, Tim was a featured soloist with the

Johnson City Symphony.

In 2003, Tim began work on his Ph.D. in Music Education at the University of Florida in

Gainesville. During the next four years he taught university cello students as a graduate assistant

while completing his degree. Tim remained an active performer while studying at the University

of Florida, serving as principal cellist in the Gainesville Chamber Orchestra from 2003 to 2007,

and performing with the Music School's New Music Ensemble. He maintained a private studio

with students of all ages and levels.

Upon the completion of his Ph.D. program, Tim will begin teaching at the University of

Wisconsin-Stevens Point in the Aber Suzuki Center. This position will provide opportunities to

work with beginning and intermediate level cello students, as well as to offer cello pedagogy for

university-level students. He anticipates continuing to do research in the field of string music









education, particularly in the areas of string pedagogy and assessment. Tim has been married to

Sarah Caton Mutschlecner, a nurse practitioner, for 18 years. They have three daughters:

Audrey, age 16; Megan, age 14; and Eleanor, age 10.





PAGE 1

1 CONSTRUCTION, VALIDATION, AND ADMINISTRATION OF A DIAGNOSTIC TEST OF CELLO TECHNIQUE FOR UNDERGRADUATE CELLISTS By TIMOTHY M. MUTSCHLECNER A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORI DA IN PARTIAL FUFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY UNIVERSITY OF FLORIDA 2007

PAGE 2

2 2007 Timothy M. Mutschlecner

PAGE 3

3 The most perfect technique is that which is not noticed at all. Pablo Casals

PAGE 4

4 ACKNOWLEDGMENTS This work is dedicated to my dear wi fe Sarah who had shown unwavering support and encouragement to me in my studies. In every way she made possible the fulfillment of this goal which would have be unimaginable without her. To my children Audr ey, Megan, and Eleanor I owe a debt of gratitude for their patient understa nding. My parents Alice and Paul, through their continued reassurance that I was up to this tas k, have been much appreciated. Dr. Donald and Cecelia Caton, my parent s-in-law, spent many hours editing this manuscript, and I am very grateful for their skill and encouragement. Th e professional editing expert ise of Gail J. Ellyson was invaluable. The transition from music performance to academic scholarship has not always been easy. Dr. Timothy S. Brophys expertise in the fiel d of music assessment and his enthusiasm for the subject was truly the inspiratio n for what grew from a class paper into this dissertation. As Chair of my committee he has provided the nece ssary guidance and direction leading to the completion of this work. I consider myself very fortunate to have worked under Dr. Brophys mentorship. As members of my supervisory committee, Dr. Art Jennings, Dr. Charles Hoffer, and Dr. Joel Houston have generously offered their insigh t in refining this research. I thank them for their service on my committee and for their support. Gratitude is also extended to Dr. Camille Smith and Dr. David Wilson for serving as initial members of my committee. A study of this magnitude would have b een impossible without the commitment from colleagues: Dr. Wesley Baldwin, Dr. Ross Harbough, Dr. Christopher Hutton, Dr. Robert Jesselson, Dr. Kenneth Law, and Dr. Greg Sauer. Their willingness to a llow their students to participate in my research, made this study possible. The suppor t and insight from these master

PAGE 5

5 teachers was invaluable. Dr. E lizabeth Cantrell and Dr. Christ opher Haritatos, as independent judges, spent many hours viewing video-taped recordi ngs of student performances. Their care in this critical aspect of my study was much appreciated. Dr. Tanya Carey, one of the great li ving cello pedagogues, provided valuable suggestions for the design of this test. Thanks go as well to the many Suzuki cello teachers who participated in this research. Their willingness to shar e ideas and provide suggestions on ways to improve my test was heartening. Finally, thanks go to the 30 students who agre ed to participate in this research. The effort they made to prepare and pl ay their best is gratefully acknow ledged. Future cellists are in their debt for being pioneers in the field of assessment in string performance.

PAGE 6

6 TABLE OF CONTENTS ACKNOWLEDGMENTS.4 LIST OF TABLES...... DEFINITION OF TERMS ABSTRACT... CHAPTER 1 INTRODUCTION .... Purpose of Study.... Research Questions Delimitations...... Significance of the Study.......14 2 REVIEW OF LITERATURE....16 Introduction........16 Philosophical Rationales.... Bennet Reimer... David Elliott... Comparing and Contrasting the Phil osophic Viewpoints of Reimer and Elliott......22 Theoretical Discussion...23 Assessment in Music: Theories and Definitions....23 Constructivism and Process/Product Orientation..................26 Definitions..27 Research.....29 The Measurement of Solo Instrumental Performance...29 John Goodrich Watkins..29 Robert Lee Kidd.....31 Janet Mills..32 The Use of Factor Analysis in Performance Measurement... Harold F. Abeles Martin J. Bergee.....36 The Development of a Criteri a-Specific Rating Scale...37 The Measurement of String Performance.. Stephen E. Farnum.....39 Stephen F. Zdzinski and Gail V. Barnes.... Summary: Implications for the Present Study...42

PAGE 7

7 3 METHODOLOGY Setting and Participants..45 Data Collection..45 The Written and Playing Test....46 The Student Self-Assessment Profile. Rationale for the Assessment Methodology..47 Interjudge Reliability.49 Data Analysis. Content Validity.....50 4 RESULTS..51 Data Analysis.....51 Participants. Part One: The Written Test....52 Scoring the Written Test Results from the Written Test53 Regression Analysis of Written Test Items...55 Part Two: The Playing Test... Scoring the Playing Test Results from the Playing Test....56 Comparison of Left Hand Tec hnique and Bowing Stroke Scores.57 Comparison of Playing Test Scores and Teacher-Ranking... Interjudge Reliability of the Playing Test......58 Part Three: The Student Self-Assessment Profile..............................................................58 Repertoire Previously Studied.......................................................................58 How Interested Are You In Each of These Areas of Performance: Solo, Chamber, and Orchestral?............................................................................59 Other Areas of Performance Interest?...................................................................59 What Are Your Personal Goals for Study On the Cello?......................................59 What Areas of Cello Technique Do You Feel You Need the Most Work On?..............................................................................60 Summarize Your Goals in Music and What You Need To Accomplish These Goals.. Summary of Results...61 5 DISCUS SION AND CONCLUSIONS.....75 Overview of the Study...75 Review of the Results....75 Observations from the Results of Administering the Diagnostic Test of Cello Technique. The Written Test....76 The Playing Test................................................78 The Student Self-Assessment Profile.81 Discussion of Research Questions.....84

PAGE 8

8 To What Extent Can a Test of Cello Playing Measure a Students Technique?.............................................................................................84 To What Extent Can a Criter ia-Specific Rating Scale Provide Indications of Specific Strengths an d Weaknesses In a Students Playing?..........85 Can a Written Test Demonstrate a Students Understanding of Fingerboard Geography, and the Ability to Apply Music Theory To the Cello?..........................................................................................................86 Observations on the Playing Test from Participating Teachers.88 Comparative Findings The Farnum String Scale...89 Zdzinski and Barnes...90 Conclusions APPENDIX A PILOT STUDY.. B VALIDITY STUDY..97 C VALIDITY STUDY EV ALUATION FORM. D INFORMED CONSENT LETTER. E THE WRITTEN TEST....103 F THE WRITEN TEST EVALUATION FORM... G THE PLAYING TEST. H THE PLAYING TEST EVALUATION FORM.....144 I REPERTOIRE USED IN THE PLAYING TEST...149 J THE STUDENT SELF-ASSESMENT PROFILE..152 K DESCRIPTIVE STATISTICS FOR RAW DATA.154 LIST OF REFERENCES..... BIOGRAPHICAL SKETCH...

PAGE 9

9 LIST OF TABLES Table Page 4-1 Summary of regression analysis for year in school as a predictor of written, playing, and total test scores ( N = 30)... 4-2 Years of study, frequenc y and test scores.. 4-3 Summary of regression analysis for piano experience as a predictor of written, playing, and total test scores ( N = 30)... 4-4 Item difficulty, discrimination, and point bi serial correlation for the written test. 4-5 Mean scores of playing test items in rank order....68 4-6 Comparison of teacher-ranking to playing test ranking.9 4-7 Comparison of researchers and inde pendent judges scoring of student performances of the playing test....70 4-8 Numbers of students expressing inte rest in solo, chamber, and orchestral performance ( N = 29)....71 4-9 Personal goals for studying the cello.....72 4-10 Student perception of priori ties for technical study...73 4-11 Goals in music and means of accomplishing them....74 K-1 Raw scores of the written te st items, composite means, and standard deviations... K-2 Raw score, percent score, fre quency distribution, z score, and percentile rank of written test scores... K-3 Raw scores of the playing te st items, composite means, and standard deviations...156

PAGE 10

10 DEFINITION OF TERMS Fingerboard geography the knowledge of pitch location and the understanding of the spatial relationships of pitches to each other Horizontal intervals intervals formed across two or more strings Vertical intervals intervals formed by the distance between two pitches on a single string Visualization the ability to conceptualize the fi ngerboard and the names and locations of pitches while performing or away from the instrument Technique 1) the artistic execution of the skills required for performing a specific aspect of string playing, such as vi brato or staccato bowing 2) the ability to transfer knowledge and pe rformance skills prev iously learned to new musical material Target Note a note within a playing position used to find the correct place on the fingerboard when shifting

PAGE 11

11 Abstract of Dissertation Presen ted to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy CONSTRUCTION, VALIDATION, AND ADMINISTRATION OF A DIAGNOSTIC TEST OF CELLO TECHNIQUE FOR UNDERGRADUATE CELLISTS By Timothy M. Mutschlecner August 2007 Chair: Timothy S. Brophy Major: Music Education The purpose of this study was to construct, va lidate, and administer a diagnostic test of cello technique for use with underg raduate cellists. The test consisted of three parts: (1) A written test, which assessed a students understa nding of fingerboard geogr aphy, intervals, pitch location, and note reading, (2) A playing test, wh ich measured a students technique through the use of excerpts from the standard repertoire for cello, and (3) A self-assessment form, through which students could describe th eir experience, areas of interest, and goal s for study. A criteriaspecific rating scale with descriptive statements for each technique was desi gned to be used with the playing test. The written test, playing te st, and self-assessment were pilot-tested with five undergraduate students at a university in the so utheast. A validation study was conducted to determine to what extent teachers felt this test measured a students technique. Nine cello teachers on the college and preparatory leve l were asked to evaluate the test. The test was administered to 30 undergradua te cellists at universities located in the southeastern region of the United States. Str ong interitem consistency was found for the written test ( r KR20 = .95). A high internal consistency of items from the playing test was found ( =

PAGE 12

12 .92). Interjudge reliability of the playing test was high, as measured by comparing the independent evaluations of two judges with th e researchers evalua tions using Pearsons r (Judge A r = .92; Judge B r = .95. Other conclusions drawn fr om the study include: (1) Piano experience has a significant positive effect on the result s of the playing test ( R2 = .15); (2) The playing test is a good predictor of teacher-rankings of their student in terms of technique; (3) Year in school, degree program, or years of play ing experience were not significant indicators of students playing ability as measured by this test. Participating teachers described this test as a valuable tool for evaluating students and charting their course of study. They found it to be an efficient means to identify a students strengths and weaknesse s in cello technique.

PAGE 13

13 CHAPTER 1 INTRODUCTION Diagnosing a students playing is a primar y function of every music teachers daily routine. Boyle and Rodocy (1987) note that applied music teachers fo cus instruction on the basis of their diagnostic evalua tions of a performers strength s and weaknesses. In short, diagnostic evaluation is a critic al and ever present part of any good music program (p. 11). Without denying the role and value of traditional means of gathering information subjectively in the teaching studio, educators agr ee that evaluative decisions are better when they have a strong information base, that is a base including bot h subjective and objective information (Boyle and Rodocy, p. 2). A diagnostic test of cello technique, designed for use at the college-level, could supplement existing methods of evaluation and provide a greater degree of objectivity in assessing a students needs. The successful teacher has much ability to rapidly determine strengths and weaknesses of a new students technique and prescribe exercise s, pieces, or new ways of thinking about the instrument to correct errors in playing. Howe ver, deficiencies of technique or understanding often show up in a students playing while wo rking on an assigned piece from the standard repertoire. When this occurs, teachers must th en backtrack and correct the deficiencies with etudes or exercises, or jettison the work for a simpler piece--a demoralizing experience for the student. Determining the playing level and technical needs of ea ch new student is an immediate need. Within a few weeks of a college students entry into a studio, the focus of lessons often becomes preparation for a degree recital or jury exam. The opportunity to study technique on a broader scale than what is merely required to prepare an upcoming program can quickly diminish. A diagnostic test, admini stered to assess technique, coul d be a valuable tool in this process.

PAGE 14

14 Purpose of the Study The purpose of this study was to design, validate and administer a diagnostic test of cello technique for use with undergra duate college-level students. Research Questions 1. To what extent can a test of cello playing measure a students technique? 2. To what extent can a criteria-speci fic rating scale provide indications of specific strengths and weaknesse s in a students playing? 3. Can a written test demonstrate a students understanding of fingerboard geography, and the ability to appl y music theory to the cello? To answer these questions, a diagnostic test of cello technique was administered to thirty college-level students currently st udying the cello. The test results were analyzed using a rating scale designed for this study (see Chapter 3). In terjudge reliability of the test was measured by comparing independent evaluations of two judges who viewed vide o-recordings of five students taking the test. Delimitations This study was not concer ned with the following. Instruments other than the cello Creating an assessment instrument for ranking students, determining a letter grade, or determining chair placement in ensembles Creating a playing test to be used in auditions Determining the subjects sight-reading ability The measurement of musical aptitude The measurement of a students musicality or expressivity Significance of the Study A review of literature indicates that this is the first attempt to systematically measure the diverse elements of cello techni que. The five items used by Zdzinski/Barnes (2002) in their String Performance Rating Scale : Interpretation/Musical Effect, Articulation/Tone, Intonation

PAGE 15

15 Rhythm/Tempo and Vibrato do not attempt to examine a broad array of technical skills, but rather provide a general assessment of a students performance. This present study appears to be the first to evaluate specific aspects of cello technique. The results of this study can inform the teach ing of strings, particul arly the cello, at the college-level. For example, teachers may find it usef ul to have a diagnostic tool to evaluate the technical level of new students. Results from such a test ma y support or bring into question conclusions commonly made by teachers based primar ily on audition results and/or the students performance in initial lessons. Si milarly, the test could expose area s of deficiencies in technique and provide the teacher with indi cations regarding the etudes exercises or solo materials most appropriate for study. An assessment of the studen ts overall playing level can assist the teacher in choosing repertoire that is ne ither too easy nor too difficult. Often errors in cello playing can be traced to a students lack of clarity about the location and relationship of pitches on the fingerboard. This understanding of so called fingerboard geography is measured in the Written Test, as well as an awareness of intervals, fingering skill, and the ability to read in the three clefs used in cello music. The written test can quickly reveal if a student is deficient in understanding this abi lity. Clarification of these areas can bring instant results that no amount of practice can achieve. The approach and design of this study could be used to create similar diagnostic tests for violin, viola, and bass. Though there are aspects of technique that are unique to each of the instruments in the string family, much of what is explored in this study would be transferable. Future studies could also include a ve rsion designed for high school students.

PAGE 16

16 CHAPTER 2 REVIEW OF LITERATURE Introduction Literature dealing with assessment of mu sical performance tends to fall into two categories: summative assessments focus on the value of a finished project; formative assessments focus on data gathered during the proce ss of reaching a goal or outcome (Colwell, 2006). A studio teachers ongoing process of di agnosis, correction, and reevaluation is an example of formative assessment in music. A student recital or jury exemplifies summative assessment in music performance. The diagnostic test of cello techniqu e designed for this study is a formative assessment, in that it measures a students performance ability as a certain point on a continuum that leads to mastery. This literature review is di vided in three parts. Part One examines the philosophical foundation for this study. Part Two explores assessment theory and provides the theoretical bases for this research. Part Three reviews rese arch in assessment with particular emphasis on performance. Part One: Philosophical Rationales A philosophic rationale is the bedrock upon wh ich any scholarly inquiry is made. Reimer (2003) succinctly describes its importance: The Why questionsthe questions addresse d by philosophyare the starting point for all conceptualizations of edu cation, whether in music, other subjects, or education as a whole. Answers to these questionsquesti ons of valueprovide the purposes of education, purposes dependent on what people in a culture regard to be so important that education must focus on them (p. 242). These questions must be asked not only of a given educational curriculum but also of the means chosen for evaluation of material ta ught. Simply asking ourselves, How do we determine what we know? brings our educationa l materials and pedagogy into greater focus.

PAGE 17

17 Subjective as well as objective information shape our systems of evaluation. As Boyle and Radocy (1987) observe, subjective information te nds to vary from observer to observer and its value in informing decision making is limite d. Objective information, by definition, is relatively unaffected by personal feelings, opinions, or biases. Musical evaluation should not be limited to gathering only objective data but should include subj ective observations as well. Although certain aspects of musical performance can be measured with scientific precision, such as vibrato width or decibel levels, the comp lex multi-faceted nature of music makes the reliability of any measure less than perfect. This observation need not discourage music educators, but rather help them recognize the need for stronger objective cr iteria for evaluation. A music educators personal phi losophy of assessment is not ta ngential to their work, but an essential base from which to define and di rect teaching. Brophy (2000) explains the need for a philosophy of assessment: A personal assessment philosophy is an essentia l element in the development of a general teaching philosophy. Exploring ones reas ons for being a music teacher should inevitably reveal personal r easons and motivations for be lieving that assessment is important, including why it is important. Th e depth of ones commitment to music education as a profession is also a fairly reliable predictor of ones commitment to assessment as an important aspect of the music program (p. 3). Deciding what is important for students to learn and why it is important determines how one will assess what students know. Attitudes toward assessment directly influence the content and quality of teaching. Inevitably, a teachers philo sophy of assessment will be most influenced by how he or she was taught and evaluated as a st udent. This may help explain the range of attitudes noted by Colwell (2006): Evidence from learning psychol ogy reveals that assessment properly conducted makes a major difference in student learning and when incorrectly used, a corresponding negative effect. The current hype, however, has not produced much action in the United States, Canada, or Great Britain. To many music educators, assessment is so much a part of instructionespecially in achieving goals in performancethat they do not believe more

PAGE 18

18 is needed. Other music educators believe that any assessment is inappropriate as either too quantitative or t oo mechanical (p. 210). That some applied music teachers believe that they have no need for methods to assess technique beyond their own listening skill is understa ndable. Most have spent their lives refining evaluative skills: first, of their own playing, and th en that of their students. These teachers may feel it insulting to suggest that a test is better than they are at diagnosi ng a students strengths and weaknesses. However, these same teachers would not think twice about having a diagnostic test of their cars electrical system if it were acting strangely. If a diagnostic test of cello technique could be shown to give a reasonably accurate and rapid assessment of a students playing level and particular needs, skeptical teachers might co me to appreciate the tests pragmatic value. Aristotle in his Politics stated what is implied by ever y music school faculty roster: It is difficult, if not impossible, for those who do not perform to be good judges of the performance of others (p. 331). These philosophic roots may he lp to explain why teachers of applied music are almost always expected to be expert performe rs. Skills of critical listening required of a teacher must be refined and molded in the furnac e of performance; these listening skills are the essential abilities that a musi c teacher cannot do without. Because music performance involves competence in the cognitive, affective, and psychomotor dom ains of learning, authentic assessment must extend beyond single criterion, bi-level test s of the type appropr iate for math or spelling. No single test can measur e all factors that go into a perf ormance; at best a single test may evaluate only a few aspects of a students playing. Two contemporary philosophical views on the role of evalua tion in music are those of Bennett Reimer (1989/2003), and David Elliott ( 1995). Though these scholars share many beliefs about the role and value of universal music e ducation, they represent two poles of thought

PAGE 19

19 regarding the best way to achieve a musically fluent society. Their differences are philosophic and concern a definition of th e very nature of music. Bennett Reimer Reimer makes an important claim when discus sing evaluation in music. After raising the question: By what criteria can t hose who partake of the work of musicians evaluate that work, he asserts, the same criteria applied to their work by musicians all over the word are the criteria that can be applied to evaluating the results of their work (pp. 266-267). For example, if certain technical skills are required for a musically satisfying performance, these same skills can and should be criteria for evaluation. Reimers use of the term craft comes close to what musicians mean when they speak of technique: Craft, the internaliza tion within the body of the ways and means to make the sounds the music calls on to be made, is a foundational crit erion for successful musicianship. This is the case whether the musician is a first grader being a musician, a seasoned virtuoso, or anything in between. It is th e case whatever the music, of whatever style or type, from whatever culture or time (p. 266). What is universal is the craft of music making, in all its variet ies. However, the expression of that craft is very distinct: But crucially what coun ts as craft is particular to the particular music being evaluated (p. 266). Reimers argumen t seems to support the validity of designing assessment measures that are instrument, and even genre, specific. Bennett Reimer notes: everything the musi c educator does in his job is carrying out in practice his beliefs about his subject (Rei mer, 1970, p. 7). It is important that the pedagogical approach a teacher uses reinforces his or her philosophical belief about why we do what we do in music. If we believe, as Reimer does, that we are fundamentally teachers of aesthetics through the medium of music, then every aspect of our work should support and defend this view rather then detract from it.

PAGE 20

20 Instrumental technique is a means to an e nd, not the end itself. Certainly the virtuosic pyrotechniques required for some pieces blurs this distinction, but by and large most teachers would be quick to acknowledge that complete absorption with the mechanics of playing is a recipe for burn-out and loss of the joy of musicmaking. Cello teacher Fritz Magg observed that Calisthenics literally comes from two Greek words: kalos which means beautiful and stenos which means strength (Magg, 1978, p. 62). Accepti ng the principle that the development of strength is a requisite for expres sion of the beautiful serves as a rationale for designing a test to assess technique. Reimer believes that past and present attempts of assessment have two crucial flaws (2003). First, they are not tailored to a speci fic musical activity, maki ng the false assumption that what is tested for is applicable to any a nd all musical involvements. Reimer states, The task for the evaluation communityis to develo p methodologies and mechanisms for identifying and assessing the particular di scriminations and connections re quired for each of the musical roles their culture deems important (p. 232). Just as Gardner (1983) brought to our attention the need to define distinct kinds of intelligence, Re imer cautions that we should be wary of assuming direct transfer of musical inte lligences from role to role. The second weakness of music testing accord ing to Reimer is its almost exclusive concentration on measuring the ability to discriminate, thereby neglecting to examine the necessary connections among isolated aspects of musical intelligence (2003). The question of how meanings are created through connections has been largely ignored, he suggests. This may be partially attributed to heavy dependence on objective measurement in music research. Qualitative studies may be better suited for this pu rpose. Reimer notes that many recent studies in cognitive science may be appl icable to musical evaluation.

PAGE 21

21 David Elliott Elliott (1995) makes a clear distinction betw een evaluation and assessment. He notes, The assessment of student achie vement gathers information that can benefit students directly in the form of constructive feedback. He sees evaluation as being primarily concerned with grading, ranking, and other summary procedur es for purposes of student promotion and curriculum evaluation (p. 264). For Elliott, how ever, achieving the goals of music education depends on assessment. He describes the primar y function of assessment as providing accurate feedback to students regarding the quality of their growing musicianship. Standards and traditions are the criteria by which students ar e measured in determining how well they are meeting musical challenges. Elliot leaves it to the reader to define what these standards and traditions are and more specifically what means are used to determine their attainment. Elliotts concept of assessment is one of supporting and advancing achievement over time, noting the quality and development of a learners musical thinking is something that emerges gradually (p. 264). Elli ott is concerned with the inad equacy of an assessment which focuses on the results on a students individual thinking at a single moment in time. Real assessment of a students development occurs when he or she is observed making music surrounded by musical peers, goals, and standards that serve to guide and support the students thinking (p. 264). Regarding evaluation, Elliott is unequivo cal: there is no jus tification for using standardized tests in music (p. 265). He sees conventi onal methods of evaluation as inappropriate in music because they rely on linguistic thinking. Like Gardner, Elliott insists that an assessment, if it is to be intelligence-fair, mu st be aimed directly at the students artistic thinking-in-action.

PAGE 22

22 To summarize, Elliott sees assessment as a process-oriented appr oach to teaching, using constructive feedback embedded into the daily ac ts of student music making. Music is something that people do; music assessment must th en occur in the context of music making. Comparing and Contrasting the Philosophic Viewpoints of Reimer and Elliott The crux of the difference in music philos ophies of Reimer and Elliott revolves around the role of performance. Elliott sees all asp ects of music revolving around the central act of performing. As stated by Elliott, Fundamentally, music is something that people do (Elliott, p.39, italics in original). Reimer notes that processes (music making) produce products (integral musical works) and that, performance is not su fficient for doing all that music education is required to do, contrary to what Elliott insists (Reimer, p. 51). Reimer sees performance as only one of several ways musical knowledge is acquire d, as opposed to being th e essential mode of musical learning. Elliott defines assessment of student achievement as a means of gathering information that can be used for constructive feed back. He also values it as a means to provide useful data to teachers, parents, and the surrounding educational community (p. 264). However, Elliott is uncomfortable with a ny use of testing that simply focuses on a students thinking at one moment in time. One can imagine him acknowledging the value of a diagnostic performance test, but only if it were part of a continuum of evaluations. Elliotts insistence on the central role of performance prevents him from recognizing the value in a critique of a musicians abilities at a given moment in time. Reimer sees the act of performing composed music and improvisation as one requiring constant evaluation. Because he is willing to acknowledge musical products (form) separately from the act of creating or regenerating, he asks a more incisive question: By what criteria can those who partake of the work of musicians evaluate that work? (p. 265). Considering the myriad styles, t ypes and uses of music, Reimer

PAGE 23

concludes that criteria for judging music must be distinctive to each form of music and therefore incomparable to one another (p. 266). Reimer softens his stance by providing examples of universal criteria, that is, criteria applicable to diverse musical forms. He does insist, however, that they must be applied distinctively in each case:

Assessment of musical intelligence, then, needs to be role-specific. The task for the evaluation community (those whose intelligence centers on issues of evaluation) is to develop methodologies and mechanisms for identifying and assessing the particular discriminations and connections required for each of the musical roles their culture deems important. As evaluation turns from the general to the specific, as I believe it urgently needs to do, we are likely to both significantly increase our understandings about the diversities of musical intelligences and dramatically improve our contribution to helping individuals identify and develop areas of more and less musical capacity (p. 232).

Reimer accepts the view that there is a general aspect of musical intelligence, but suggests that it takes its reality from its varied roles. This allows him to see evaluation in music as a legitimate aspect of musicianship, part of the doing of music that Elliott insists on. His philosophic position supports creating new measures of musical performance, especially as they bring unique musical intelligences to light and aid in making connections across diverse forms of music making.

Part Two: Theoretical Discussion

Assessment in Music: Theories and Definitions

Every era has a movement or event that seems to represent the dynamic exchange between the arts and the society of that time. Creation of the National Standards for Arts Education is one such event. The Goals 2000: Educate America Act defined the arts as part of the core curriculum in the United States in 1994. That same year witnessed the publication of Dance, Music, Theatre, Visual Arts: What Every Young American Should Know and Be Able to Do in the Arts (MENC, 1994). It is significant that among the nine content standards, number seven
was "Evaluating music and music performances." Bennett Reimer, one of the seven music educators on the task force appointed to write the document, discusses the central role of evaluation in music:

Performing composed music and improvising require constant evaluation, both during the act and retrospectively. Listening to what one is doing as one is doing it, and shaping the sounds according to how one judges their effectiveness (and affectiveness), is the primary doing-responding synthesis occurring within the act of creating performed sounds (Reimer, 2003, p. 265).

Central to success is the ability to assess one's work. This assessment includes all of the content standards, including singing, performing on instruments, improvising, and composing. Evaluation is the core skill that is required for self-reflection in music. When a student is capable of self-evaluation, teachers have to some extent completed their most important task.

Reimer sees the National Standards as the embodiment of an aesthetic ideal, not merely a tool to give the arts more legislative clout:

The aesthetic educational agenda was given tangible and specific formulation in the national content standards, and I suspect that the influence of the standards will continue for a long time, especially since their potential for broadening and deepening the content of instruction in music education has barely begun to be realized (p. 14).

Reimer and the other members of the task force were given an opportunity to integrate a philosophy into the national standards that values music education. With this statement they articulated a philosophy defending the scholastic validity of the arts:

The Standards say that the arts have academic standing. They say there is such a thing as achievement, that knowledge and skills matter, and that mere willing participation is not the same thing as education. They affirm that discipline and rigor are the road to achievement, if not always on a numerical scale, then by informed critical judgment (MENC, 1994, p. 15).

Such statements are necessary in a culture that perniciously sees the arts as extracurricular activities rather than as part of the core educational experience of every child.

Reimer has provided a philosophical foundation for assessment in the arts. Others, like Lehman (2000), observe that "Our attention to this topic is very uneven. It is probably fair to
say that in most instances evaluation is treated in an incidental manner and is not emphasized in a systematic and rigorous way" (Lehman, pp. 5-6). As the standards movement grows, fueled by greater interest in achievement testing in the arts, it is likely that this attitude will change. Lehman describes how he sees the emerging role of music assessment:

I believe that the standards movement has set the stage for an assessment movement, and I believe that assessment may become the defining issue in music education for the next decade. Developing standards and defining clear objectives that flow naturally from standards make assessment possible where it was often not possible before. But standards do more than make assessment possible. They make it necessary. Standards have brought assessment to the center of the stage and have made it a high-priority, high-visibility issue. Standards and assessment inescapably go hand in hand. We cannot have standards without assessment (p. 8).

Furthermore, we cannot have assessment without tests that are designed to measure all kinds of music making, whether in bands, orchestras, choirs, or jazz ensembles. Included in this list should be assessment of individual performance. New ways of more objectively determining achievement in individual performance are greatly needed.

The need for assessment measures capable of assessing the multiple intelligences present in the arts has been articulated:

Although some aspects of learning in the arts can be measured adequately by paper-and-pencil techniques or demonstrations, many skills and abilities can be properly assessed only by using subtle, complex, and nuanced methods and criteria that require a sophisticated understanding. Assessment measures should incorporate these subtleties, while at the same time making use of a broad range of performance tasks (Reimer, p. 15).

When Reimer observes that assessment in the arts is a complex task with subtle shades of meaning, he is alluding to the ill-structured quality of many of the subject content domains in music. Spiro, Vispoel, Schmitz, Samarapungavan, and Boerger (1987) define ill-structured domains as content areas where there are "no rules or principles of sufficient generality to cover most of the cases, nor defining characteristics for determining the actions appropriate for a given case" (p. 184, as quoted in Brophy, p. 7). Criteria for judgment in performance, therefore, must be
tailored to the idiosyncrasies of the particular instrument, its role as a solo or ensemble member, the age and/or playing level of the student, and the purpose of assessment.

Constructivism and Process/Product Orientation

Brophy defines the constructivist view of knowledge as those situations in which students draw upon previous experience to understand new situations (2000, p. 10). This occurs when teachers assess something specific like cello technique. Students are asked to transfer knowledge and psycho-motor skills from one context (previous playing experience) to another (performing new or unfamiliar excerpts). Constructivist theory coincides with one of the definitions of technique used in this research: the ability to transfer knowledge and performance skills previously learned to new musical material.

Process-orientation tends to be aligned with a constructivist approach. Inquiry into new areas of knowledge and understanding does not necessarily have a predetermined outcome. Learning occurs during the process of exploration. Methods of evaluation in music and elsewhere, however, have tended to be product-oriented. The need to objectively quantify what has been learned is an ongoing problem in the arts.

The desire to evaluate student achievement in relation to the attainment of pre-specified objectives led to the creation of criterion-referenced or objective-referenced tests. These tests evaluate achievement in relation to specific criteria rather than through comparing one student to another (Boyle and Radocy, pp. 9-10). Such tests, however, have been criticized for measuring verbal intelligence rather than authentic music making (Elliott, pp. 75-76). It is nevertheless possible for tests to be designed that measure components of both the process (technique) and the product (complete musical statement) of making music. Diagnostic tests that evaluate students as they progress through increasing challenges may give the teacher insight regarding the
student's cognitive and psychomotor abilities. Thus, a diagnostic test in music can be designed to evaluate both process and product.

Definitions

To understand the theoretical rationale behind the evaluation of music ability, terminology must be clear. The term test refers to any systematic procedure for observing a person's behavior relevant to a specific task or series of tasks. Measurement is a system designed to quantify the extent to which a person achieves the task being tested. In music, testing usually involves some form of a scoring system or rating scale. Evaluation means making judgments or decisions regarding the level of quality of a music behavior or of some other endeavor (Boyle, 1992).

The ideal evaluation model has a strong objective data component but encompasses subjective but enlightened judgments from experienced music teachers (Boyle, p. 247). Boyle and Radocy claim that evaluative decisions are best made when decision makers "(a) have a strong relevant information base, including both subjective and objective information, (b) consider affective and, where appropriate, aesthetic reactions of (or to) the individual, group, or endeavor being evaluated, and (c) be made with the primary goal of improving the quality of the learners' educational experiences" (1987, p. 8). True evaluation must provide information that enhances the educational experience and does not simply provide data for the purpose of assigning grades, determining who is allowed to play, or deciding what the student's chair placement will be.

A diagnostic test is one which focuses on the present and is used to classify students according to their strengths and weaknesses relative to given skills or knowledge (Boyle and Radocy, p. 10). Such a test can be used to (a) group students for instruction or (b) provide individualized instruction that corrects errors or challenges the learner. The diagnostic test of
cello technique created for this study is designed to serve the latter purpose. It falls into the category of a narrow content focus test, which is defined as intensive in nature (Katz, 1973). This type of test is appropriate for judging an individual's strengths and weaknesses. It allows for intra-individual comparisons, such as ability levels of differing skills. Intensive tests provide the basis for remedial instruction, as well as providing indications of the means of improving areas of weakness.

The purpose of a test largely determines what type of test needs to be chosen or constructed for assessment purposes. If a test's primary purpose is to discriminate among individuals, then the test is norm-referenced (Boyle and Radocy, p. 75). An individual performance is judged in comparison to the performances of his or her peers. This type of test is appropriate for making comparisons among individuals, groups, or institutions.

Criterion-referenced tests describe student achievement in terms of what a student can do, and achievement may be evaluated against a criterion or absolute standard of performance (Boyle, p. 253). Such a test is ideally suited to individual performance; the challenge for this test is how to establish the criteria to be used as a standard. If a performance evaluation uses excerpts that accurately reveal a student's ability in demonstrating specific tasks, then that test has good content validity; the test materials coincide with the skills being tested.

The focus of performance assessment may be global, i.e., a judgment of its totality, or specific, i.e., a judgment of only particular aspects of performance. A diagnostic test would be expected to use criteria that reveal specific aspects of performance, although the evaluation could still include global statements about overall playing ability. The use of global and specific approaches is explored in the review of literature at the end of this chapter.
Part Three: Research

The field of testing in string instrument performance is remarkably uncultivated. However, there is a growing body of literature dealing with performance assessment in general, and this writing has many implications for the problem addressed in this study. Examination of this literature will begin with a survey of research in solo instrumental performance, noting the specific aspects of performance measured and the approaches used. An exploration of the use of factor analysis as a means of achieving high reliability and criterion-related validity will follow. This section will close with a review of the research in measurement of string performance.

The Measurement of Solo Instrumental Music Performance

John Goodrich Watkins

The earliest known research in the area of solo instrumental performance was carried out by Watkins (1942) for his doctoral dissertation at Teachers College, Columbia University. Watkins constructed an objectively scored cornet rating scale. For this he composed 68 melodic exercises based on selected cornet methods. Four equivalent forms of the test were designed, each containing sixteen melodies of increasing difficulty. The measure was established as the scoring unit and was considered to be played incorrectly if any errors of pitch, time, change of tempo, expression, slur, rests, holds and pauses, or repeats occurred. After administering the four preliminary test forms to 105 students, he used item analysis to construct two final forms of the test. Equivalent-forms and test-retest reliability coefficients were high (above .90).

Following this research, Watkins developed the Watkins-Farnum Performance Scale (WFPS) (1954) for wind instruments and snare drum. This scale, along with the subsequently constructed Farnum String Scale (Farnum, 1969), constitutes the only readily available performance measure. As with the Watkins cornet study, this test, administered individually,
requires the performance of a series of passages of increasing difficulty. The student plays with the aid of a metronome, continuing through the exercises until he or she scores zero on two consecutive exercises. Again, the scoring unit is the measure, and the examiner is given a detailed explanation of what constitutes an error. Two equivalent forms were constructed and 153 instrumentalists were tested. Correlations between Form A and Form B of the test have ranged from .84 to .94. Criterion-related validity based on rank-order correlations ranged between .68 for drum and .94 for cornet and trumpet.

Concerns have been raised about how well-suited the examples are for particular instruments (Boyle and Radocy, 1987). Some dynamic markings appear artificial, and no helpful fingerings are provided for technical passages. There is no attempt to measure tone quality, intonation, or musical interpretation. The latter is an inherently subjective judgment but nevertheless a critical part of an assessment of musical performance. As a result, the test's content validity has been questioned (Zdzinski and Barnes, 2002).

The WFPS contains highly specific directions for scoring aspects of playing that teachers can all agree upon. As a result, it continues to be used by default, as no other measure provides a similar level of objectivity. A number of investigators have used the WFPS as a primary measurement tool for their research. Boyle (1970), in an experimental study with junior high wind players, demonstrated that students who practiced reading rhythms by clapping and tapping the beat showed significantly greater improvement as measured by the WFPS. More recently, Gromko (2004) investigated relationships among music sight reading as measured by the WFPS and tonal and rhythmic audiation (AMMA, Gordon, 1989), visual field articulation (Schematizing Test, Holzman, 1954), spatial orientation and visualization (Kit of Factor-Referenced Cognitive Tests, Ekstrom et al., 1976), and academic achievement in math concepts
and reading comprehension (Iowa Tests of Educational Development, Hoover, Dunbar, Frisbie, Oberley, Bray, Naylor, Lewis, Ordman, and Qualls, 2003). Using a regression analysis, Gromko determined the smallest combinations of variables in music sight-reading ability, as measured by the WFPS. The results were consistent with earlier research, suggesting that music reading draws on a variety of cognitive skills, including visual perception of patterns rather than individual notes.

The WFPS has its greatest validity as a test of sight reading. Sight reading is a composite of a variety of skills, some highly specialized. Using only this test to rank students on musicianship, technique, or aptitude would therefore be inappropriate. The test design also reveals a certain degree of artificiality; the use of the measure as a scoring unit and the choice to ignore pauses between measures are somewhat contrived. Nevertheless, Watkins and Farnum succeeded in developing the most reliable and objective performance testing instrument of their day.

Robert Lee Kidd

Kidd (1975) conducted research for his dissertation concerning the construction and validation of a scale of trombone performance skills at the elementary and junior high school levels. His study exemplifies a trend toward more instrument-specific research. Kidd focused on the following questions:

What performance skills are necessary to perform selected and graded solo trombone literature of Grades I and II?

What excerpts of this body of literature provide good examples of these trombone performance skills?

To what extent is the scale a valid instrument for measuring the performance skills of solo trombonists at the elementary and junior high school level?

To what extent is the scale a reliable instrument?
Solos from the selective music lists of the National Interscholastic Music Activities Commission of the MENC were content analyzed, and 50 performance skills were identified, coinciding with range, slide technique, and articulation. Each skill was measured by four excerpts and administered to 30 junior high school trombonists. These performances were taped and evaluated by three judges. Results from this preliminary form of the measurement were analyzed, providing two excerpts per skill area. Equivalent forms of the measure were created, each using one of the two excerpts selected. This final version was administered to 50 high school students. Interjudge reliability coefficients were .92 for Form A and .91 for Form B. Equivalent-forms reliability was found to be .98. Validity coefficients ranged from .77 to 1.0 for both forms. Zdzinski (1991, p. 49) notes that the use of a paired-comparison approach rather than teacher rankings may have affected the validity coefficients.

Kidd concluded that the Scale of Trombone Performance Skills would be useful to instrumental music educators in their appraisal of the following areas of student progress: guidance, motivation, improvement of instruction and program, student selection, maintenance of standards, and research. Kidd recognized that the time requirement (thirty-six minutes for administration, twenty-one minutes for judging, and nine minutes for scoring) could make this version of the scale impractical in a public school situation, and he acknowledged that some modifications in the administration and scoring procedures could facilitate the extent of the scale's use (pp. 93-94).

Janet Mills

Mills (1987) conducted an investigation to determine to what extent it was possible to explain current assessment methods for solo music performances. In a pilot study, she chose six instrumental music students, aged 15 years or above, who were capable of performing grade-
eight music from a British graded music list. Videotapes were made of their performances, and these were scored by 11 judges. Judges were asked to write a comment about each performance and give it a mark out of 30 based on the scale of the Associated Board of the Royal Schools of Music. Two adjudicating groups were formed, consisting of (1) music teachers and music specialist students, and (2) nonspecialists with experience of musical performance. After the judging occurred, judges were interviewed about the evaluative criteria. From these interviews, the following 12 statements or constructs were generated:

The performer was Nervous/Confident
The performer Did not enjoy/Did enjoy playing
The performer Hardly knew/Was familiar with the piece
The performer Did not make sense/Made sense of the piece as a whole
The performer's use of dynamics was Inappropriate/Appropriate
The performer's use of tempi was Inappropriate/Appropriate
The performer's use of phrasing was Inappropriate/Appropriate
The performer's technical problems were Distracting/Hardly noticeable
The performance was Hesitant/Fluent
The performance was Insensitive/Sensitive
The performance was Muddy/Clean
I found this performance Dull/Interesting

In the main part of her study, phase two, Mills taped ten performances, again dividing her 29 judges into the groupings previously mentioned. Judging was done using both the original 30-point overall rating (with comments) and the newly created criteria. Inter-item correlations and correlations among marks on the 30-point scale were all positive. Correlations between overall marks and individual items were all negative. Because of the small sample size, no data on significance could be provided. Nevertheless, this study demonstrates a well-designed method for examining the criterion-related validity of newly created evaluative statements against an existing performance measurement.
The Use of Factor Analysis in Performance Measurement

The tests discussed so far, and others like them, have a fundamental problem with reliability; the measures employed were typically subjective judgments based on uneven and unspecified observations. It became increasingly clear to researchers that greater attention needed to be focused on systematically objectifying the methods used in musical evaluation. The use of rating scales to replace or substantiate judges' general impressions is an approach that has been explored by several researchers. Factor analysis of descriptive statements generated for assessment became an important technique for improving content validity and interjudge reliability.

Factor analysis comprises a number of techniques that can be used to study the underlying relationships between large numbers of variables. Common factor analysis reveals the factors that are based on the common or shared variance of the variables (Asmus and Radocy, 2006). All methods of factor analysis seek to define a smaller set of derived variables from a larger collection of data. When applied to performance evaluation, factor analysis can help to determine systematically common evaluative criteria. Potential benefits include increased content validity and greater interjudge reliability. The groundbreaking work of Abeles in the use of factor analysis to develop a highly reliable and valid performance scale for clarinet led other researchers to use factor analysis in designing their scales. The following studies are examples of the application of factor analysis to performance measurement.

Harold F. Abeles

Abeles' (1973) research in the development and validation of a clarinet performance adjudication scale grew from a desire to replace a judge's general impressions with more systematic procedures. He turned to rating scales because they would allow adjudicators to base
their decisions on a common set of evaluative dimensions rather than their own subjective criticisms. In the first phase of the study, 94 statements were generated through content analyses of essays describing clarinet performance. These statements were also formulated from a list of adjectives, gathered from several studies, that described music performance. Statements were paired with seven a priori categories: tone, intonation, interpretation, technique, rhythm, tempo, and general effect. The statements were then transformed into items phrased both positively and negatively, items that could be used by instrumental music teachers to rate actual clarinet performances. Examples from this item pool are: (1) The attacks and releases were clean. (2) The clarinetist played with a natural tone. (3) The clarinetist played flat in the low register. The items were randomly ordered and paired with a five-point Likert scale ranging from highly agree to highly disagree.

Factor analysis was performed on the evaluations of 100 clarinet performances using this scale. Six factors were identified (interpretation, intonation, rhythm continuity, tempo, articulation, and tone), with five descriptive statements to be judged for each factor. The final form of the Clarinet Performance Rating Scale (CPRS) was comprised of items chosen on the basis of having high factor loadings on the factor they were selected to measure and low factor loadings on other factors. The thirty statements chosen were grouped by factor and paired with a five-point Likert scale. Ten taped performances were randomly selected and rated using the CPRS by graduate instrumental music education students. For the purpose of determining interjudge reliability, judges were divided into groups of 9, 11, and 12 judges. Item ratings from these judges were again factor analyzed to determine structure stability.
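The mechanics of this kind of scale construction can be illustrated with a short computational sketch. The example below is not Abeles' procedure or data; it is a minimal illustration, assuming a hypothetical matrix of Likert ratings (performances by items), of how a rotated factor solution might be obtained and how items with a high loading on one factor and low loadings elsewhere could be flagged for retention.

    # Minimal sketch (hypothetical data, not Abeles'): factor-analyzing Likert
    # ratings and flagging items with clean loadings for possible retention.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis  # rotation="varimax" assumes scikit-learn >= 0.24

    rng = np.random.default_rng(0)
    ratings = rng.integers(1, 6, size=(100, 30)).astype(float)  # 100 performances x 30 items, 1-5 scale

    fa = FactorAnalysis(n_components=6, rotation="varimax").fit(ratings)
    loadings = fa.components_.T  # rows = items, columns = factors

    for item, row in enumerate(loadings):
        primary = int(np.argmax(np.abs(row)))
        others = np.delete(np.abs(row), primary)
        if abs(row[primary]) >= 0.60 and others.max() < 0.30:
            print(f"item {item}: candidate for factor {primary} (loading {row[primary]:.2f})")

With simulated ratings nothing meaningful will be retained; the point is only the shape of the procedure, in which real judges' ratings would replace the random matrix.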


Abeles found that the six-factor structure produced by the factor analysis was essentially the same as the a priori theoretical structure. This suggested good construct validity. He concluded that this structure would be appropriate for classifying music performance in general, as none of the factors seemed to reflect idiosyncratic clarinet characteristics. On the other hand, Zdzinski (2002) found that the factors identified to assess stringed instrument, wind instrument, and vocal performance are distinct and related to the unique technical challenges posed by each performance area.

The interjudge reliability estimates for the CPRS were consistently high (.90). Individual factor reliabilities ranged from .58 to .98, with all factors but tone and intonation above .70. Criterion-related validity, based on correlations between CPRS total scores and judges' ratings, was .993 for group one, .985 for group two, and .978 for group three. Predictive validity (<.80) was demonstrated between the CPRS and global performance ratings.

Martin J. Bergee

The development of a rating scale for tuba and euphonium (ETPRS) was the focus of a doctoral dissertation by Bergee (1987). Using methods similar to Abeles', Bergee paired descriptive statements drawn from the literature, adjudication sheets, and essays with a Likert scale to evaluate tuba and euphonium performances. Judges' initial responses led to the identification of five factors. A 30-item scale was then constructed based on high factor loadings. Three sets of ten performances were evaluated by three panels of judges (N = 10) using the rating scale. These results were again factor analyzed, resulting in a four-factor structure measuring interpretation/musical effect, tone quality/intonation, technique, and rhythm/tempo.

Interestingly, factor analysis produced slightly different results than in the Abeles Clarinet Performance Adjudication Scale. Technique was unique to this measure, while articulation was
unique to the Abeles measure. The Abeles measure also isolated tone quality and intonation as independent items. The idiomatic qualities of specific instruments or families of instruments may result in the use of unique factors in performance measurement.

Interjudge reliability for the ETPRS was found to be between .94 and .98, and individual factor reliabilities ranged from .89 to .99. Criterion-related validity was determined by correlating ETPRS scores with global ratings based on magnitude estimation (.50 to .99). ETPRS scores were also correlated with a MENC-constructed wind instrument adjudication ballot, resulting in validity estimates of .82 to .99.

The Development of a Criteria-Specific Rating Scale

T. Clark Saunders & John M. Holahan

Saunders and Holahan (1997) investigated the suitability of criteria-specific rating scales in the selection of high school students for participation in an honors ensemble. Criteria-specific rating scales differ from traditionally used measurement tools in that they include written descriptors of specific levels of performance capability. Judges are asked to indicate which of several written criteria most closely describes the perceived level of performance ability. They are not required to express their like or dislike of a performance or to decide whether the performance meets an indeterminate standard.

In this study, criteria-specific rating scales were used by 36 judges in evaluating all 926 students seeking selection to the Connecticut All-State Band. These students were in grades 9-12 and enrolled in public and private high schools throughout the state of Connecticut. Only students who performed on woodwind and brass instruments were examined in this study, because the judges were able to use the same evaluation form. The 36 adult judges recruited for this study were comprised of elementary, secondary, and college-level
music teachers from Connecticut. All had a minimum of a bachelor's degree in music education and teacher certification.

Three aspects of student performance were examined: solo evaluation, scales, and sight reading. The following specific dimensions of instrumental performance were assessed:

Solo Evaluation: Tone, Intonation, Technique/Articulation, Melodic Accuracy, Rhythmic Accuracy, Tempo, and Interpretation
Scales: Technique, Note Accuracy, and Musicianship
Sight-Reading: Tone, Note Accuracy, Rhythmic Accuracy, Technique/Articulation, and Interpretation

For each performance dimension, a five-point criteria-specific rating scale was constructed using either continuous (sequentially more demanding performance criteria) or additive (nonsequential performance criteria) descriptors. Each of the criteria was chosen to describe a specific level of music skill, content, and technical achievement. The Woodwind/Brass Solo evaluation was comprised of 11 continuous rating scales and four additive rating scales. The overall level of performance achievement for each student was derived from the sum of the scores for each of the performance dimensions.

The observed means and standard deviations indicated that judges found substantial variation in the performances in each dimension and for each instrument. Despite the relative homogeneity of the student sample, judges demonstrated a high level of variability. Students were provided specific information about levels of performance strengths and weaknesses. The median alpha reliability among the 16 instruments was .915, suggesting that there was a sufficient level of internal consistency among judges. The correlations between each performance dimension and the total score ranged from .54 to .75, with a median correlation of .73. These correlations suggest that each scale dimension contributed substantial reliable variance to the total score.
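Internal-consistency figures of this kind are straightforward to compute once the dimension scores are tabled. The following sketch illustrates the two statistics reported above, coefficient alpha and dimension-total correlations, using a small fabricated table of scores; it is not Saunders and Holahan's data.

    # Illustration only: Cronbach's alpha and dimension-total correlations for a
    # fabricated table of dimension scores (rows = students, columns = dimensions).
    import numpy as np

    scores = np.array([
        [8, 7, 9, 6, 8],
        [5, 6, 6, 4, 5],
        [9, 9, 10, 8, 9],
        [6, 5, 7, 6, 6],
        [7, 8, 8, 7, 7],
    ], dtype=float)

    k = scores.shape[1]
    total = scores.sum(axis=1)
    alpha = (k / (k - 1)) * (1 - scores.var(axis=0, ddof=1).sum() / total.var(ddof=1))

    # Dimension-total correlations, computed here against the uncorrected total,
    # as in the figures reported above.
    dim_total_r = [np.corrcoef(scores[:, j], total)[0, 1] for j in range(k)]

    print(f"alpha = {alpha:.3f}")
    print("dimension-total r:", [round(r, 2) for r in dim_total_r])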


Saunders and Holahan concluded that the pattern of correlations provided indirect evidence of the validity of the criteria-specific rating scales for diagnosing the strengths and weaknesses of individual performances. The researchers noted that because three kinds of performances (prepared piece, scales, and sight-reading) were measured, factor analysis would provide insight into the interdependence of performance dimensions across these types of playing. Factor analysis would indicate the constructs that guide adjudicators in the evaluation process as well.

Saunders and Holahan's findings have implications for the present study. Their data provide indirect evidence that criteria-specific rating scales have useful diagnostic validity. Through such scales, students are given a diagnostic description of detailed aspects of their performance capability, something that Likert-type rating scales and traditional rating forms cannot provide. Such scales help adjudicators listen for specific aspects of a performance rather than having them make a value judgment about the overall merits of a performance.

The Measurement of String Performance

Stephen E. Farnum

Because of the success obtained and reported with the Watkins-Farnum Performance Scale, and its practical value as a sight-reading test for use in determining seating placement and periodic measurement, it was suggested that a similar scale be developed for string instruments (Warren, 1980). As a result, the Farnum String Scale: A Performance Scale for All String Instruments (1969) was published. Both tests require the student to play a series of musical examples that increase in difficulty. No reliability or validity information is provided in the Farnum String Scale (FSS). The test manual describes four preliminary studies used to arrive at a sufficient range of item difficulty. Initially Farnum simply attempted to transpose the oboe test from the WFPS, but he found that there was an inadequate spread of difficulty. New exercises
were written, resulting in a final form of 14 exercises that are designed to increase evenly in difficulty.

Like the WFPS, the Farnum String Scale uses scoring based on measure-by-measure performance errors. The performance errors that can be taken into account are as follows:

Pitch Errors (a tone added or omitted or played on a wrong pitch)
Time Errors (any note not given its correct time value)
Change of Time Errors (a marked increase or decrease in tempo)
Expression Errors (failure to observe any expression marks)
Bowing Errors
Rests (ignoring a rest or failure to give a rest its correct value)
Holds and Pauses (pauses between notes within the measure are counted as errors)
Repeats (failure to observe repeat signs)

The Farnum String Scale manual does not indicate how to use test results, except for the title page, which states: "A Standard Achievement Test for Year to Year Progress Records, Tryouts, Seating Placement, and Sight Reading" (1969). Grading charts are included as part of the individual sheets.

Despite the extensive revision process, criticism has been leveled at this test by some, suggesting that the bowings were not well thought out (Warren, 1980). In examining the exercises, the following problems are found: (1) bowings that require excessive retakes, (2) bowings that are awkward, i.e., non-idiomatic, and (3) bowings that are ambiguous, or not clearly marked. Clarity in bowing is a concern because bowing errors often lead to other errors, especially in rhythm. In several of the exercises, arbitrary bowing decisions have to be made when sight-reading. Since bowing is one of the tested items, students should not be required to devise bowing solutions that are not clearly marked. Bowing ambiguity represents a flaw in the test's validity.

Boyle and Radocy observe that, despite the criticisms that may be leveled against the WFPS and the FSS, "the tests do attain a certain amount of objectivity by providing highly
specific directions for scoring performance aspects about which most experienced teachers could agree regarding correctness" (p. 176). These tests established a precedent for providing explicit detail as to what constitutes an error in performance.

Stephen F. Zdzinski & Gail V. Barnes

Zdzinski and Barnes demonstrated that it was possible to achieve high reliability and criterion-related validity in assessing string instrument performances. In their 2002 study, they initially generated 90 suitable statements gathered from essays, statements, and previously constructed rating scales. These statements were sorted into a priori categories that were determined by previous research. As with the Abeles study, a Likert scale was paired with these items. Fifty judges were used to assess one hundred recorded string performances at the middle school through high school level. Results from the initial item pool were factor-analyzed using a varimax rotation. Five factors to assess string performance were identified: interpretation/musical effect, articulation/tone, intonation, rhythm/tempo, and vibrato. These were found to be somewhat different from those in Abeles' (1973) and Bergee's (1987) scale-construction studies of woodwind and brass performance. This is not surprising, considering the unique challenges of string instrument and woodwind instrument technique. String instrument vibrato had items that were idiomatic for the instrument. Likewise, articulation and tone quality are largely controlled by the right (bowing) side in string performance and were loaded onto a single factor, in contrast with wind instrument assessment scales. The authors found that factors identified to assess string instrument, wind instrument, and vocal performance are distinct and related to unique technical challenges specific to the instrument or voice (Zdzinski, p. 253).
Twenty-eight items were selected for subscales of the String Performance Rating Scale (SPRS) based on factor loadings. The reliability of the overall SPRS was consistently very high. Reliability varied from .873 to .936 for each judging panel using Hoyt's analysis of variance procedure. In two studies conducted to establish criterion-related validity, zero-order correlations ranged from .605 to .766 between the SPRS and two other rating scales. The researchers concluded that string performance measurement may be improved through the use of more specific criteria, similar to those used in their study (Zdzinski, p. 254). Such tools may aid the educator/researcher by providing highly specific factors to listen and watch for when analyzing student performances.

Summary: Implications for the Present Study

Studies carried out in the measurement of instrumental music performance have increased in reliability, validity, and specificity since the first standardized test for band instruments, the Watkins-Farnum Performance Scale of 1954. Surprisingly, along with the Farnum String Scale, this is still the only readily available published performance measure. One can conjecture that the use of teacher-made tests accounts for this, but the more plausible explanation is music teachers' distrust of any test that would claim to be capable of measuring a subject as complex and multifaceted as music performance.

The use of descriptive statements that were found through factor analysis to have commonly accepted meanings has been a significant development in increasing content validity in performance measurement. As researchers applied the techniques pioneered by Abeles (1973), they discovered that factors identified for one instrument or group of instruments did not necessarily transfer directly to another instrumental medium. Statements about tonal production
on a clarinet may not have the same high factor loadings on a string instrument, where tone production is controlled primarily by bowing technique (Zdzinski, 2002).

Through factor analysis the reliability of the new measures improved. However, with additional research came more questions. In the Abeles (1973) and Zdzinski (2002) studies, only the audio portions of performances were analyzed by judges. The reason these researchers chose not to include visual input is not addressed in their studies, but the fact that they recorded results using audio only may have contributed to the higher reliability found in these studies. Gillespie (1997) compared ratings of violin and viola vibrato performance in audio-only and audiovisual presentations. Thirty-three inexperienced players and 28 experienced players were videotaped while performing vibrato. A panel of experts rated the videotaped performances and then, six months later, rated the audio-only portion of the performances on five vibrato factors: width, speed, evenness, pitch stability, and overall sound. While the experienced players' vibrato was rated higher regardless of the mode of presentation, results revealed significantly higher audiovisual ratings for pitch stability, evenness, and overall sound for inexperienced players, and for pitch stability for experienced players. The implication is that visual impressions may cause adjudicators to be less critical of the actual sound produced. Gillespie notes, "The visual stimuli give viewers additional information about a performance that can either be helpful or distracting, causing them to rate the performance differently than if they had simply heard it." He adds, "If the members of the panel see an appropriate motion for producing vibrato, they may rate the vibrato higher, regardless if the pitch drifts slightly" (Gillespie, p. 218). At the very least, the study points out the need for the strictest possible consistency in the content format given to the judges to assess. If assessment is made from an
audiovisual source or a viewed live performance, the possible effects of visual influence on the ratings need to be considered.

Concerns about content validity were uppermost in mind when choosing the excerpts for the Diagnostic Test of Cello Technique. In the following chapter, the development and validation of these materials is discussed, as well as the measurement used to quantify the data from the written and playing portions of the test.
CHAPTER 3
METHODOLOGY

The purpose of this study was to construct, validate, and administer a diagnostic test of cello technique for use with college-level students. The test is criterion-referenced and includes both quantitative and qualitative measurements. This study was implemented in the following stages: (a) development of an initial testing instrument, (b) administration of a pilot test, (c) administration of a validity study, (d) administration of the final test, and (e) data analysis procedures for the final test, including an interjudge reliability measurement. This chapter describes the following methodological elements of the study: setting and participants, instrumentation, data collection, data analysis, and validity and reliability procedures.

Setting and Participants

Approval for conducting this study was obtained first from the Institutional Review Board (IRB) of the University of Florida. A copy of the informed consent letter is included in Appendix D. The testing occurred at the respective schools of the participants, using studio or classroom space during times reserved for this study.

College-level students (n = 30) were recruited for this study from three private and three public universities in the southeastern region of the United States. While this demographic does not include all regions of the United States, the variability is considered adequate for this test, which was not concerned with regional variations among cello students, if such variations exist. The participants selected were undergraduate cello students, both majoring and minoring in music. This subject pool consisted of music performance majors (n = 16), music minors (n = 1), double majors (n = 3), music therapy majors (n = 2), music education majors (n = 6), and music/pre-med students (n = 2). Using subjects from a diversity of academic backgrounds assumes that
this test has value as a diagnostic tool for students studying music through a wide variety of degree programs, not just those majoring in performance.

A letter of introduction that explained the purpose of the study was mailed to the cello faculty of the six schools. Upon receiving approval from the faculty cello teacher, the letter of consent along with the Playing Test (Appendix G) was provided for each participant. One copy of the consent form was signed and returned by each participating student. Following this, times were arranged for each student to take the Written and Playing Test. Each student received a copy of the Playing Test a minimum of two weeks before the test date. Included with the Playing Test was a cover letter instructing the students to prepare all excerpts to the best of their ability. Attention was directed toward the metronome markings provided for each of the excerpts. Students were instructed to perform these excerpts at the tempos indicated, but not at the expense of pitch and rhythmic accuracy.

Data Collection

The Written and Playing Test

Each participant met individually with the primary investigator for forty-five minutes. The first thirty minutes of testing time was used for the Playing Test. Before beginning to perform the Playing Test, students were asked to check their tuning with the pitch A-440 provided for them. Students were also asked to take a moment to visually review each excerpt prior to performing it. Students were asked to attempt to play all the excerpts, even if some seemed too difficult for them.

The primary investigator listened to and judged the individual student's skill level for each performance. For each aspect of technique assessed, a five-point criteria-specific rating scale was constructed. The Playing Test evaluation form (Appendix H) used both continuous
(sequentially more demanding performance criteria) and additive (nonsequential performance criteria) rating scales. When a technique was measured using a continuous rating scale, the number next to the written criterion that corresponded to the perceived level of skill was circled. When using the additive rating scale, the primary investigator marked the box beside each written criterion that described one aspect of the performance demonstrating mastery of the skill. Both the continuous and the additive rating scales have a score range of 2-10 points, as two points were awarded for each level of achievement or each performance competency. It was theoretically possible for a student to score 0 on an item using an additive scale if the performance matched none of the descriptors. Seven continuous rating scales and ten additive rating scales constituted the Playing Test evaluation form. The overall level of performance achievement for each student was calculated as the sum of the scores for each area of technique.

The Student Self-Assessment Profile

The last fifteen minutes was devoted to the completion of the Written Test (Appendix E) and the Student Self-Assessment Profile (Appendix J). To maintain the highest control in administering the test, the primary investigator remained in the room while the Written Test was taken, verifying that neither a piano nor a cello was referred to in completing the test. The Written Test evaluation form is provided in Appendix F.

Rationale for the Assessment Methodology

Saunders and Holahan (1997) have observed that traditional rating instruments used by adjudicators to determine a level of quality and character (e.g., outstanding, good, average, below average, or poor) provide little diagnostic feedback. Such rating systems, including commonly used Likert scales, cause adjudicators to fall back on their own subjective opinions without providing a means to interpret the results of the examination in new ways. Furthermore,
due to their design, these rating scales are incapable of providing much in the way of interpretive response. As Saunders and Holahan observe, "knowing the relative degree to which a judge agrees or disagrees that 'rhythms were accurate,' however, does not provide a specific indication of performance capability. It is an evaluation of a judge's magnitude of agreement in reference to a nonspecific and indeterminate performance standard and not a precise indication of particular performance attainment" (p. 260).

Criteria-specific rating scales are capable of providing greater levels of diagnostic feedback because they contain written descriptors of specific levels of performance capability. A five-point criteria-specific rating scale was developed for this study to allow for greater diagnostic input from judges. Aspects of left-hand and bowing technique were evaluated using both continuous (sequentially more exacting criteria) and additive (nonsequential performance criteria) scales. Both continuous and additive scales require a judge to choose which of several written criteria most closely describe a student's performance. The additive scale was chosen when a particular technique (such as playing scalar passages) has a number of nonsequential features to be evaluated, such as evenness, good bow distribution, clean string crossings, and smooth connections between positions.

Along with the five-point criteria-specific rating scale, the Playing Test evaluation form (Appendix H) provided judges with the option of writing additional observations or comments about each technique evaluated. While these data are not quantifiable for measurement purposes, recording the judges' immediate reactions to a student's performance in their own words may capture an insight into some aspect of performance that the written criteria overlook. Because the primary purpose of this test is diagnostic, allowing room for commentary is important.
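To make the scoring scheme concrete, the sketch below shows one way the seventeen scale scores could be tabulated and totaled. It follows the point values described above (two points per level on a continuous scale, two points per descriptor checked on an additive scale), but the technique names and responses are invented placeholders, not items from the actual evaluation form.

    # Simplified illustration of the Playing Test scoring logic; technique names
    # and responses are hypothetical. The real form has 7 continuous and 10
    # additive criteria-specific scales (Appendix H).

    def continuous_score(level: int) -> int:
        """Continuous scale: one circled level (1-5), worth 2 points per level."""
        assert 1 <= level <= 5
        return 2 * level

    def additive_score(checked: list[bool]) -> int:
        """Additive scale: 2 points per descriptor checked (up to 5 descriptors)."""
        return 2 * sum(checked)

    # One student's hypothetical responses
    responses = {
        "shifting": ("continuous", 4),
        "scalar passages": ("additive", [True, True, False, True, True]),
        "spiccato": ("continuous", 3),
    }

    total = 0
    for technique, (kind, value) in responses.items():
        pts = continuous_score(value) if kind == "continuous" else additive_score(value)
        total += pts
        print(f"{technique}: {pts} points")
    print("overall playing score:", total)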


Interjudge Reliability

Two adjudicators were recruited to determine the interjudge reliability of the Playing Test. Both judges were professional cellists who teach at the college level. To decrease selection bias as a threat to external validity, the adjudicators were chosen from two different geographical regions and teaching institutions. An introductory DVD was provided, explaining how to use the Playing Test evaluation form in assessing student performances. Each judge viewed and listened to DVDs of five separate student performances of the Playing Test and rated the performances using the Playing Test evaluation form (Appendix H). Judges were asked to return the results by a specified date, using the self-addressed stamped envelope provided. The combined judges' evaluations of ten individual students were correlated with the primary investigator's evaluations of these same students.

Data Analyses

Data analyses included item analysis for both the Written and the Playing Test. The distribution of total scores was described using means and standard deviations. Item difficulty, expressed as the proportion of students who answered an item correctly, was determined. Item discrimination analysis was conducted using the point biserial correlation to reveal the strength and direction of the relationship between success on a particular item and success on the total test. Qualitative data from the Observations/Comments portion of the Playing Test were examined and compared with individual scores.

The content of the Student Self-Assessment Profile was evaluated and correlated with the data from the other sections of the test. Relationships were studied between the students' scores on the Written and Playing Test and (a) year in college, (b) major/minor distinction, (c) years of study, (d) piano experience, (e) extent and content of repertoire, (f) degree of interest in performance
areas, (g) personal goals for studying the cello, (h) expressed area of technique needing improvement, and (i) short-term and long-term goals in music.

Content Validity

The techniques assessed in this study are believed to be essential aspects of left-hand and bowing technique for a college-level student. The choice of categories for left-hand and bowing technique was based on the frequency with which these techniques are found in the repertoire for cello, as well as the discussion of them in the following sources: The Ivan Galamian Scale System for Violoncello, arranged and edited by H. J. Jensen; The Four Great Families of Bowings by H. J. Jensen (unpublished paper); Cello Playing of Today by M. Eisenberg; Cello Exercises: A Comprehensive Survey of Essential Cello Technique by F. Magg; and Dictionary of Bowing and Pizzicato Terms by J. Berman, B. Jackson, and K. Sarch.

A validation study was conducted to determine to what extent teachers felt this test measured a student's technique (Mutschlecner, 2005). Cello teachers (N = 9) on the college and college-preparatory level agreed to participate in this validity study by reading all sections of the diagnostic test and then responding to questions in an evaluation form. The results of this study are provided in Appendix B.
CHAPTER 4
RESULTS

This chapter describes the procedures used to analyze the data collected and presents the results of these analyses. Data from the Written Test, the Playing Test, and the Student Self-Assessment were collected from 30 participants in accordance with the procedures outlined in Chapter 3. The dependent variables of this study were the Written and Playing Test scores. Independent variables were (a) year in school, (b) major/minor distinction, (c) years of cello study, and (d) piano experience.

Data Analysis

Descriptive data for the scores were tabulated and disaggregated by independent variable. Data were explored using t-tests, regressions, and correlations. Regressions were used to determine the effect of the independent variables on the obtained test scores. The independent variables of major/minor distinction, year in school, and piano experience are categorical, and dummy codes were used to represent these variables in the regression analyses. Item difficulty, item discrimination, and point biserial correlations were calculated for the Written Test. Cronbach's alpha (α) was used to estimate the reliability of individual items on the Playing Test. The Spearman rank-order correlation was used as a measure of the Playing Test's validity. Interjudge reliability was calculated using Pearson's r.

Questions on the Written Test were dichotomous, and tests were scored so as to yield continuous data. The Playing Test performances were evaluated using the criteria-specific rating scale that was revised following the pilot test (see Appendix A for the Pilot Study report). Two external reliability researchers viewed and evaluated videotapes of 33% (N = 10) of the Playing Tests. These data were then correlated with the primary investigator's scores for these same student performances as a measure of interjudge reliability. The participants' cello teachers
rank-ordered their students by level of technical skill based on their assessment of the students' playing technique. These rankings were correlated with rankings based on the Playing Test results as a measure of validity. The data analysis was designed to explore the following research questions:

1. To what extent can a test of cello playing measure a student's technique?

2. To what extent can a criteria-specific rating scale provide indications of specific strengths and weaknesses in a student's playing?

3. Can a written test demonstrate a student's understanding of fingerboard geography and the ability to apply music theory to the cello?

Participants

Written and Playing Test scores, along with student answers to the questions in the Student Self-Assessment Profile, were obtained (N = 30). Participants were undergraduate music majors and minors studying cello at three private and three public universities (N = 6) in the southeastern region of the United States.

Part One: The Written Test

Scoring the Written Test

The evaluation form used to tabulate the scores for the Written Test is provided in Appendix F. Items on the Written Test were assigned points using the following system:

(1) Fingerboard Geography: 11 points (44 pitch locations to identify, divided by 4).

(2) Interval Identification: 8 points.

(3) Pitch Location and Fingering: 32 points (a single point was assigned for correctly identifying both pitch and fingering).

(4) Single-Position Fingering: 32 points.

(5) Bass, Treble, and Tenor Clef Note Identification: 12 points.

The total possible score for the combined sections of the Written Test was 95 points.
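Because Written Test items are scored dichotomously, the statistics reported for it in this chapter (interitem consistency via the Kuder-Richardson Formula 20, item difficulty as the proportion of students answering correctly, and point biserial discrimination) can all be computed from a 0/1 response matrix. The sketch below is a generic illustration with invented responses, not the study's data.

    # Generic item-analysis sketch for dichotomous (0/1) test items; the response
    # matrix is invented for illustration.
    import numpy as np

    X = np.array([            # rows = students, columns = items (1 = correct)
        [1, 1, 0, 1, 1],
        [1, 0, 0, 1, 0],
        [1, 1, 1, 1, 1],
        [0, 0, 0, 0, 0],
        [1, 1, 0, 0, 1],
    ], dtype=float)

    n_items = X.shape[1]
    totals = X.sum(axis=1)

    p = X.mean(axis=0)        # item difficulty: proportion answering correctly
    q = 1 - p
    kr20 = (n_items / (n_items - 1)) * (1 - (p * q).sum() / totals.var(ddof=1))

    # Point biserial: correlation between each dichotomous item and the total score
    r_pb = [np.corrcoef(X[:, j], totals)[0, 1] for j in range(n_items)]

    print("difficulty:", np.round(p, 2))
    print("KR-20:", round(kr20, 3))
    print("point biserial:", np.round(r_pb, 2))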


Results from the Written Test

Table K-1 (Appendix K) presents the raw scores of the Written Test items and the composite means and standard deviations. Reliability of the Written Test was estimated using the Kuder-Richardson formula, revealing the internal consistency of the test items: rKR20 = .95. This result indicates that, despite the narrow range of scores, the Written Test has strong interitem consistency.

Table 4-1 presents the data from a regression analysis for year in school (freshman, sophomore, junior, and senior) and the Written, Playing, and combined Test scores. Freshman classification emerged as a significant predictor (p < .05) for the Playing Test and combined test scores. The R-squared value of .28 indicates that freshman classification accounted for 28% of the variance in the Playing Test scores. For the combined Written and Playing Test scores, the R-squared value of .265 indicates that freshman classification accounted for 27% of the variance. With the exception of these findings, year in school does not seem to bear a relationship to technical level as measured by the Written and Playing Test.
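The categorical predictors in these analyses (year in school, major/minor distinction, and piano experience) enter the regressions as dummy codes. The sketch below illustrates the general approach with fabricated scores and a single dummy-coded predictor; it is not the study's data or its exact model.

    # Fabricated illustration of regressing a test score on a dummy-coded
    # categorical predictor (freshman = 1, all other years = 0).
    import numpy as np

    playing = np.array([70., 82., 90., 65., 88., 74., 95., 60., 85., 79.])
    freshman = np.array([1, 0, 0, 1, 0, 1, 0, 1, 0, 0], dtype=float)

    X = np.column_stack([np.ones_like(freshman), freshman])  # intercept + dummy code
    beta, *_ = np.linalg.lstsq(X, playing, rcond=None)

    fitted = X @ beta
    ss_res = ((playing - fitted) ** 2).sum()
    ss_tot = ((playing - playing.mean()) ** 2).sum()
    r_squared = 1 - ss_res / ss_tot   # proportion of variance accounted for by the dummy code

    print(f"intercept = {beta[0]:.1f}, freshman effect = {beta[1]:.1f}, R^2 = {r_squared:.2f}")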


The 30 cellists participating in this research had studied the cello between five and sixteen years (Table 4-2). A regression was conducted with years of cello study as a predictor of the scores. For the Written Test (B = .037, SE B = .069, t = .53) and the Playing Test (B = .044, SE B = .024, t = 1.82), years of cello playing was not found to be a significant predictor (p = .60; p = .08). A lack of relationship between years of cello playing and scores may reflect the wide range of students' innate ability and developmental rate. The relatively small sample size also means that outliers may have skewed the results. Efficient use of practice time is an acquired skill; it is possible for students with fewer years of experience to surpass those who, while having played longer, are ineffective in their practice.

Though no data on actual numbers of years of piano experience were collected, exactly one-half of the participants reported having piano experience, and one-half reported having no piano experience (n = 15 in each group). A t-test of the means for Written and Playing Test scores was conducted based on the participants' self-reported piano experience. Both tests were significant. Students reporting piano experience scored significantly higher on the Written Test (M = 91.93, SD = 3.08), t(30) = 115.55, p = .000, than those without piano experience (M = 78.47, SD = 12.71), t(30) = 23.92, p = .000. Students reporting piano experience also scored significantly higher on the Playing Test (M = 129.73, SD = 20.63), t(30) = 24.35, p = .000, than those without piano experience (M = 116.93, SD = 28.28), t(30) = 16.01, p = .000.
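A minimal sketch of the two analyses just described, an independent-samples t-test on the piano and non-piano groups followed by a regression with piano experience as a dummy-coded predictor, is given below. It assumes scipy is available; the score lists are placeholders, not the study's data.

    # Sketch: group comparison by t-test, then piano experience as a dummy-coded
    # predictor in a simple regression. All numbers below are placeholders.
    from scipy import stats

    piano = [92, 95, 90, 88, 94]        # hypothetical Written Test scores, piano group
    no_piano = [78, 85, 60, 82, 74]     # hypothetical scores, no-piano group

    t, p = stats.ttest_ind(piano, no_piano)          # two-sample t-test
    print(f"t = {t:.2f}, p = {p:.3f}")

    # Dummy-code piano experience (1 = piano, 0 = none) and regress scores on it;
    # squaring the correlation gives the share of variance accounted for (R^2).
    scores = piano + no_piano
    dummy = [1] * len(piano) + [0] * len(no_piano)
    result = stats.linregress(dummy, scores)
    print(f"B = {result.slope:.2f}, SE B = {result.stderr:.2f}, R^2 = {result.rvalue**2:.2f}")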


Because significant differences were found in these groups based on reported piano experience, a regression was conducted with piano experience as a predictor of the scores. For the Written Test (B = -2.00, SE B = 4.21, t = -.48), piano experience was not found to be a significant predictor. In the Playing Test (B = -19.20, SE B = 8.63, t = -2.23), piano experience emerged as a significant predictor (p < .05). The R2 value of .15 indicates that piano experience accounted for 15% of the variance in the Playing Test scores. Results are shown in Table 4-3.

Regression Analysis of Written Test Items

In the Interval Identification section of the Written Test, the mean score for those with piano experience was 7.07 out of 8 possible points, as compared with a mean of 5.73 for those without experience. Through regression analysis, piano experience was shown to be a significant predictor (p = .002) of the Interval Identification scores (B = 1.56, SE B = .41, t = 3.81). The R2 value of .528 indicates that piano experience accounted for 53% of the variance in the Interval Identification scores. This is a highly significant figure. Students with piano experience clearly are better at thinking intervallically on the cello.

For the Pitch Location and Fingering section of the test, the means were 31.13 out of 32 possible points for those with piano experience compared with 22.26 for those without. Regression analysis revealed that piano experience was nearly significant as a predictor of these scores (p = .061). Piano experience again emerged as a significant predictor (p = .002) of the Single-Position Fingering scores (B = 1.80, SE B = .47, t = 3.83). The R2 value of .53 indicates that piano experience accounted for 53% of the variance in the Single-Position Fingering scores. This section required students to look at notes vertically through a series of arpeggios and arrive at a fingering, something that pianists are frequently required to do.

Item difficulty, item discrimination, and point biserial correlations were calculated for the Written Test. Results are presented in Table 4-4. The Interval Identification section had the highest average difficulty of any section of the Written Test (a mean difficulty index of .80, the lowest proportion of correct answers among the sections). Items on the Bass, Treble, and Tenor Clef Note Identification section were found to be the least difficult. Item 23 (rpbs = 0.80) and item 31 (rpbs = 0.82) of the Pitch Location and Fingering section had the two highest correlations to the total test score. The range of difficulty indices (1.0-.80) indicates that the Written Test is not at an appropriate level of difficulty for undergraduate cellists.

Using Pearson's r, a low positive correlation was obtained between student scores on the Written and Playing Test (r2 = .16). This suggests little relationship between scores on these tests: the cognitive knowledge required to do well on the Written Test may be distinct from the psychomotor ability needed to demonstrate the techniques found in the Playing Test.
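The three item statistics reported in Table 4-4 can be obtained with a short routine like the one below. This is a generic illustration of classical item analysis, using a hypothetical item matrix and a conventional 27% upper/lower grouping; it is not the procedure used to generate the table.

    # Classical item analysis for dichotomously scored items: difficulty (proportion
    # correct), discrimination (upper-group minus lower-group proportion correct),
    # and the point-biserial correlation of each item with the total score.
    # The item matrix is hypothetical; rows are examinees, columns are items (1/0).

    def item_analysis(matrix, group_fraction=0.27):
        n = len(matrix)
        totals = [sum(row) for row in matrix]
        order = sorted(range(n), key=lambda i: totals[i])
        g = max(1, round(group_fraction * n))
        lower, upper = order[:g], order[-g:]
        mean_t = sum(totals) / n
        sd_t = (sum((t - mean_t) ** 2 for t in totals) / n) ** 0.5
        results = []
        for j in range(len(matrix[0])):
            col = [row[j] for row in matrix]
            p = sum(col) / n                                       # item difficulty
            disc = (sum(matrix[i][j] for i in upper) / g
                    - sum(matrix[i][j] for i in lower) / g)        # item discrimination
            if 0 < p < 1 and sd_t > 0:
                mean_correct = sum(t for t, c in zip(totals, col) if c) / sum(col)
                r_pbs = ((mean_correct - mean_t) / sd_t) * (p / (1 - p)) ** 0.5
            else:
                r_pbs = float("nan")                               # undefined when all right or all wrong
            results.append((p, disc, r_pbs))
        return results

    example = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 1, 1], [1, 1, 1]]
    for j, (p, d, r) in enumerate(item_analysis(example), start=1):
        print(f"item {j}: difficulty={p:.2f}  discrimination={d:.2f}  r_pbs={r:.2f}")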


56 ( rpbs = 0.80) and item 31 ( rpbs = 0.82) of the Pitch Location an d Fingering Section had the two highest correlations to the total test score. The range of difficu lty level (1.0-.80) indicates that the Written Test is not at an appropriate leve l of difficulty for undergraduate cellists. Using Pearsons r a low positive correlation was obtained between student scores on the Written and Playing Test ( r2 = .16). This suggest s little relationship between scores on these tests. This suggests that the cognitive knowledge required to do well on the Written Test may be distinct from the psychomotor ability needed to demonstrate the techniques found in the Playing Test. Part Two: The Playing Test Scoring the Playing Test A discussion of the criteria-spe cific rating scale used to score the Playing Test is found in Chapter Three. Ten techniques were evaluated us ing an additive rating scale which ranged from 0 and 10 points per item. Seven techniques were evaluated using a contin uous rating scale with a range of 2 to 10 points possible. A zero sc ore resulted from none of the criteria being demonstrated for an additive item. The total pos sible score for the combined sections of the Playing Test was 170. Results from the Playing Test Reliability was estimated by using Cronbachs Alpha to find the relationship between individual items on the Playing Test. The results ( = .92) indicate high internal consistency of test items: this suggests that the means of assessing each technique are well-matched. Table K-3 (Appendix K) presents the raw sc ores of the Playing Test items and the composite means and standard deviations. Table 4-5 lists these items from highest to lowest based on their mean scores. These data re veal that students scored highest on the detach bowing


Table K-3 (Appendix K) presents the raw scores of the Playing Test items and the composite means and standard deviations. Table 4-5 lists these items from highest to lowest based on their mean scores. These data reveal that students scored highest on the détaché bowing stroke (M = 8.46) and lowest on pizzicato (M = 6.06). Discussion of the significance of these mean scores is found in Chapter Five.

Comparison of Left Hand Technique and Bowing Stroke Scores

The total mean scores were calculated for the two sections of the Playing Test: Left Hand Technique (M = 7.21) and Bowing Strokes (M = 7.31). Students performed at a very similar level for both sections and performed uniformly; that is, higher-scoring students did well on both sections and lower-scoring students did less well on both sections.

Comparison of Playing Test Scores and Teacher-Ranking

To determine the predictive validity of the Playing Test, teachers from the six music schools participating in this research were asked to rank their students from lowest to highest in terms of their level of technique. Five of the six teachers responded to this request. These rankings were compared to the rank order based on the Playing Test scores. The results are shown in Table 4-6. Two teachers (Schools A and B) ranked their students in exactly the same order as the Playing Test ranking (rs = 1.0). Using the Spearman rank-order correlation, the correlations for the other three schools that responded were positive and strong (rs = 0.65, 0.84, and 0.76, respectively). Results indicate that students' performance on the Playing Test closely corresponds to the level of their technique as perceived by their teachers. The Playing Test is criterion-referenced and not designed to be used as a norm-referenced test. However, the strong positive correlations of the teachers' rank order of their students to the rank order of the scores on the Playing Test suggest that this measure is a valid means of determining undergraduate cello students' technical ability.
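For reference, both correlations used in this part of the analysis can be computed in a few lines. In the sketch below the teacher-ranking numbers and Playing Test scores are placeholders, while the paired judge totals are the five Judge A pairs reported in Table 4-7; scipy is assumed to be available. The Pearson computation is the one used for the interjudge comparison in the next section.

    # Spearman rank-order correlation between a teacher's ranking and the ranking
    # implied by Playing Test scores, plus Pearson's r between two raters.
    from scipy import stats

    teacher_rank = [1, 2, 3, 4, 5]              # placeholder: teacher's ordering, lowest to highest
    playing_scores = [92, 116, 110, 128, 146]   # placeholder: the same students' Playing Test scores

    rho, p_value = stats.spearmanr(teacher_rank, playing_scores)
    print(f"Spearman rs = {rho:.2f} (p = {p_value:.3f})")

    # Interjudge reliability: the five paired totals from the Judge A panel of Table 4-7.
    primary = [152, 136, 156, 144, 134]
    judge_a = [162, 142, 158, 142, 136]
    r, p = stats.pearsonr(primary, judge_a)
    print(f"Pearson r = {r:.2f} (p = {p:.3f})")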


Interjudge Reliability of the Playing Test

Two judges were recruited to evaluate five different student performances of the playing test, as described in Chapter Three. Interjudge reliabilities were calculated using Pearson's r. Correlations were as follows: Judge A and the primary investigator (r = 0.92); Judge B and the primary investigator (r = 0.95). These results are presented in Table 4-7. The students observed by Judges A and B represented 33% of the total number of students participating. These data, with their strong correlations, appear to confirm the effectiveness of the criteria-specific rating scale used in this study as a means of recording information about specific strengths and weaknesses in a student's playing.

Part Three: The Student Self-Assessment Profile

The Student Self-Assessment Profile (SSAP) was created as another means to gather diagnostic information about students. Many teachers have developed questionnaires to better understand the performance background of their students. The self-assessment used in this study served this function, as well as providing additional information about areas of performance interest and personal goals. In addition, the SSAP allows students to comment on what aspects of technique they feel they need to improve. Twenty-nine of the thirty students participating in this study completed the Student Self-Assessment Profile. The following subheadings represent sections of the Student Self-Assessment Profile.

Repertoire Previously Studied

Students listed many of the standard method and etude collections for the cello: Cossmann, Dotzauer, Duport, Feuillard, Franchomme, Piatti, Popper, Ševčík, Starker, and Suzuki. Pieces from the standard literature for cello were listed. For a teacher, such information shows the extent and breadth of a new student's experience and may indicate appropriate directions for further study.


How interested are you in each of these areas of performance: Solo, Chamber, and Orchestral?

Table 4-8 lists students' responses to this question. Eighty-three percent of the students stated they either agreed or strongly agreed to having interest in solo and orchestral performance, and ninety-three percent expressed the same for chamber music. Noting responses to this section could be a means for teachers to initiate discussion with students about their plan of study. If a student's greatest interest was in playing chamber music, his teacher might help to facilitate this desire. Knowing that a student's primary goal was to win an orchestral audition would dictate in part the choice of repertoire studied.

Other areas of performance interest?

Students listed the following areas of performing interest: jazz (n = 2), conducting (n = 1), piano accompanying (n = 1), choir (n = 1), improvisation (n = 1), bluegrass (n = 1), praise bands (n = 1), and contemporary performance (n = 2). Teachers provided with this information might choose to direct students to nontraditional sources of study, such as improvisation methods, learning to read chord charts, or playing by ear.

What are your personal goals for studying the cello?

Responses to this question are provided in Table 4-9. Five out of the twenty-nine students (17%) listed teaching privately as a goal for study. The second most frequently mentioned goal was orchestral performance (10%). If this study were conducted at the highest-ranking music conservatories in the United States, the researcher suspects that solo performance might be frequently mentioned as well.


What areas of cello technique do you feel you need the most work on?

Answers to this question are presented in Table 4-10. Bow stroke was mentioned by ten students as needing the greatest attention. Nine students discussed the need to work on relaxation as they played, specifically referring to left and right hand, shoulder, and back tension. Many of the techniques assessed in the Playing Test were alluded to, such as spiccato bowing or thumb position. The specificity of many of the areas of technique mentioned may have been due to the students' filling out the SSAP after having taken the Playing Test. The difficulty students had with playing certain passages caused them to list these techniques as ones to work on. This appears to be anecdotal evidence that the playing test can cause students to be more self-aware.

Summarize your goals in music and what you need to accomplish these goals.

In answering this question, students described their broad musical objectives, often discussing career goals. The goals in music were to be written for six-month, one-, two-, four-, and ten-year intervals, but not all students completed each sub-category. Table 4-11 presents the responses to this section in the students' own words. Many of the goals implied an understanding between the teacher and the student, such as a two-year goal of memorizing a full concerto. Acquiring advanced degrees was a goal for two of the students. One student's six-month goal was to practice "more musically than technically." Without agreement between the teacher and student on such a goal, conflicts could arise: what if the teacher felt the next six months were best spent drilling technique?

One student's four-year goal was "To get past the prelims in an orchestral audition." The Student Self-Assessment Profile would help to assure that the teacher was privy to this information. One music major's long-term goal was to "play recreationally, not as a career." This belies the assumption that every music major is planning on a career in music.


Access to this kind of information could prevent misunderstandings from developing between a teacher and student as a result of conflicting goals.

Summary of Results

The following summarizes the results obtained in these analyses:

1. The Written Test was found to be too easy for most undergraduate cellists. Lower scores in the Interval Identification section indicate that some students have difficulty applying their understanding of intervals to the cello.
2. Strong interitem consistency was found for the Playing Test, indicating high reliability for this section of the test.
3. Year in school was a significant predictor of Playing Test scores and combined scores for freshman students.
4. Music performance majors' scores did not differ significantly from scores earned by students in other degree programs.
5. The number of years a student had played the cello was not found to be a significant predictor of the Written or Playing Test scores.
6. Piano experience was found to be a significant predictor of Playing Test scores, and of scores on two sections of the Written Test.
7. Playing Test scores were a significant predictor of how teachers would rank their students in terms of level of technique.
8. The criteria-specific rating scale developed for this study appears to be a highly reliable measurement tool, based on interjudge reliability.


Table 4-1. Summary of Regression Analysis for Year in School as a Predictor of Written, Playing, and Total Test Scores (N = 30)

Score                    B        SE B     t
Written Test
  Freshmen (n = 11)      .0069    .0079     .88
  Sophomore (n = 8)      .0060    .0074     .82
  Junior (n = 5)         .0085    .0060   -1.40
  Senior (n = 6)         .0031    .0067   -0.47
Playing Test
  Freshmen               .010     .003     3.30*
  Sophomore              .0058    .0032   -1.83
  Junior                 .0032    .0028   -1.16
  Senior                 .0014    .0030   -0.46
Total Score
  Freshmen               .009     .0027    3.18*
  Sophomore              .0038    .0029   -1.32
  Junior                 .0040    .0024   -1.67
  Senior                 .0009    .0027   -0.34

Note. Written Test scores: R2 = .027 Freshmen; R2 = .023 Sophomore; R2 = .065 Junior; R2 = .008 Senior. Playing Test scores: R2 = .280 Freshmen; R2 = .107 Sophomore; R2 = .046 Junior; R2 = .008 Senior. Total Test scores: R2 = .265 Freshmen; R2 = .058 Sophomore; R2 = .091 Junior; R2 = .004 Senior.
*p < .05


Table 4-2. Years of Study, Frequency, and Test Scores

Years of Study   Frequency   Written Test Mean Score   Playing Test Mean Score
 5                1           91                        144
 6                1           91                        114
 7                6           83                        108
 8                4           75.5                      101.5
 9                2           93                        126
 9.5              1           95                        142
10                2           93.5                      140
11                7           87.71                     141.1
11.5              1           68                        140
12                3           81.31                     108.7
13                1           93                        156
16                1           87                        104


Table 4-3. Summary of Regression Analysis for Piano Experience as a Predictor of Written, Playing, and Total Test Scores (N = 30)

Test Section             B         SE B     t
Written Test Scores       -2.00     4.21     -.48
Playing Test Scores      -19.20     8.63    -2.23*
Total Combined Score     -21.20    11.74    -1.81

Note. R2 = .008 for Written Test Scores; R2 = .15 for Playing Test Scores; R2 = .10 for Total Test Scores.
*p < .05


Table 4-4. Item Difficulty, Discrimination, and Point Bi-Serial Correlation for the Written Test

Category                    Item Number   Item Difficulty   Item Discrimination   Point Bi-Serial Correlation
Fingerboard Geography        1             .97               .13
Interval Identification      1             .77               .25                   0.15
                             2             .87               .38                   0.26
                             3             .80               .50                   0.37
                             4             .77               .13                   0.06
                             5             .77               .25                   0.08
                             6             .70               .38                   0.06
                             7             .90               .25                   0.49
                             8             .83               .50                   0.40
Pitch Location               1             .93               .25                   0.63
and Fingering                2             .93               .25                   0.63
                             3             .90               .25                   0.52
                             4             .90               .25                   0.49
                             5             .87               .38                   0.41
                             6             .93               .25                   0.47
                             7             .90               .38                   0.57
                             8             .80               .50                   0.50
                             9             .97               .13                   0.78
                            10             .90               .25                   0.63
                            11             .77               .38                   0.75
                            12             .87               .38                   0.40
                            13             .90               .38                   0.63
                            14             .83               .50                   0.63
                            15             .93               .13                   0.33
                            16             .87               .38                   0.63
                            17             .83               .50                   0.70
                            18             .83               .50                   0.70
                            19             .87               .38                   0.71
                            20             .87               .38                   0.70
                            21             .90               .38                   0.75
                            22             .90               .38                   0.70
                            23             .90               .38                   0.80
                            24             .90               .38                   0.65
                            25             .90               .38                   0.71


Table 4-4. (continued)

Category                    Item Number   Item Difficulty   Item Discrimination   Point Bi-Serial Correlation
Pitch Location              26             .80               .50                   0.71
and Fingering               27             .83               .50                   0.73
                            28             .80               .50                   0.73
                            29             .83               .50                   0.74
                            30             .83               .50                   0.74
                            31             .83               .50                   0.82
                            32             .80               .50                   0.76
Single Position Fingering    1             .97               .13                   0.07
                             2             .97               .13                   0.07
                             3             .97               .13                   0.07
                             4             .97               .13                   0.07
                             5            1.0                0.0                   N/A
                             6            1.0                0.0                   N/A
                             7            1.0                0.0                   N/A
                             8            1.0                0.0                   N/A
                             9             .97               .13                   0.43
                            10             .97               .13                   0.43
                            11             .83               .38                   0.16
                            12             .80               .38                   0.15
                            13             .93               .13                   0.23
                            14             .90               .25                   0.16
                            15             .97               .13                   0.36
                            16             .97               .13                   0.36
                            17             .83               .13                   0.06
                            18             .83               .25                   0.32
                            19             .90               .25                   0.40
                            20             .87               .25                   0.35
                            21             .93               .13                   0.23
                            22             .93               .13                   0.23
                            23             .93               .13                   0.23
                            24             .93               .13                   0.23
                            25             .90               .25                   0.23
                            26             .87               .13                   0.12


Table 4-4. (concluded)

Category                    Item Number   Item Difficulty   Item Discrimination   Point Bi-Serial Correlation
Single Position Fingering   27             .93               .25                   0.31
                            28             .87               .25                   0.18
                            29             .93               .13                   0.23
                            30             .93               .13                   0.23
                            31             .97               .13                   0.36
                            32             .97               .13                   0.36
Bass, Treble, and Tenor
Clef Note Identification     1            1.0                0.0                   N/A
                             2             .97               .13                   0.46
                             3             .97               .13                   0.43
                             4            1.0                0.0                   N/A
                             5            1.0                0.0                   N/A
                             6             .97               .13                   0.02
                             7             .97               .13                   0.0
                             8            1.0                0.0                  -0.04
                             9             .93               .13                   0.08
                            10             .93               .13                   0.27
                            11            1.0                0.0                  -0.05
                            12             .90               .13                   0.01

Note. Point biserial correlations were not found for the Fingerboard Geography items, as 97% of the students had perfect scores on this section.


Table 4-5. Mean Scores of Playing Test Items in Rank Order

Item               Rank Order   Mean Score
Détaché             1            8.47
Slurred Legato      2            8.23
Arpeggios           3            8.13
Staccato            4            7.93
Vibrato             5            7.93
Portato             6            7.67
Position Changes    7            7.67
Scales              8            7.60
Arp. Chords         9            7.20
Sautillé           10            7.13
Thumb Position     11            7.00
Broken Thirds      12            6.80
Martelé            13            6.67
Double Stops       14            6.40
Spiccato           15            6.30
Intonation         16            6.20
Pizzicato          17            6.00

Note. Ratings ranged from 2 through 10.


Table 4-6. Comparison of Teacher-Ranking to Playing Test-Ranking

           Teacher Ranking   Playing Test Score   Playing Test Ranking   rs
School A    1                  76                  1                     1.0
            2                 102                  2
            3                 136                  3
            4                 152                  4
            5                 156                  5
School B    1                 124                  1                     1.0
            2                 140                  2
            3                 140                  3
            4                 148                  4
            5                 152                  5
School C    1                 100                  1                     0.65
            2                 142                  4
            3                 134                  2
            4                 134                  3
School D    1                  92                  1                     0.84
            2                 116                  2
            3                 116                  3
            4                 128                  4
            5                 146                  6
            6                 152                  7
            7                 132                  5
School E    1                  76                  1                     0.76
            2                  86                  3
            3                 114                  4
            4                 120                  5
            5                  82                  2
            6                 144                  7
            7                 140                  6


Table 4-7. Comparison of Researcher's and Independent Judges' Scoring of Student Performances of the Playing Test

Student No.   Primary Investigator   Judge A
 1             152                    162
 2             136                    142
 3             156                    158
 4             144                    142
 5             134                    136
M              144.4                  148
SD               9.63                  11.31
r                0.92

Student No.   Primary Investigator   Judge B
 6             146                    138
 7             152                    152
 8             128                    104
 9             116                     98
10             152                    134
M              138.8                  125.2
SD              16.09                   8.67
r                0.95


Table 4-8. Numbers of Students Expressing Interest in Solo, Chamber, and Orchestral Performance (N = 29)

Category         Strongly Agree   Agree   Disagree   Strongly Disagree
Solo              10               14       5          0
Chamber Music     20                7       2          0
Orchestral        16                8       4          1

Note. Students could indicate interest in multiple categories, resulting in totals exceeding the number of students completing the form (N = 29).


Table 4-9. Personal Goals for Studying the Cello

Specified Goal                                               Frequency Mentioned (N = 29)
Teaching privately                                            5
Orchestral performance                                        3
Chamber music performance                                     2
Expand repertoire                                             2
Lifelong hobby, personal enjoyment                            2
College-level teaching                                        1
Obtain advanced degrees with the goal of college teaching     1
Improve concentration                                         1
Become a fluid improviser                                     1
Work as a studio musician                                     1
Ability to convey interpretation of music to others           1


Table 4-10. Student Perception of Priorities for Technical Study

Technique                                                         Frequency Mentioned
Bow stroke                                                         10
Relaxation, including right and left hand, shoulders, and back     9
Vibrato                                                             4
Vibrato in upper positions                                          2
Thumb position                                                      3
Musicality                                                          3
Sound production/tone                                               2
Double stops                                                        2
Sautillé                                                            2
Sight-reading                                                       1
Reading in different clefs                                          1
Rhythm                                                              1
Coordination between right and left hand                            1
Proper employment of left hand position and whole arm movement      1
Extensions                                                          1
Shifting                                                            1
Spiccato                                                            1


Table 4-11. Goals in Music and Means of Accomplishing Them

Six Months:
  Catch up to my peers.
  To shift easily.
  Work strictly on technique, not worrying about pieces or recitals.
  Practice more musically than technically.
  Have lessons with other teachers.
  Improve jazz vocabulary.

One Year:
  Keep my scholarships.
  To have perfect intonation.
  Become an effective music educator (lifelong).
  Resolve all tension issues; slow, loose practice, making it a habit.
  Increase in difficulty of music.
  Work on awareness of bowing choices.
  Practice.

Two Years:
  To be able to support myself solely through playing and teaching.
  I hope to memorize a full concerto and feel comfortable performing. Much practice; memorization and performance practice will be needed.
  Graduate, and find a graduate school with a fabulous teacher.

Four Years:
  To get past the prelims in an orchestral audition.
  To graduate, get a job as a music therapist, and join the community of a professional orchestra.
  Play recreationally, not as a career.

Ten Years:
  To be a guest artist at a major music festival.
  Be teaching at a university with a Ph.D. in music.
  Be employed in a high school as a music teacher, but still make time to perform and possibly give private lessons.
  Able to teach other cellists.
  Gigging professionally.
  Be a financially stable musician.


CHAPTER 5
DISCUSSION AND CONCLUSIONS

This chapter presents a discussion of the results of administering the Diagnostic Test of Cello Technique. Following a review of the purposes and procedures of this study, the findings are addressed in light of (a) the research questions posed, (b) a comparison of results with similar studies, and (c) implications for string education. This chapter closes with conclusions and recommended directions for future research.

Overview of the Study

The purpose of this study was to design, validate, and administer a diagnostic test of cello technique for use with college-level students. Written and playing tests were designed and pilot tested, and a validity study was undertaken. Thirty students from six different universities in the southeastern United States were recruited to participate in this research. Each student completed a written test, a playing test, and a self-assessment profile. A criterion-based rating scale was developed to evaluate the Playing Test performances. Two university-level teachers were recruited to judge ten video-taped performances of students taking the Playing Test. Evaluations from those judges were correlated with the primary researcher's evaluations to determine interjudge reliability.

Review of Results

The independent variables in this study were (a) year in school, (b) major/minor distinction, (c) years of cello study, and (d) piano experience. Freshman classification emerged as a significant predictor of Playing Test scores (p = .003) and total scores (p = .004). No effect of major/minor distinction was found for the Written Test (R2 = .001). Results were nearly significant for the Playing Test (R2 = .104) and not significant for the combined Written and Playing Tests (R2 = .072). Years of cello study was not a significant predictor of test results.


Piano experience was shown to have a significant effect on the Playing Test scores (p = .034). Students with piano experience scored 14% higher on the Written Test and 7% higher on the Playing Test than those without piano experience. The reliability of the Playing Test was high, as shown by coefficient alpha (α = .92). Correlation coefficients obtained between the primary researcher and the two reliability judges were positive and strong (Judge A, r = 0.92; Judge B, r = 0.95), suggesting that the criteria-specific rating scale designed for this study was effective.

Observations from the Results of Administering the Diagnostic Test of Cello Technique

The Written Test

Future versions of the Written Test designed for college students should eliminate the Fingerboard Geography section, as only one student made errors in filling out this section. This section should be included in a high school version of the test; the likelihood is that not all students at that level would be clear about the location of pitches on the fingerboard.

The Interval Identification section as a whole had the highest average difficulty of any section of the Written Test based on item analysis. In this section, item six (a major sixth across two strings) had the highest difficulty of any item on the test (a difficulty index of .70). This item, however, did not discriminate well between high-scoring and low-scoring students (.38). On this item, students most likely erred by not keeping in mind that on the cello, the interval between two notes lying directly across from each other on adjacent strings is always a perfect fifth; adding a whole step to a perfect fifth results in the interval of a major sixth. This is an example of something universally known by undergraduate cello students but not necessarily visualized by them on the fingerboard. This suggests that some students were either unclear about interval designations or that they do not think intervallically when playing the cello.
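The fifth-plus-a-whole-step reasoning above is ordinary semitone arithmetic, and a small sketch makes it concrete. The function below is purely an illustration of that arithmetic (adjacent cello strings are tuned a perfect fifth, or seven semitones, apart); it is not part of the Written Test materials, and its names are hypothetical.

    # Interval arithmetic for notes played across adjacent cello strings.
    # Adjacent strings are tuned a perfect fifth (7 semitones) apart, so a note
    # directly across from another lies a fifth away; each semitone the finger
    # moves up the higher string adds to that interval.
    INTERVAL_NAMES = {
        7: "perfect fifth",
        8: "minor sixth",
        9: "major sixth",
        10: "minor seventh",
        12: "octave",
    }

    def interval_across_strings(semitones_above_directly_across):
        """Name the interval between a note and one on the next-higher string."""
        total = 7 + semitones_above_directly_across
        return INTERVAL_NAMES.get(total, f"{total} semitones")

    print(interval_across_strings(0))   # directly across -> perfect fifth
    print(interval_across_strings(2))   # a whole step higher -> major sixth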


It is the researcher's opinion that an awareness of intervals while playing represents a higher order of thinking than simply playing note by note. Additional research is needed to determine to what extent intervallic thinking while playing the cello is a distinguishing characteristic of advanced performers.

In the Interval Identification section of the Written Test, the mean score for those with piano experience was 7.07 out of 8 possible points, as compared with a mean of 5.73 for those without experience. Piano experience was found to be a significant predictor for this item (p = .002). Students who play piano are able to identify intervals more easily on a representation of a cello fingerboard than those without piano experience. In the Single-Position Fingering section, piano experience again was found to be a significant predictor of a student's score (p = .002). This suggests that students with piano experience may think more clearly about vertical pitch relationships. String instrument teachers would likely concur, observing that their students who play piano tend to: 1) be better sight readers, 2) have a clearer sense of pitch and intervals, and 3) have better rhythmic accuracy. Additional evidence of the positive effect of piano experience on cello performance would be gained through studies that compared students' length of time studying both instruments to their performance on the Playing Test.

The Single Position Fingering section may be unclear in its directions. Several students thought they were being asked for a fingering that would allow the notes to be played as a simultaneous chord, which was not possible for some items. The final section (Note Identification in Three Clefs) had several very low point biserial correlations (0.07). Errors in this section were almost certainly caused by carelessness and did not reflect a student's ability in note reading. The single exception was a student who missed all the tenor clef items but got all the other note identification items right. Complete fluency in note reading is an essential prerequisite for sight-reading ability. As a result, this section should be included in future versions of this test.


The Written Test needs to be revised for undergraduate students in terms of difficulty level. A greater range of scores would likely result if the present version of the test were administered to high school students. In future versions, using actual passages from the cello repertoire to evaluate a student's understanding of intervals, fingering, and fingerboard geography would be in keeping with the testing philosophy of using situated cognition.

The Playing Test

Left Hand Technique (nine items) and Basic Bowing Strokes (eight items) were evenly dispersed within the range of lowest to highest mean scores (Table 4-5). The choice in this study to divide technique into left hand techniques and bowing techniques does not reflect how integrated these two areas are in reality. This study's design did not isolate bow techniques from the musical context in which they are found. If such a study were conducted, it might reveal that some students excel in bowing techniques and others in left hand technique. These two areas of technique are so intermeshed that it would be difficult to isolate them. Bowing serves literally to amplify what the left hand does. Development of bowing skill through practice on open strings, without using the left hand, is limited and is usually, though not always, confined to initial lessons.

The Playing Test's mean scores revealed that students scored highest on the détaché bowing stroke (M = 8.46), followed by legato bowing (M = 8.33) and arpeggios (M = 8.13). Détaché is the most commonly used bow stroke; legato playing is nearly as ubiquitous. One might have expected to find scales, broken thirds, and arpeggios grouped together in the same difficulty category. These three areas of technique are considered the core left hand techniques; indeed, most music is composed of fragments of scales, arpeggios, broken thirds, and sequential passages.


The excerpts used in the Playing Test to evaluate scales may have been more challenging to perform than the arpeggios; this may partially explain why scales were not among the easier items. Another explanation may be that scales are the first item on the test. Initial nervousness or stage fright may have affected this item more than subsequent ones. The researcher noted that most students seemed to become free of nervousness shortly after commencing the test, but these initial jitters may have had a deleterious effect on their performance of the first item.

In the Pilot Study (Appendix A), broken thirds were the fourth most difficult item. It was conjectured that broken thirds are under-assigned by teachers and, as a result, not practiced much. In the present study, broken thirds again were found to be difficult for students to demonstrate. The ability to play (and especially to sight read) broken thirds requires two skills: 1) the capacity to quickly discern whether a written third is major or minor, and 2) an accurate sense of interval distances on a given string. The correlation of students' scores on broken thirds to their total Playing Test scores was strong (r = .81), suggesting that students' ability to perform well in this area may be a good indicator of their overall level of technique.

The difficulty of demonstrating a given technique through excerpts varies. Spiccato bowing, the third lowest score (M = 6.3), requires a succession of separately bowed notes played rapidly enough that the bow bounces off the string almost of its own accord. This is not a technique that is easily demonstrated unless the player is very familiar with the notes. Sautillé bowing, another bounced-bow stroke (M = 7.13), appears to be slightly easier than spiccato. Though sautillé bowing requires a faster bow speed than spiccato, the repetition of pitches meant the speed of fingering changes was actually slower for these passages, and thus easier to play.


The relatively low score for martelé bowing is likely due to a lack of understanding as to what constitutes this bow stroke. The two excerpts used for this item were moderately easy to play. A large number of students, however, did not demonstrate the heavily accented articulation and stopping of the bow on the string that characterize this stroke. While many method books include a description of martelé bowing, students are unlikely to have a clear grasp of how to execute this bowing unless it is demonstrated by a teacher.

The item with the lowest score was pizzicato (M = 6.06). The excerpts chosen featured three separate techniques: (a) arpeggiated chords using the thumb (Elgar), (b) notes with a strong, vibrant tone (Brahms), and (c) a clear, ringing sound in the upper register (Kabalevsky). These excerpts were not easy to sight read for students who were ill-prepared. This was the final section in a series of excerpts requiring great concentration; mental and/or physical fatigue may have been a factor. It is also possible that the study of pizzicato is neglected in lessons.

Intonation received the second lowest score (M = 6.20). Judge B assigned the only perfect score given to a student. It is axiomatic that string players must be constantly vigilant about playing in tune. Not allowing students to become tolerant of playing out of tune is one of the essential roles of the teacher. Pablo Casals' words on this subject are timeless: "Intonation," Casals told a student, "is a question of conscience. You hear when a note is false the same way you feel when you do something wrong in life. We must not continue to do the wrong thing" (Blume, 1977, p. 102). Five students (15%) mentioned intonation when asked, "What areas of cello technique do you feel you need the most work on?" (see Chapter 4, p. 63). From this study it appears the Playing Test may help make students more aware of the importance of work on intonation.


The Student Self-Assessment Profile

The premise for designing the Student Self-Assessment Profile is that better information about a student's background, interests, and goals for study can result in more effective teaching. Its value as a diagnostic tool is in revealing a student's years of study, previous repertoire studied, and playing experience. The emphasis on identifying personal goals for studying the cello, as well as overall goals in music, opens a window into a student's self-awareness. Communication of these goals to a teacher can affect the course of study. Allowing students' goals to influence their education may result in their feeling more invested in the learning process. The outcome may be more effective, goal-directed practice. Students are more likely to be motivated by goals that they perceive as being self-initiated. Awareness of these goals is not necessarily derived from conventional teaching methods; it comes from a dialogue between the teacher and student. The Student Self-Assessment Profile can act as a catalyst for such a dialogue.

The personal goal for studying the cello most often mentioned was teaching privately (Table 4-9). When a teacher knows that a student wants to teach the cello as a vocation, his role becomes more that of a mentor, exemplifying for the student the art of teaching. A greater role for discussion during the lesson may ensue as the need for various approaches to problems becomes apparent. Perhaps the most important thing a teacher can provide a student aspiring to teach is help in becoming reflective about their own playing, asking themselves why they do something a certain way. Questions that ask why, rather than how, take precedence. Two students mentioned college-level teaching as one of their personal goals. Providing student-teaching opportunities for these students, as well as opportunities to observe experienced teachers at work, would be invaluable.


Goals such as orchestral or chamber music performance could have a direct effect on the program of study if the teacher agreed that these objectives were appropriate and attainable. A student who has expressed a sincere goal of playing professionally in a major orchestra deserves to know both the playing standards required and the fierce competition involved. A serious attempt to address some of the personal goals mentioned here would challenge even the most veteran of teachers. How do you help a student improve concentration? Become a fluid improviser? Convey their interpretation of music to others? Addressing these goals as a teacher means taking risks, varying one's approach, and being flexible.

Over one third of the students who filled out the Student Self-Assessment Profile listed bow stroke as a priority for technical study (Table 4-10). They are in good company; string musicians agree that true artistry lies in a player's mastery of the bow. Musical issues such as phrasing, dynamics, and timing are the bow's domain. A cellist's approach to bowing largely determines their tone and articulation. These qualities, along with vibrato, are the distinguishing characteristics of an individual cellist's sound.

After bow stroke, the most commonly noted area of technique was relaxation, or lowering body tension. This is an aspect of technique that musicians have in common with athletes. Gordon Epperson summarized the observations of many teachers: "What is the chief impediment to beauty of sound, secure intonation, and technical dexterity? I should answer, without hesitation, excess tension... Sometimes tension alone is blamed; but surely, we can't make a move without some degree of tension. It's the excess we must watch out for" (Epperson, 2004, p. 8). Excessive tension may not always be readily apparent; teachers may not realize students are struggling with this area unless the issue is raised. Students who mention excessive tension while playing as a major concern should be directed to a specialist in Alexander Technique.


Work on vibrato, either in general or in upper positions, was mentioned by six students. Despite sounding like an oxymoron, it is true that an effortless-sounding vibrato is very difficult to make. Dorothy DeLay, of the Juilliard School of Music, assigned the first hour of practice to be spent on articulation, shifting, and vibrato exercises for the left hand, and various bow strokes for the right (Sand, 2000). Students who express a desire to develop their vibrato should be guided with appropriate exercises, etudes, and solos.

Other areas of technique are far more easily addressed. A student who mentions sight-reading or reading in different clefs can easily be directed to materials for study. Applying oneself to the exercises in Rhythmic Training by Robert Starer will benefit any student who feels deficient in rhythm (Starer, 1969). There are materials to address virtually every technical need, as long as the need is made apparent to the teacher.

The final question of the SSAP asks, "Summarize your goals in music and what you need to do to accomplish these goals." The words with underlined emphasis were added based on input from the Validity Study (Appendix B). This phrase is meant to suggest a student's personal responsibility to follow through with their stated goals.

Table 4-11 is a transcription of student responses to this question in their own words. Six-month goals are short term and reflect a student's semester-long objectives. "Work strictly on technique, not worrying about pieces or recitals" is one example. Some one-year goals seem naïve: "To have perfect intonation." Goals are the driving forces behind one's outward acts; playing with perfect intonation may not be attainable, but that does not mean it is not a valid aspiration. One student has shown they understand the need to make some aspects of playing virtually automatic through repetition: "Resolve all tension issues; slow, loose practice, making it a habit." Music and athletics have in common the need for drilling desired actions.


As Aristotle noted, "We are what we repeatedly do. Excellence, then, is not an act, but a habit" (Aristotle, trans. 1967).

Goal setting is most effective when it is measurable, as with a student's two-year goal of memorizing a full concerto. Academic ambitions, such as pursuing graduate studies, are important to share with one's teacher and can dictate a student's course of study. Teachers may occasionally be surprised in reviewing their students' long-term goals: one performance major stated that her goal as a cellist was to "play recreationally, not as a career." However, most four-year and ten-year goals were career-oriented. There is value in having students express these goals concretely; through this activity, students visualize doing something they are presently not able to do. Goal setting requires a leap of faith.

Discussion of Research Questions

In this section the original research questions are reexamined in light of the results. These questions are restated below, with discussion following.

To what extent can a test of cello playing measure a student's technique?

The extent to which the Playing Test is able to measure an individual cello student's technique depends on the way a teacher uses it. If students are strongly encouraged by their teacher to practice the excerpts and are expected to play from them in the lesson, testing error resulting from unfamiliarity with the music and sight-reading mistakes can be minimized. The results can then come much closer to a true diagnosis of a student's technical level. The comparison of teacher-rankings to Playing Test rankings (Table 4-6) revealed a high correlation and tended to confirm the test's validity. It is possible that, in some cases, ranking differences occurred due to a teacher's bias based on his or her estimation of a student's potential. As one teacher noted in discussing a student's rating: "It pains me to make this assessment, as I confirm that (student) has far underperformed both her stated aspirations and potential the last several years" (personal correspondence, May 2007).


One of the primary purposes of this test was to provide a tool that allows greater diagnostic objectivity, thereby providing a counterbalance to the subjective impressions that a teacher receives about each student. Each technique is represented by several excerpts of increasing difficulty. On those items using an additive scale, the listener can find a descriptive statement that corresponds to the performance level a given student has demonstrated. In thirty minutes of concentrated listening, the teacher/evaluator is able to come to definite conclusions about a student's ability to demonstrate seventeen essential areas of technique. As the Playing Test is made up of excerpts from the standard repertoire for cellists, the teacher is also given insight into what pieces are appropriate for study.

To what extent can a criteria-specific rating scale provide indications of specific strengths and weaknesses in a student's playing?

Interjudge reliability was positive and strong (Judge A r = 0.92, Judge B r = 0.95), suggesting that the criteria-specific rating scale designed for this study was an effective means of directing the evaluator to listen and watch for specific aspects of technique. A factor analysis of the descriptive statements generated for the Playing Test evaluation form is recommended. Statements found to have low factor loadings could be replaced, and the reliability of this measure could be increased. One example where improvement might be made is in the criteria choices provided for vibrato. There were students who did not really match any of the descriptors provided for this item; their vibrato was not tense or too fast but, to the contrary, was unfocused and too slow.


Can a written test demonstrate a student's understanding of fingerboard geography, and the ability to apply music theory to the cello?

The answer to this research question is a provisional yes, noting that the results of such a test do not necessarily predict how well a student plays. Additional research is needed to determine to what degree intervallic understanding or fingerboard visualization is part of the practical core knowledge of an advanced cellist.

While scores on the Written Test ranged from 62% to 100% correctly answered, the difficulty level for all items was found to be low. However, it is good that the Fingerboard Geography section was filled out flawlessly by 29 out of the 30 students. Any real confusion here would be a signal that something was seriously lacking in a student's understanding of half steps, the chromatic scale, or the relationship of strings tuned a fifth apart from each other. The Written Test may be seen as a kind of barrier examination; if students score below 90%, review of these content domains is indicated. Item difficulty could be increased by more challenging interval identification and pitch location items.

Perhaps a means to achieve more authentic assessment of fingering skills would be to have students provide fingerings for passages from actual orchestral, chamber, or solo music written for the cello. The challenge in this would be the number of acceptable choices. Nevertheless, a teacher might gain more insight into a student's situated cognition, that is, the thinking process at the cello, by using this approach. Ensuing results could become the basis for discussion about why one fingering might be better than another.

The point biserial correlations from the Interval Identification section indicate that some students who otherwise had high scores were less successful on this section. However, seven of the nine students who made perfect scores on this section were also the top scorers on the whole test. Of the nine students who correctly identified all the intervals, eight had piano experience.


Piano experience emerged as a significant predictor of students' scores on the Interval Identification section through regression analysis (p = .002). It is suspected that discussions of intervals rarely occur in the teaching of string instruments. A student's understanding of intervals derived from music theory classes may not automatically transfer to the cello fingerboard and cello music. The use of fingerboard representations to test interval understanding may have favored visual learners. This test does not extend beyond mere interval identification to the more important skill of seeing a written interval and being able to imagine how it will sound. This skill, traditionally tested in sight-singing classes, is very valuable to instrumentalists but is often underdeveloped. Future versions of the test might include having students play passages based on a series of intervals rather than given pitches.

Students' Written Test scores do not have a strong correlation to their Playing Test scores (r2 = 0.16). The Written Test may measure a theoretical understanding that, while valuable, does not directly influence a student's demonstration of the techniques found in the Playing Test. A comparison of students' scores on the Written Test and a sight-reading test such as the Farnum String Scale (Farnum, 1969) might be found to have a higher correlation. Pitch Location and Fingering, as well as the Single Position Fingering section, require the students to demonstrate a skill that is required for effective sight reading, namely, coming up with efficient fingerings.

Additional research is needed to explore to what extent an understanding of fingerboard geography and music theory, as applied to the cello, affects a student's playing. It can be hypothesized that there is a cognitive skill set that accompanies the psychomotor skills of string playing. A better understanding of the kind of situated cognition required to think well on a string instrument would be valuable to teachers and students.


Observations on the Playing Test from Participating Teachers

After the research for this study was complete, participating teachers were asked to comment on the value of the test as a diagnostic tool. In one particular case, a teacher had his students play from the Playing Test during lessons at the beginning of the semester. He comments on the beneficial aspects of using the excerpts within his studio:

    In terms of using the playing test as a studio project, it was helpful in several ways. First, it was great to have a community project that I could get everyone involved in working on. Secondly, it was useful to have excerpts that were shorter than any etude I might assign (I do sometimes assign etude excerpts, however) but focused on a small sub-set of technical problems. For some students, certain excerpts were a lot harder than others (though they all struggled on the double-stop section of the Dvořák concerto!), which meant it was also a process of self-discovery. Finally, in some cases I later assigned repertoire included in the excerpts, and students were able to build upon the work they'd already done, learning some of the trickier parts. (Personal communication, May 2, 2007)

The reference to self-discovery corroborates evidence gathered through the Student Self-Assessment Profile (SSAP) that the Playing Test can result in greater student self-awareness of their playing. The number of comments found in the SSAP referring back to techniques encountered in the Playing Test suggests that the test can indeed make students more self-aware of their strengths and weaknesses. That the test could influence the choice of repertoire assigned to students was also demonstrated. The positive value the test had in uniting the studio in a community project was unexpected. If students worked on this common repertoire and played it for each other in cello class, the test could function as a means to connect members of a studio and to learn from each other.

The completeness of the Playing Test's content and its capacity to quickly assess a student's skill level was noted by another teacher:

    I found the test to be a very thorough and comprehensive survey of all of the basic issues in cello technique, using excerpts drawn mostly from the standard repertoire, so that at least some of them should already be familiar to any cello student. By asking an intermediate- to advanced-level student to play through these excerpts (or even just one or two excerpts under each technical element), with or without prior preparation, the teacher should be able to quickly (in about thirty minutes or less) identify the student's strengths and weaknesses in any of the essential aspects of cello technique. (Personal communication, May 23, 2007)


Another participating teacher confirmed the diagnostic worth of the test and its usefulness in setting goals:

    I feel the diagnostic test designed by Tim Mutschlecner is a valuable tool for evaluating students and charting their course of study. Students come to a teacher's studio with such a wide diversity of skill and backgrounds that any aid in assessing their abilities is welcome. Thank you for your original and worthwhile test. (Personal communication, May 10, 2007)

This teacher addresses what the test results have shown: students enter college with a wide range of experience and preexisting abilities. One of the student participants, a freshman, scored higher on the Playing Test than five out of six seniors. This exemplifies why the test has questionable value as a norm-referenced measure. When ranking students, one teacher observed that comparing students was like comparing apples and oranges. The Playing Test provides a set of criteria that can supplement a teacher's performance standards and expectations.

Comparative Findings

The Farnum String Scale

When discussing the Farnum String Scale (FSS) in Chapter Two, it was observed that the test requires the student to play a series of musical examples that increase in difficulty. This approach was adopted in the Diagnostic Test of Cello Technique (DTCT). Unlike the FSS, the DTCT's musical examples were taken directly from actual music written for the cello. The rationale for this was that using real music increased the test's capacity for authentic assessment; students would be playing the actual passages where the techniques in question are found. The downside to this was the potential for distracters: aspects of the excerpts that could mask a student's real ability with a given technique.


In some cases, for example the double-stop excerpt from the Dvořák concerto, other challenges in playing the passage may have adversely affected a student's ability to demonstrate the technique. However, after administering the test and receiving positive feedback from students as well as teachers, it is felt that the benefits of using real music far outweigh the disadvantages. Students liked the fact that they were playing from standard works for cello, ones that they would quite possibly study someday if they had not already. This also illustrates a weakness of the DTCT if it is used normatively. Unlike the FSS passages, which would be unfamiliar to all test takers, students approach the DTCT with varying degrees of familiarity with the excerpts. It would be unfair and ill-advised to use this test as a means to compare students among themselves or to assign grades. Each student's performance of the test must be judged solely on the criteria defined in the evaluation form.

One university professor declined to have his students participate in this study because the bowings and fingerings were not always the ones that he taught. Although he was alone in this objection, it does demonstrate a dilemma that this kind of test design faces: if the test-maker provides ample fingerings and bowings, there will be students who have learned these passages differently and will be thrown off. If few or none are provided, it will create much more work for the average student to play these excerpts. The best compromise may be to seek bowings and fingerings that are most commonly used, even while instructing students that they are free to develop their own choices.

Zdzinski and Barnes

The design of this study owes much to the string performance rating scale of Zdzinski and Barnes (2002). The success they found in using a criteria-specific rating scale was validated in this research. High interjudge reliability correlations (Judge A r = 0.92, Judge B r = 0.95) indicate that drawing a judge's attention to specific aspects of the playing is an effective way to increase consistency in evaluating music performances.


Additive rating scales, as used by Saunders and Holahan (1997), eliminate the use of unspecific numerical ratings such as those commonly used in Likert scales. By requiring a judge to listen for specific evaluative criteria, rather than trusting in general impressions of a music performance, reliability is increased.

Conclusions

The following conclusions can be drawn from the results of this study:

1. Results from the Interval Identification section of the Written Test indicate that not all students recognize intervals confidently on the cello.
2. The excerpts used in the Playing Test are a valid and reliable way to measure an undergraduate cellist's technique.
3. Piano experience improves how well students perform on the Playing Test.
4. The Playing Test is a good predictor of teacher-rankings of their students in terms of technique.
5. The criteria-specific rating scale used in this study is a reliable instrument for measuring a student's technique.
6. A student's year in school, degree program, or years of cello study are not strong indicators of their playing ability.

Recommendations for future research in the area of string instrument teaching and assessment are:

1. A high school version of this test should be developed for use in diagnostic evaluation and teaching.
2. This test can be used as a model for violin, viola, and bass diagnostic tests of technique.
3. Future studies should explore the relationship of theoretical knowledge and performance ability on the cello.

PAGE 92

2. This test can be used as a model for violin, viola, and bass diagnostic tests of technique.

3. Future studies should explore the relationship between theoretical knowledge and performance ability on the cello.

As testing increasingly becomes a major focal point in discussions on improving education, questions regarding the value and purpose of assessment will increasingly be raised. Diagnostic evaluation, because of its capacity to inform teaching, is an important component of music education, including applied music. Tools like the Diagnostic Test of Cello Technique help clarify for both teachers and students what needs to be learned. Along with existing approaches to evaluation, music educators will continue to seek better objective means to assess musical behavior. Normative assessment has limited value in the arts; students come from such diverse backgrounds and experiences that their work must be judged by established criteria, not by comparison. The effectiveness of instrumental teaching depends on how clearly performance objectives are communicated to the student. Well-defined performance criteria result in clear, objective goals. In music, as in life, when the target is clear, it is easier to hit the mark.
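As a practical aside for readers who wish to examine agreement among evaluators when using criteria-specific forms of this kind, the sketch below shows one way interjudge reliability of the sort reported in this chapter (a Pearson correlation between two evaluators' scores for the same performances) can be computed. It is a minimal illustration under stated assumptions, not the analysis procedure used in this study: the scores shown are hypothetical, and any statistics package would serve equally well.

    from math import sqrt

    def pearson_r(x, y):
        """Pearson product-moment correlation between two lists of scores."""
        n = len(x)
        mean_x, mean_y = sum(x) / n, sum(y) / n
        cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
        sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
        sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
        return cov / (sd_x * sd_y)

    # Hypothetical total Playing Test scores assigned by the researcher and by an
    # independent judge to the same ten performances (0-170 point scale).
    researcher = [152, 136, 114, 96, 140, 124, 86, 148, 102, 134]
    judge_a    = [150, 132, 118, 98, 138, 120, 90, 144, 100, 130]

    print(f"Interjudge reliability (Pearson r): {pearson_r(researcher, judge_a):.2f}")

A correlation near 1.0 would indicate that the two evaluators ranked and spaced the performances in nearly the same way, which is the sense in which interjudge reliability is discussed above.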


APPENDIX A
PILOT STUDY

A pilot study was carried out (Mutschlecner, 2004) which provided indications of ways to improve an initial form of the Diagnostic Test of Cello Technique. Five undergraduate music majors studying at a school of music located in the southeastern region of the United States volunteered to participate in the pilot study. Four of the five students were cello performance majors; one was a music education major. These students were met with individually at the school of music in a studio space reserved for this use.

The students were first given the Self-Assessment Profile to fill out. Following this, students were given the Written Examination, which took between ten and fifteen minutes to complete. The Written Examination used in the pilot study was shorter than the one developed for the present study. It included a fingerboard chart, horizontal and linear (on one string) interval identification, note identification in three clefs, and single-position fingering exercises.

In the pilot study students were not given the Playing Examination ahead of time but were required, essentially, to sight-read the excerpts. However, students were advised that this was not a sight-reading test per se, but rather a test to assess their ability to demonstrate specific technical skills and bowings. The students were encouraged to take some time to visually familiarize themselves with the excerpts, and were told they could repeat an excerpt if they felt they could play it better a second time, an option chosen only twice. The students took between thirty and forty-five minutes to complete the playing portion of the test. The pilot study's version of the Playing Examination was shorter than the present study's, measuring fewer categories of left-hand and bowing technique and using fewer excerpts for each technique.

Results of the Written Examination showed that the students had no difficulty with the questions asked. What errors there were amounted to careless mistakes. This suggests that the Written Examination did not discriminate well for cello students at this level. These results led the researcher to increase the difficulty level for the present study.

The rating instrument used for the Playing Examination was a five-point Likert scale which included brief descriptions of what each performance level represented. Student performances on the Playing Examination ranged between 74.7% and 93.3% of a perfect score. The student with the weakest score was a music education major. Students in general did slightly better in the Basic Bowing Strokes section of the exam than in the Left Hand Technique section (91% compared to 86%). This was not surprising: the musical excerpts used to demonstrate left-hand technique were of necessity more difficult, and less easy to sight-read.

The lowest combined score was for the portato bowing, which was defined in the Playing Examination as a series of broad strokes played in one bow with a smooth, slightly separated sound between each note. The bow does not stop as in the slurred staccato. Each note is to be clearly enunciated with a slight pressure or nudge from the index finger and upper arm. Despite this extended definition, students were unable to demonstrate this bowing consistently. The evidence suggested that this stroke is not being taught or discussed to the same extent as other bowings.

The next three lowest combined scores after portato bowing were for position changes, string crossings, and broken thirds. Well-performed position changes and string crossings may be part of the identifying characteristics of an advanced player. The researcher suspects that broken thirds are not practiced much and not emphasized by teachers, thus explaining the lower

scores in this area. Results from the Playing Examination indicate the need to increase the difficulty level.

The results of the Student Self-Assessment Profile included the following responses:

How interested are you in each of these areas of performance?

I am interested in solo performance.
1 Strongly agree   1 Agree   3 Disagree   0 Strongly disagree

I am interested in chamber music performance.
4 Strongly agree   1 Agree   0 Disagree   0 Strongly disagree

I am interested in orchestral performance.
3 Strongly agree   2 Agree   0 Disagree   0 Strongly disagree

What was most unexpected was the number of students who chose "disagree" for the statement "I am interested in solo performance." One would have expected performance majors to at least agree with this statement, if not strongly agree. They may have been influenced by the choice of the word "performance," and were thinking about whether they enjoyed the experience of solo playing which, by connotation, meant auditions, juries, and degree recitals. These students may have been reading "solo" as meaning "solo career," and responded accordingly.

In the Student Self-Assessment Profile students responded to the question, "What are your personal goals for studying the cello?" in a variety of ways, such as:

(a) Would like to have a chamber group and coach chamber groups.
(b) To play anything that is set before me; I don't want to have limits in terms of technique. To be able to convey to the audience what I feel when I play.
(c) Perfect intonation before I graduate, attempt to win the concerto competition.
(d) To get an orchestra gig, have a quartet/quintet, and teach students on the side.
(e) I want to be able to use the cello in all sorts of ways including orchestral, chamber, rock & roll, and studio recording.

These answers are very specific and focused. A teacher, informed about these goals, could modify teaching to address some of them. For example, students who have expressed an interest in teaching would find discussions on how one might teach a particular skill very

valuable. If a student expresses the desire to be able to play "anything set before me," they would be likely to respond enthusiastically to a rapid, intense survey of a wide variety of cello literature. For the student who specifically mentions perfecting intonation as a goal, there are studies and approaches that would be recommended.

The question, "What areas of technique do you feel you need the most work on?" elicited even more specific responses such as shifting, general knowledge of higher positions, fluid bow arm, relaxing while playing, exploring musical phrasing, etc. These responses help give the teacher a window into the student's self-awareness. They could become excellent starting points for examining technique and would go far in helping technical study be goal-directed rather than a mechanical process.

The final section of the Student Self-Assessment Profile had the students summarize their goals for six-month, one-, two-, four-, and ten-year periods. Responses showed students had clear ideas about what they wanted to do after school, such as orchestral auditions or graduate school. One revision made for the present study was to ask students what they needed to do to accomplish their goals. A personal commitment in the plan of study is essential for ensuring the student's motivation to accomplish the goals formulated by both the teacher and the student. For example, if a student seriously wants to compete for an orchestral job, preparation must begin long before the position opening is announced, through study of orchestral excerpts, a concerto, and the Suites for Unaccompanied Cello by J.S. Bach. It is incumbent upon the teacher to discuss these kinds of issues with students who express ambitions to play professionally in an orchestra.

APPENDIX B
VALIDITY STUDY

A validity study was conducted following the pilot study to determine to what extent teachers felt this test measured a student's technique (Mutschlecner, 2005). Cello teachers (N = 9) on the college and college-preparatory level agreed to participate in this validity study by reading all sections of the diagnostic test and then responding to questions in an evaluation form (Appendix C).

In answer to the question, "To what extent does this test measure a student's technique?" responses ranged from "Very extensively" and "Rather completely" to "The written part tests knowledge, not technique." Fifty-six percent of the teachers felt the test measured a student's technique in a significant way. Sixty-seven percent of the respondents suggested that sight-reading difficulties might mask or obscure an accurate demonstration of a student's technical ability. As one teacher said, playing the excerpts "shows if they have worked on this repertoire. If they are reading it, it shows their reading ability." Two teachers came up with the same solution: provide the playing test to students early enough for them to develop familiarity with the passages they are asked to play. This would not eliminate the inherent advantage of students who had already studied the piece from which an excerpt was derived, but it could mitigate some effects, such as anxiety or poor sight-reading skill, which adversely affect performance. These suggestions were implemented in the present study.

Criticism of the Written Examination included the concern that "some fine high school students ready for college might not know intervals yet." In response to this, a new section of the Written Examination was developed (Pitch Location and Fingering) that measures a student's capacity to locate pitches on a fingerboard representation without the use of intervallic

terminology. The Interval Identification and Single Position Fingering sections of the pilot test were extended to provide greater accuracy in measurement of these skills.

Forty-four percent of respondents agreed that the excerpts chosen for the Playing Examination were a valid way of determining a student's competence in left-hand and bowing technique. Several teachers suggested the addition of specific excerpts to reveal other aspects of a student's technique, such as pizzicato and passages with a greater variety of double stops (playing on two strings simultaneously). These suggestions were implemented in the present study. Part two of the Playing Examination (Basic Bowing Strokes) was expanded to include Accented Détaché, Flying Spiccato, and Pizzicato.

Reaction to the choice of excerpts used in the Playing Examination included the suggestion that a better assessment of a student's abilities would be to arrange the material in progressive order from easiest to hardest and then see at what point the student began to have difficulty. Ordering and expanding the range of difficulty of the excerpts would provide useful information about the student's playing level so that repertoire of an appropriate difficulty level could be assigned. The present study applied these recommendations by finding additional excerpts and making them sequentially more demanding. An effort was made to find excerpts in each category that could be played by undergraduate cellists.

Seventy-eight percent of the teachers responded positively to the Student Self-Assessment Profile. Comments included, "I really like the Student Self-Assessment page. I think that it is not just valuable to the teacher but important that the students examine their own situations as well." One teacher remarked, "It seems the profile would be a useful tool to gauge the goals and general level of a new student." A teacher proposed having some more open-ended questions as well, noting that "there is music beyond solo, chamber and orchestral." As a

result, a line asking for other areas of performance interest was added. The study indicated that teachers are either using a similar tool in their studios or would consider doing so.

The responses from teachers who participated in the validity study support the premise that the diagnostic test of cello technique is a legitimate way to gather information about a student's technical playing ability. The recommendations of these teachers were taken into account in developing the present test.

APPENDIX C
VALIDATION STUDY EVALUATION FORM

The Diagnostic Test of Cello Technique: Validation Study Evaluation Form

Instructions: Please read all parts of the test before responding to these questions.

1. To what extent does this test measure a student's technique?
______________________________________________________________________________
______________________________________________________________________________

2. What changes to the test construction do you feel would make the test more valid?
______________________________________________________________________________
______________________________________________________________________________

3. What changes in content do you feel would make the test more valid?
______________________________________________________________________________
______________________________________________________________________________

4. To what extent does the content of the Written Examination (i.e., Fingerboard Geography, Horizontal Intervals, Linear Intervals, Clef Identification, and Single Position Fingering) demonstrate a basic essential knowledge of music theory as applied to the cello?
______________________________________________________________________________
______________________________________________________________________________

5. Would you consider using the Written Examination as a means of assessing a new student's knowledge of music theory as applied to the cello? Why or why not?
______________________________________________________________________________
______________________________________________________________________________

6. Are the excerpts chosen for the Playing Examination a valid way of determining a student's competence in

a) Left hand technique?
______________________________________________________________________________
______________________________________________________________________________

b) Bowing technique?
______________________________________________________________________________
______________________________________________________________________________

7. If you feel a particular excerpt is not a good predictor of a student's ability, what alternative passage do you recommend using?
______________________________________________________________________________
______________________________________________________________________________

8. Would you consider using the Playing Examination as a means of assessing a new student's technique? Why or why not?
______________________________________________________________________________
______________________________________________________________________________

9. How would you use information gathered from the Student Self-Assessment and Goal Setting Profile in working with your students?
______________________________________________________________________________
______________________________________________________________________________

10. To what extent would you be willing to participate in future field testing of this test through administering it to a portion of the students in your studio?
______________________________________________________________________________
______________________________________________________________________________

Please include any additional comments here:
______________________________________________________________________________
______________________________________________________________________________

APPENDIX D
INFORMED CONSENT


APPENDIX F
THE WRITTEN TEST EVALUATION FORM

Student's Name ____________________________   Adjudicator's Code _____________
Grade Level _______________________________
Degree Program ______________________________
Audition Day _________   Audition Time _________

Test Section                                        Total Points    Student's Score
Fingerboard Geography (divide total by 4)           11 points       ________
Interval Identification                              8 points       ________
Pitch Location and Fingering                        32 points       ________
Single Position Fingering                           32 points       ________
Bass, Treble, and Tenor Clef Note Identification    12 points       ________

Total Possible Score: 95
Total Student's Score and %: ____________

APPENDIX G
THE PLAYING TEST


APPENDIX H
THE PLAYING TEST EVALUATION FORM

Student's Name ___________________________________   Adjudicator's Code ______
Grade Level _________________________________
Degree Program ______________________________
Audition Day _________   Audition Time _________

Part One: Left Hand Technique

Scales
The student's playing of scales exhibits: (Check All that Apply, worth 2 points each)
   95% accurate whole and half steps.
   evenly divided bow distribution.
   steady tempo.
   effortless position changes.
   smooth string crossings.
Observations/Comments: _____________________________________________________________

Arpeggios
The student's playing of arpeggios demonstrates: (Check All that Apply, worth 2 points each)
   mostly accurate intonation.
   smooth connections of positions.
   little audible sliding between notes.
   clean string crossings.
   a steady and consistent tempo.
Observations/Comments: _____________________________________________________________

Broken Thirds
The student's playing of broken thirds: (check One only)
   10  demonstrates the highest level of competency.
    8  shows a high degree of experience, with only minor performance flaws.
    6  indicates a moderate degree of competence or experience.
    4  is tentative and faltering, with some pitch and/or intonation errors.
    2  is undeveloped and results in many inaccurate pitches and out-of-tune notes.
Observations/Comments: _____________________________________________________________

Double Stops
The student's playing of double stops features: (Check All that Apply, worth 2 points each)
   consistently good intonation with all intervals.
   a clear, unscratchy tone.
   the clean setting and releasing of fingers when playing double stops.
   even bow-weight distribution on two strings.
   the ability to vibrate on two strings simultaneously.
Observations/Comments: _____________________________________________________________

Position Changes
The student's technique of changing positions: (check One only)
   10  demonstrates well-prepared, smooth shifting between notes, without interrupting the melodic line or creating a break between notes.
    8  shows smooth shifting and an uninterrupted melodic line, but includes excessive audible slides.
    6  indicates experience with position changes, but includes some sudden, jerky motions when shifting and several audible slides.
    4  indicates some experience with shifting, but position changes are often jerky, unprepared, or filled with audible slides.
    2  exhibits unprepared and inaccurate shifting. Sliding between notes is often heard and hand/arm motions are jerky.
Observations/Comments: _____________________________________________________________

Arpeggiated Chords
The student's playing of arpeggiated chords exhibits: (Check All that Apply, worth 2 points each)
   coordinated action between the left hand and bow arm.
   even string crossings, with steady rhythm.
   an ease in preparing chordal fingering patterns.
   clear tone on all strings.
   graceful, fluid motion.
Observations/Comments: _____________________________________________________________

Thumb Position
The student's playing in thumb position reveals that: (Check All that Apply, worth 2 points each)
   the thumb rests on two strings and remains perpendicular to the strings.
   the fingers stay curved and don't collapse while playing.
   correct finger spacing is consistently used.
   there is an ease of changing from string to string.
   the arm and wrist support the thumb and fingers rather than resting on the side of the cello.
Observations/Comments: _____________________________________________________________

Vibrato
The student's vibrato: (check One only)
   10  is full, rich, even, and continuous. It is used consistently throughout the fingerboard.
    8  is full and rich, but occasionally interrupted due to fingering/position changes.
    6  is mostly utilized, but is irregular in its width or speed and lacks continuity throughout the fingerboard. Excessive tension is apparent in the vibrato.
    4  is demonstrated, but in a tense, irregular way. It is not used consistently by all fingers in all positions. Vibrato width/speed may be inappropriate.
    2  is demonstrated marginally, with a tense, uneven application. Vibrato is inconsistently used and lacks appropriate width/speed.
Observations/Comments: _____________________________________________________________

Intonation
The student's intonation: (check One only)
   10  is accurate throughout, on all strings and in all positions.
    8  is accurate, demonstrating minimal intonation difficulties, with occasional lack of pitch correction.
    6  is mostly accurate, but includes out-of-tune notes resulting from half-step inaccuracies, inaccurate shifting, or incorrect spacing of fingers.
    4  exhibits a basic sense of intonation, yet has frequent errors of pitch accuracy and often does not find the pitch center.
    2  is not accurate. The student plays out of tune the majority of the time.
Observations/Comments: _____________________________________________________________

Part Two: Basic Bowing Strokes

Slurred Legato
The student's legato bow stroke: (check One only)
   10  is smoothly connected with no perceptible interruption between notes.
    8  is smooth, but has some breaks within phrases.
    6  includes some disconnected notes and detached bowing.
    4  shows breaks within phrases and is often not smoothly connected.
    2  exhibits little skill in smooth bowing. Bowing has many interruptions between notes.
Observations/Comments: _____________________________________________________________

Détaché/Accentuated Détaché
The student's détaché bow stroke is: (check One only)
   10  vigorous and active, played on the string. Accentuated détaché features more strongly accented attacks.
    8  vigorous and active, but occasionally lacking articulation or bow control.
    6  moderately active, but lacking articulation or suffering from too much accentuation.
    4  not making sufficient contact with the string, or else producing a scratchy sound.
    2  undeveloped, and lacking the control to produce a consistent, vigorous sound.
Observations/Comments: _____________________________________________________________

Martelé
The student's playing of martelé bowing features: (Check All that Apply, worth 2 points each)
   a fast, sharply accentuated bow stroke.
   a heavy separate stroke resembling a sforzando.
   bow pressure being applied before the bow is set in motion.
   the bow being stopped after each note.
   great initial speed and pressure with a quick reduction of both.
Observations/Comments: _____________________________________________________________

Portato
The student's use of portato bowing demonstrates: (Check All that Apply, worth 2 points each)
   a slightly separated legato bow stroke.
   the pressure of the index finger being applied to pulse each note within a slur.
   an enunciation of each note through a slight change of bow pressure/speed.
   the bow does not stop between notes.
   notes being articulated without lifting the bow from the string.
Observations/Comments: _____________________________________________________________

Staccato/Slurred Staccato
The student's playing of staccato: (check One only)
   10  is crisp and well-articulated, with the bow stopping after each note.
    8  demonstrates a high level of mastery, with minor flaws in execution.
    6  shows a moderate level of attainment.
    4  reveals only a limited amount of bow control.
    2  does not demonstrate the ability to execute these strokes.
Observations/Comments: _____________________________________________________________

Spiccato/Flying Spiccato
The student's playing of spiccato indicates: (Check All that Apply, worth 2 points each)
   a bounced bow stroke with good control of the bow's rebound off the string.
   good tone production through control of bow pressure and speed.
   the bow springs lightly from the string.
   notes are individually activated.
   even use of bow distribution (Flying Spiccato excerpts).
Observations/Comments: _____________________________________________________________

Sautillé
The student's use of sautillé bowing demonstrates: (Check All that Apply, worth 2 points each)
   a rapid, natural rebounding of the bow.
   a primary movement initiated from the wrist and hand, using a light bow hold.
   the bow's contact with the string is centered around the balance point of the bow.
   the tempo is fast enough for the bow to continue to bounce of its own momentum.
   the resilience of the bow stick is used to allow the bow to spring off the string.
Observations/Comments: _____________________________________________________________

Pizzicato
The student's playing of pizzicato illustrates: (Check All that Apply, worth 2 points each)
   confidently played arpeggiated chords, using the thumb.
   strong, vibrant tone (as demonstrated in the Brahms excerpt).
   a clear, ringing sound in the upper register (as in the Kabalevsky excerpt).
   an absence of snapping sounds caused by pulling the string at too steep an angle.
   an absence of buzzing or dull, thudding tones due to inadequate setting of the left-hand fingers.
Observations/Comments: _____________________________________________________________
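The form above combines two scoring mechanics: additive "check all that apply" items worth 2 points per criterion (up to 10 points per technique) and single-choice items rated directly on the 2-4-6-8-10 scale. For readers who want to see how a composite Playing Test score follows from a completed form, the sketch below tallies both item types. It is only an illustration of the arithmetic implied by the form, not part of the study's materials; the item selections and ratings shown are hypothetical.

    # A minimal sketch of the scoring implied by the Playing Test Evaluation Form.
    # The checked-criterion counts and ratings below are hypothetical examples.

    ADDITIVE_ITEMS = {      # "check all that apply": 2 points per checked criterion
        "Scales": 4,        # number of criteria checked (0-5)
        "Arpeggios": 5,
        "Double Stops": 3,
    }

    SINGLE_CHOICE_ITEMS = { # "check one only": rated directly as 2, 4, 6, 8, or 10
        "Broken Thirds": 8,
        "Position Changes": 6,
        "Vibrato": 10,
    }

    def playing_test_total(additive, single_choice):
        """Sum 2 points per checked criterion plus each single-choice rating."""
        additive_points = sum(2 * checked for checked in additive.values())
        single_points = sum(single_choice.values())
        return additive_points + single_points

    total = playing_test_total(ADDITIVE_ITEMS, SINGLE_CHOICE_ITEMS)
    maximum = 10 * (len(ADDITIVE_ITEMS) + len(SINGLE_CHOICE_ITEMS))
    print(f"Score: {total}/{maximum} ({100 * total / maximum:.1f}%)")

Because every technique, whether additive or single-choice, has a ceiling of 10 points, item scores remain directly comparable across the two formats.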


APPENDIX I
REPERTOIRE USED IN THE PLAYING TEST

Composer                      Piece                                            Technique/Bow Stroke

Bach, J.S.                    Arioso (from Cantata 156)                        Intonation
                              Sonata in G Minor, No. 3, 3rd mvt.               Staccato
                              Suite No. 1 in G Major, Allemande                Slurred Legato
                              Suite No. 3 in C Major, Allemande                Double Stops
                              Suite No. 5 in C Minor, Sarabande                Intonation

Boccherini, L./Grützmacher    Concerto in Bb Major, 1st mvt.                   Scales
                              Concerto in Bb Major, 1st mvt.                   Arpeggiated Chords
                              Concerto in Bb Major, 1st mvt.                   Thumb Position
                              Concerto in Bb Major, 3rd mvt.                   Spiccato

Beethoven, L. van             Sonata in G Minor, Op. 5, No. 2, 3rd mvt.        Spiccato
                              Sonata Op. 69 in A Major, 1st mvt.               Scales
                              Sonata Op. 69 in A Major, 3rd mvt.               Thumb Position
                              Sonata in C Major, Op. 102, No. 1, 3rd mvt.      Accentuated Détaché

Brahms, J.                    Sonata No. 1 in E Minor, Op. 38, 1st mvt.        Position Changes
                              Sonata No. 1 in E Minor, Op. 38, 1st mvt.        Portato
                              Sonata No. 1 in E Minor, Op. 38, 2nd mvt.        Slurred Staccato
                              Sonata No. 2 in F Major, Op. 99, 2nd mvt.        Pizzicato

Bréval, J.B.                  Concerto No. 2 in D Major, Rondo                 Thumb Position

Debussy, C.                   Sonata in D Minor, Prologue                      Portato

Dotzauer                      Etude Op. 20, No. 13                             Arpeggios

Dvořák, A.                    Concerto in B Minor, Op. 104, 1st mvt.           Vibrato
                              Concerto in B Minor, Op. 104, 2nd mvt.           Double Stops

Eccles, H.                    Sonata in G Minor, 1st mvt.                      Vibrato
                              Sonata in G Minor, 2nd mvt.                      Staccato

Elgar, E.                     Concerto in E Minor, Op. 85, 2nd mvt.            Pizzicato
                              Concerto in E Minor, Op. 85, 2nd mvt.            Sautillé
                              Concerto in E Minor, Op. 85, 4th mvt.            Arpeggiated Chords

Fauré, G.                     Élégie, Op. 24                                   Scales
                              Élégie, Op. 24                                   Vibrato
                              Élégie, Op. 24                                   Intonation

Franck, C.                    Sonata in A Major, 1st mvt.                      Slurred Legato

Frescobaldi, G.               Toccata                                          Martelé

Goens, D. van                 Scherzo, Op. 12                                  Sautillé
                              Scherzo, Op. 12                                  Thumb Position

Goltermann, G.                Concerto in G Major, Op. 65, No. 4, 3rd mvt.     Position Changes
                              Concerto in G Major, Op. 65, No. 4, 3rd mvt.     Arpeggiated Chords

Haydn, J.                     Concerto in C Major, Hob. VIIb:1, 3rd mvt.       Double Stops
                              Concerto in D Major, Op. 101, 1st mvt.           Broken Thirds

Jensen, H.J.                  The Ivan Galamian Scale System for Violoncello   Broken Thirds

Kabalevsky, D.B.              Concerto in G Minor, Op. 49, 1st mvt.            Pizzicato

Lalo, E.                      Concerto in D Minor, 2nd mvt.                    Slurred Legato

Locatelli, P.                 Sonata in D Major, 1st mvt.                      Flying Spiccato

Marcello, B.                  Sonata in E Minor, Op. 1, No. 2, 2nd mvt.        Détaché
                              Sonata in E Minor, Op. 1, No. 2, 4th mvt.        Slurred Staccato

Popper, D.                    Gavotte in D Major                               Flying Spiccato
                              Hungarian Rhapsody, Op. 68                       Sautillé

Rimsky-Korsakov, N.           Scheherazade, Op. 35, 1st mvt.                   Arpeggiated Chords

Saint-Saëns, C.               Allegro Appassionato, Op. 43                     Flying Spiccato
                              The Swan                                         Position Changes

Sammartini, G.B.              Sonata in G Major, 1st mvt.                      Arpeggios
                              Sonata in G Major, 1st mvt.                      Double Stops

Schröder, C.                  Etude, Op. 44, No. 5                             Sautillé

Shostakovich, D.              Sonata in D Minor, Op. 40, 1st mvt.              Intonation

Squire, W.H.                  Danse Rustique, Op. 20, No. 5                    Scales

Starker, J.                   An Organized Method of String Playing (p. 33)    Position Changes

Schumann, R.                  Fantasy Pieces, Op. 73, 1st mvt.                 Arpeggios

Tchaikovsky, P.I.             Chanson Triste, Op. 40, No. 2                    Vibrato

Vivaldi, A.                   Concerto in G Minor for 2 Cellos, RV 531, 1st mvt.   Scales
                              Sonata in E Minor, No. 5, 2nd mvt.               Martelé

APPENDIX J
THE STUDENT SELF-ASSESSMENT PROFILE

Name_______________________________   Status (year/college)________________
Major__________   Minor___________
Years of study on the cello_____   Other instrument(s) played ___________________

Repertoire previously studied:

Methods/Etudes_________________________________________________________________
________________________________________________________________________

Solo Literature__________________________________________________________
________________________________________________________________________

Orchestral Experience:
________________________________________________________________________

How interested are you in each of these areas of performance?

I am interested in solo performance.
Strongly agree   Agree   Disagree   Strongly disagree

I am interested in chamber music performance.
Strongly agree   Agree   Disagree   Strongly disagree

I am interested in orchestral performance.
Strongly agree   Agree   Disagree   Strongly disagree

Other areas of performance interest? _______________________________________
________________________________________________________________________

What are your personal goals for studying the cello?___________________________
________________________________________________________________________

What areas of cello technique do you feel you need the most work on?
________________________________________________________________________
________________________________________________________________________

Summarize your goals in music and what you need to do to accomplish these goals.

6 months:_______________________________________________________________
________________________________________________________________________

1 year:__________________________________________________________________
________________________________________________________________________

2 years:_________________________________________________________________
________________________________________________________________________

4 years:_________________________________________________________________
________________________________________________________________________

10 years:________________________________________________________________
________________________________________________________________________

APPENDIX K
DESCRIPTIVE STATISTICS FOR RAW DATA

Table K-1. Raw Scores of the Written Test Items, and Composite Means and Standard Deviations

Student   Fingerboard   Interval   Pitch      Single-Pos.   Note    Total
          Geography     Id.        Location   Fingering     Id.     Score
______________________________________________________________________________
 1        11            7          30         32            12       92
 2        11            7          30         28            11       87
 3        11            7          32         31            12       93
 4        11            7          31         30            12       91
 5        11            8          32         32            12       95
 6        11            7           0         32            12       62
 7         5            3          16         25            10       59
 8        11            6          32         32            12       93
 9        11            6          29         32            12       90
10        11            8          29         32            12       92
11        11            5          31         32            12       91
12        11            4          12         30            11       68
13        11            8          22         12            12       65
14        11            5          31         32            11       90
15        11            7          29         27            12       86
16        11            3           2         32            11       59
17        11            7          32         30             8       88
18        11            8          32         31            12       94
19        11            8          32         32            12       95
20        11            8          31         30            12       92
21        11            3          30         30            11       85
22        11            6          14         32            10       73
23        11            5          28         32            12       86
24        11            6          32         26            12       87
25        11            5          27         31            12       86
26        11            8          32         30            12       93
27        11            8          31         32            12       94
28        11            8          32         32            12       95
29        11            6          29         23            12       81
30        11            8          31         32            12       94
M         10.80         6.40       26.70      29.80         11.57    85.20
SD         1.10          1.63       8.82       4.11          0.90    11.38

Table K-2. Raw Score, Percent Score, Frequency Distribution, Z Score, and Percentile Rank of Written Test Scores

Raw     Percent   Frequency   Z        Percentile
Score   Score                 Score    Rank
______________________________________________________________________________
59       62.00    2           -2.30     1.67
62       66.00    1           -2.04     8.33
65       68.00    1           -1.78    11.67
68       72.00    1           -1.51    15.00
73       77.00    1           -1.07    18.33
81       86.00    1           -0.37    21.67
85       89.00    1           -0.02    25.00
86       91.00    3             .07    28.33
87       92.00    2             .16    38.33
88       92.00    1             .25    45.00
90       95.00    2             .42    48.33
91       96.00    2             .51    55.00
92       97.00    3             .60    61.67
93       98.00    3             .69    71.67
94       99.00    3             .77    81.67
95      100.00    3             .86    91.67
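The z scores and percentile ranks in Table K-2 follow directly from the composite mean and standard deviation reported in Table K-1. The short sketch below reproduces that arithmetic as a reader's aid; it is not part of the original analysis. The mean and standard deviation are taken from Table K-1, and the percentile-rank convention (the percentage of examinees scoring strictly lower, plus half of one case) is inferred from the values reported in Table K-2.

    from collections import Counter

    # Written Test totals from Table K-1 (one entry per student).
    raw_totals = [92, 87, 93, 91, 95, 62, 59, 93, 90, 92, 91, 68, 65, 90, 86,
                  59, 88, 94, 95, 92, 85, 73, 86, 87, 86, 93, 94, 95, 81, 94]

    MEAN, SD = 85.20, 11.38        # composite values reported in Table K-1
    n = len(raw_totals)
    freq = Counter(raw_totals)

    for score in sorted(freq):
        z = (score - MEAN) / SD
        below = sum(f for s, f in freq.items() if s < score)
        percentile_rank = 100 * (below + 0.5) / n
        print(f"raw {score:3d}  f={freq[score]}  z={z:+.2f}  PR={percentile_rank:.2f}")

Running the loop reproduces, for example, z = -2.30 and a percentile rank of 1.67 for the lowest raw score of 59, matching the first row of Table K-2.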


Table K-3. Raw Scores of the Playing Test Items, Composite Means, and Standard Deviations

Student   Scales   Arpeggios   Broken   Double   Position   Arpeggiated
                               Thirds   Stops    Changes    Chords
______________________________________________________________________________
 1        10       10           8       10       10         10
 2        10       10           8        8        6          6
 3        10       10           8        8       10         10
 4        10        8          10        8        8         10
 5         8        8           6        6        8         10
 6         8       10          10        8        8          8
 7        10       10           8        8       10          8
 8         8       10           8        6        8          8
 9         8       10           8        4        8          6
10         8       10           8        8       10          8
11         6        8           8        8       10          4
12         8       10           8        6       10         10
13         8        8           6        8        6          8
14         6        6           6        4        6          6
15         6        6           6        4        6          8
16         6        4           4        4        6          6
17         6        6           6        8        6          2
18         8        8           4        6        4          4
19         8        8           8        8        8         10
20         6       10           6        4        8          8
21         4        6           6        6       10          8
22         6        6           4        4        6          4
23         0        2           2        2        6          2
24         6        6           4        4        4          4
25         8       10           6        4        6          2
26         8        8           6        8        8          8
27         8        8           8        8       10         10
28        10       10           8        6        8         10
29        10        8           8        8        8         10
30        10       10           8        8        8          8
M          7.6      8.13        6.8      6.4      7.67       7.2
SD         2.19     2.10        1.86     1.99     1.83       2.66

(Table K-3 continued below)

Table K-3. (continued)

Student   Thumb      Vibrato   Intonation   Slurred   Détaché   Martelé
          Position                          Legato
______________________________________________________________________________
 1         8         10         8            8         8         8
 2         6          8         8            8         8        10
 3        10         10         6           10        10        10
 4        10          8         8           10         8         4
 5        10         10         8           10         8        10
 6         8          8         6           10         8         8
 7         6         10         6           10        10        10
 8         4          8         6           10         8         6
 9         8          4         6            4        10         6
10         6         10         8           10        10        10
11         6          8         6            8        10         8
12         6         10         8            8         8         8
13         8         10         6            8        10         2
14         6          4         4            8         8         4
15         6          8         4            6         6         8
16         4          6         2            8         8         2
17         6          4         4            8         8         2
18         4          6         8            8         8         4
19         6          8         8            8        10         8
20        10          8         6            9        10        10
21         2          8         6            6         2         2
22         8          6         6            4         8         6
23         6          8         4            8         8         4
24         8         10         4            8        10         4
25         6          4         4            4         8         4
26         8          8         6           10         6         6
27         8         10         8            8        10        10
28        10          8         8           10        10         8
29        10          8         6           10        10         8
30         6         10         8           10         8        10
M          7.0        7.93      6.2          8.23      8.47      6.67
SD         2.08       2.00      1.69         1.85      1.72      2.84

(Table K-3 continued below)

Table K-3. (concluded)

Student   Portato   Staccato   Spiccato   Sautillé   Pizzicato   Total Score
______________________________________________________________________________
 1        10         8          8         10          8          152
 2         8        10          8          4         10          136
 3         8         8         10         10          8          156
 4         8         8          6         10         10          144
 5         8         8          8          4          4          134
 6        10        10          8          8         10          146
 7        10        10         10          8          8          152
 8        10        10          8          6          4          128
 9         4         6          8         10          6          116
10        10        10          8         10          8          152
11         6         4          4          4          6          114
12        10         8          8          8          6          140
13        10         6          2          8          6          120
14        10         8          2          4          4           96
15        10         8          8         10          6          116
16         4         6          2          2          2           76
17        10         8          6          8          4          102
18         4        10          6          6          2          100
19         8        10          8         10         10          142
20        10        10          3          8          8          134
21         6         6          2          4          2           86
22         0         6          2          2          4           82
23         8         8          4          0          4           76
24         4         8          8          8          4          104
25         0         6          6         10          4           92
26        10         6          4         10          4          124
27        10         8         10         10          8          152
28         8         8          8          4          6          140
29        10         8          6         10         10          148
30         6         8          8          8          6          140
M          7.67      7.93       6.3        7.13       6.07       123.33
SD         2.97      1.62       2.61       3.00       2.55        25.18

LIST OF REFERENCES

Abeles, H.F. (1973). Development and validation of a clarinet performance adjudication scale. Journal of Research in Music Education, 21, 246-255.

Aristotle. Politics (1340b24) (B. Jowett, Trans., 1943). New York: Random House.

Aristotle. Nicomachean Ethics, Bk. 2 (1103a26-1103b2), as paraphrased by Durant, W. (1967). The story of philosophy. New York: Simon and Schuster.

Asmus, E.P., & Radocy, R.E. (2006). Quantitative analysis. In R. Colwell (Ed.), MENC handbook of research methodologies (pp. 95-175). New York: Oxford University Press.

Bergee, M.J. (1987). An application of the facet-factorial approach to scale construction in the development of a rating scale for euphonium and tuba music performance. Doctoral dissertation, University of Kansas.

Berman, J., Jackson, B., & Sarch, K. (1999). Dictionary of bowing and pizzicato terms. Bloomington, IN: Tichenor Publishing.

Blum, D. (1997). Casals and the art of interpretation. Berkeley and Los Angeles, CA: University of California Press.

Boyle, J. (1970). The effect of prescribed rhythmical movements on the ability to read music at sight. Journal of Research in Music Education, 18, 307-308.

Boyle, J. (1992). Evaluation of music ability. In R. Colwell (Ed.), Handbook of research on music teaching and learning (pp. 247-265). New York: Schirmer Books.

Boyle, J., & Radocy, R.E. (1987). Measurement and evaluation of musical experiences. New York: Schirmer Books.

Brophy, T.S. (2000). Assessing the developing child musician: A guide for general music teachers. Chicago: GIA Publications.

Colwell, R. (2006). Assessment's potential in music education. In R. Colwell (Ed.), MENC handbook of research methodologies (pp. 199-269). New York: Oxford University Press.

Colwell, R., & Goolsby, T. (1992). The teaching of instrumental music. Englewood Cliffs, NJ: Prentice Hall.

Eisenberg, M. (1966). Cello playing of today. London: Lavender Publications.

Ekstrom, R., French, J., Harman, H., & Dermen, D. (1976). Kit of factor-referenced cognitive tests. Princeton, NJ: Educational Testing Service.

Elliott, D.J. (1995). Music matters: A new philosophy of music education. New York: Oxford University Press.

Epperson, G. (2004). The art of string teaching. Fairfax, VA: American String Teachers Association with National School Orchestra Association.

Farnum, S.E. (1969). The Farnum string scale. Winona, MN: Hal Leonard.

Gardner, H. (1983). Frames of mind: The theory of multiple intelligences. New York: Basic Books.

Gillespie, R. (1997). Rating of violin and viola vibrato performance in audio-only and audiovisual presentations. Journal of Research in Music Education, 45, 212-220.

Gromko, J.E. (2004). Predictors of music sight-reading ability in high school wind players. Journal of Research in Music Education, 52, 6-15.

Hoover, H., Dunbar, S., Frisbie, D., Oberley, K., Bray, G., Naylor, R., Lewis, J., Ordman, V., & Qualls, A. (2003). The Iowa tests. Itasca, IL: Riverside.

Jensen, H.J. (1994). The Ivan Galamian scale system for violoncello. Boston, MA: ECS Publishing.

Jensen, H.J. (1985). The four great families of bowings. Unpublished manuscript, Northwestern University.

Katz, M. (1973). Selecting an achievement test: Principles and procedures. Princeton, NJ: Educational Testing Service.

Kidd, R.L. (1975). The construction and validation of a scale of trombone performance skills. Doctoral dissertation, University of Illinois at Urbana-Champaign.

Lehman, P.B. (2000). The power of the national standards for music education. In B. Reimer (Ed.), Performing with understanding: The challenge of the national standards for music education (pp. 3-9). Reston, VA: MENC.

Magg, F. (1978). Cello exercises: A comprehensive survey of essential cello technique. Hillsdale, NY: Mobart Music.

Mooney, R. (1997). Position pieces. Miami, FL: Summy-Birchard Music.

Mutschlecner, T. (2004). The Mutschlecner diagnostic test of cello technique: Pilot study. Unpublished manuscript, University of Florida.

Mutschlecner, T. (2005). Development and validation of a diagnostic test of cello technique. Unpublished manuscript, University of Florida.

Reimer, B. (1989). A philosophy of music education (2nd ed.). Englewood Cliffs, NJ: Prentice Hall.

Reimer, B. (2003). A philosophy of music education: Advancing the vision (3rd ed.). Upper Saddle River, NJ: Pearson Education.

Renwick, J.M., & McPherson, G.E. (2002). Interest and choice: Student-selected repertoire and its effect on practicing behavior. British Journal of Music Education, 19(2), 173-188.

Sand, B.L. (2000). Teaching genius: Dorothy DeLay and the making of a musician. Portland, OR: Amadeus Press.

Saunders, T.C., & Holahan, J.M. (1997). Criteria-specific rating scales in the evaluation of high-school instrumental performance. Journal of Research in Music Education, 45, 259-272.

Spiro, R.J., Vispoel, W.P., Schmitz, J.G., Samarapungavan, A., & Boerger, A.E. (1987). Knowledge acquisition for application: Cognitive flexibility and transfer in complex content domains. In B.K. Britton & S.M. Glynn (Eds.), Executive control processes in reading (pp. 177-199). Hillsdale, NJ: Lawrence Erlbaum Associates.

Starer, R. (1969). Rhythmic training. New York: MCA Music Publishing.

Starker, J. (1965). An organized method of string playing: Violoncello exercises for the left hand. New York: Peer Southern Concert Music.

Warren, G.E. (1980). Measurement and evaluation of musical behavior. In D. Hodges (Ed.), Handbook of music psychology (pp. 291-392). Lawrence, KS: National Association for Music Therapy.

Watkins, J., & Farnum, S. (1954). The Watkins-Farnum performance scale. Milwaukee, WI: Hal Leonard.

Zdzinski, S.F. (1991). Measurement of solo instrumental music performance: A review of literature. Bulletin of the Council for Research in Music Education, 109, 47-58.

Zdzinski, S.F., & Barnes, G.V. (2002). Development and validation of a string performance rating scale. Journal of Research in Music Education, 50, 245-255.

BIOGRAPHICAL SKETCH

Timothy Miles Mutschlecner was born on November 17, 1960, in Ann Arbor, Michigan. A middle child with an older brother and younger sister, he grew up mostly in Bloomington, Indiana, but finished high school in Los Alamos, New Mexico, graduating in 1979. He earned his Bachelor of Music from Indiana University in 1983, where he studied cello with Fritz Magg. In 1992 Tim graduated from the Cleveland Institute of Music with a Master's degree in Performance and Suzuki Pedagogy. He taught in the Preparatory Department at The Cleveland Institute of Music from 1992 to 1995 before accepting the position of director of the cello program at the Suzuki School of Music in Johnson City, Tennessee. In Tennessee, Tim taught a large cello studio and played in two regional orchestras. He also taught students through Milligan College and East Tennessee State University. Along with giving recitals and playing in the Meadowlark Trio, Tim was a featured soloist with the Johnson City Symphony.

In 2003, Tim began work on his Ph.D. in Music Education at the University of Florida in Gainesville. During the next four years he taught university cello students as a graduate assistant while completing his degree. Tim remained an active performer while studying at the University of Florida, serving as principal cellist in the Gainesville Chamber Orchestra from 2003 to 2007, and performing with the music school's New Music Ensemble. He maintained a private studio with students of all ages and levels.

Upon the completion of his Ph.D. program, Tim will begin teaching at the University of Wisconsin-Stevens Point in the Aber Suzuki Center. This position will provide opportunities to work with beginning and intermediate-level cello students, as well as to offer cello pedagogy for university-level students. He anticipates continuing to do research in the field of string music

education, particularly in the areas of string pedagogy and assessment. Tim has been married to Sarah Caton Mutschlecner, a nurse practitioner, for 18 years. They have three daughters: Audrey, age 16; Megan, age 14; and Eleanor, age 10.