PERFORMANCE-BASED BUDGETING AA MEASURES OUTCOMES IN FLORIDA COMMUNITY COLLEGES

By

KAREN BAKUZONIS

A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2007
© 2007 Karen Bakuzonis
This dissertation is dedicated to all those persons who pursue lifelong learning, who are not afraid to tackle the unknown and overcome their fears, and who view challenges as opportunities to grow as individuals.
ACKNOWLEDGMENTS

I wish to acknowledge the unending support of my dissertation committee: Dr. Linda Serra Hagedorn, who has been my mentor, role model, and sounding board; Dr. Lawrence Tyree, whose communication is always upbeat and positive; Dr. David Honeyman, whose course assignment sparked the concept of this dissertation; and Dr. Janet Silverstein, who, throughout the time I have known her, has always maintained that there is nothing I could not do if I put my mind to it, even the decision to return to graduate school.

I extend my utmost gratitude for the data, explanations, and reports provided by the individuals in the Florida Department of Education, Division of Community Colleges: Dr. Patricia Windham, Nancy Copa, Edward L. Cisek, Maybelle Montford, and Patti Askins. Special thanks must go to my fellow faculty members at Santa Fe Community College and also to my students for their constant encouragement. Deep appreciation also goes to Dr. Jackson Sasser, who always made time to see me, answer questions, and share a perspective on community colleges.

I also acknowledge the values instilled in me as a child by my mother, Marie, who said that every woman needs to be educated; my father, George, who told me to decide what I want to do and then do it; and my sister, Elaine, whose strength and courage inspire me daily. My special thanks and love go to my sons, Jason and Eric, who are more precious to me than life. And finally, a thank you goes to my husband, Craig Bakuzonis, for his infinite love and support.
TABLE OF CONTENTS

ACKNOWLEDGMENTS
LIST OF TABLES
LIST OF FIGURES
ABSTRACT

CHAPTER

1 INTRODUCTION
    Background
    Purpose of the Study
    Research Questions
    Definitions of Terms
    Significance of the Study
    Limitations of the Study

2 REVIEW OF LITERATURE
    Brief History of Community Colleges
    History of the Florida Community College System
    Funding
    Impetus for Changes in Funding
    Accountability
    History of Performance Measures and Performance-based Budgeting
    Indicators and Performance
    Florida Community College Experience

3 METHODS
    Introduction
    The Performance Challenge
    Purpose of the Study
    Data Submission Guidelines
    Full-time Equivalent
    Funding Categories
    Acquisition of Study Data
    Data Analysis

4 ANALYSIS OF DATA
    Overview
    Research Question 1
        Total Operating Budget vs. the Performance-based Budget
        Components of the Operating Budget
    Research Question 2
        FTE Enrollment
        Cumulative Performance-based Budget Funds by College
        Performance-based Incentive Funding by College
        Performance-based Budget per FTE
    Research Question 3
    Research Question 4
        Utilization of Performance-based Budget Funds
        Viewpoints
    Limitations of the Study

5 DISCUSSION
    Florida Performance-based Budgeting Overview
    Introspective Review
    The Florida Community College System
    Findings
    Conclusions
    Implications of the Findings
    Consistency
    Use of the Data
    Selection of Measures
    Policy Recommendations Specific to the Florida Community College System
    National Policy Recommendations to Community College Systems
    Recommendations for Further Research

REFERENCES

BIOGRAPHICAL SKETCH
LIST OF TABLES

1-1 The Florida Community College System (2006-2007)
3-1 Florida Community Colleges
3-2 Florida Community College Overall Funding Formula
4-1 Florida Community College Overall Funding Formula
4-2 Florida Community College Unduplicated Student Headcount per FTE Enrollment (Funded) for 2005-2006
4-3 Comparison of Gulf Coast Community College and Santa Fe Community College
5-1 Performance-based Budgeting from 1996-1997 through 2006-2007
5-2 Comparison of Key Variables between Community College Systems (Illinois and Florida)
5-3 Variances among College Associate of Science Degree Cost for a Selected Program for 2004-2005
LIST OF FIGURES

1-1 2006-2014.
1-2 Growth occupations by occupational education requirements, 2006-2014.
2-1 Growth of community colleges.
4-1 Total operating budget versus performance-based budget, 1996 through 2005
4-2 Percentage of performance-based budget of the total operating budget, 1996 through 2005
4-3 Components of the operating budget, 1997 through 2005
4-4 Comparison of the types of fees per Full-Time Equivalent, 1997 through 2005
4-5 Total operating budget versus FTE enrollment (funded), 1996 through 2005
4-6 Performance-based incentive funding versus student FTE enrollment (funded), 1996 through 2005
4-7 Full-Time Equivalent enrollment (funded) versus unduplicated headcount enrollment, 1996 through 2005
4-8 Total operating budget versus Full-Time Equivalent enrollment (funded) versus unduplicated student headcount, 1996 through 2004
4-9 Cumulative performance-based budget funds per college, 1996 through 2005
4-10 Cumulative performance-based budget funds per college, ranked by FTE enrollment (funded) group, 1996 through 2005
4-11 Percentage of performance-based incentive funds by college, for FTE Group 1, 2000 through 2005
4-12 Percentage of performance-based incentive funds by college, for FTE Group 2, 2000 through 2005
4-13 Percentage of performance-based incentive funds by college, for FTE Group 3, 2000 through 2005
4-14 Percentage of performance-based incentive funds by college, for FTE Group 4, 2000 through 2005
4-15 Percentage of performance-based incentive funds by college, for FTE Group 5, 2000 through 2005
4-16 Performance-based incentive funds per FTE enrollment (funded) versus the FTE enrollment (funded), 1996 through 2005
4-17 Performance-based incentive funding per FTE enrollment, for FTE Group 1, 2000 through 2005
4-18 Performance-based incentive funding per FTE enrollment, for FTE Group 2, 2000 through 2005
4-19 Performance-based incentive funding per FTE enrollment, for FTE Group 3, 2000 through 2005
4-20 Performance-based incentive funding per FTE enrollment, for FTE Group 4, 2000 through 2005
4-21 Performance-based incentive funding per FTE enrollment, for FTE Group 5, 2000 through 2005
4-22 Total AA measure points I, II, and III, 1996 through 2005
4-23 AA measure completion points versus FTE enrollment (funded), 1996 through 2005
4-24 AA measure completion points, by Measure I, Measure II, and Measure III, 1996 through 2005
4-25 Total AA measure points I, II, and III by community college Group 1, 2000 through 2004
4-26 Total AA measure points I, II, and III by community college Group 2, 2000 through 2004
4-27 Total AA measure points I, II, and III by community college Group 3, 2000 through 2004
4-28 Total AA measure points I, II, and III by community college Group 4, 2000 through 2004
4-29 Total AA measure points I, II, and III by community college Group 5, 2000 through 2004
4-30 Total cumulative AA measure funding 1996-2005 versus All Performance-based Funding Index 2005-2006
Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

PERFORMANCE-BASED BUDGETING AA MEASURES OUTCOMES IN FLORIDA COMMUNITY COLLEGES

By

Karen Bakuzonis

December 2007

Chair: Linda Serra Hagedorn
Major: Higher Education Administration

The Florida Community College System is a major component of Florida's economy, and it is a major player in the preparation of the future workforce. The purpose of this study was to conduct a descriptive analysis of trends in the Florida community college system relative to performance-based budgeting funding. The conceptual framework is the premise that performance-based budgeting will lead to increased performance. The study focused on the performance funds related to the associate of arts (AA) degree program portion of the performance-based and incentive-based allocation funding for the 28 Florida community colleges from 1996-1997 to 2005-2006. Because the data comprise a full population, the study provides a true picture without the need for inferential statistics and thus includes no statistical testing.

The community college system was the first sector of Florida higher education to adopt performance-based budgeting, developed as an incentive process for the colleges to receive additional funds for students who were not part of the base funding. A relationship now existed between funding and performance that would hold agencies more accountable. Four research questions were asked:

What has been the overall change to the percentage of funding by category as a result of the implementation of Florida community college performance-based budgeting?
Utilizing full-time equivalent (FTE) as a leveling approach, what are the differences in performance-based budgeting trends by individual Florida community colleges?

Does the Florida community college performance-based budgeting result in measurable improvement for selected associate of arts measures?

How do Florida community colleges use performance-based budgeting funds?

The study focused on the relationships among the total operating budget, the performance-based budget, the FTE enrollment (funded), the unduplicated student headcount, and the AA measure points, from both a system and an individual community college perspective. Results of the analyses revealed that while the current performance-based incentive funding program has added funds to the total operating budget, some aspects need to be addressed. Community college missions vary, and how colleges add value to their service areas may not be reflected in the current measures.
CHAPTER 1
INTRODUCTION

Background

The creation of the concept of the present-day community college system dates back to 1892, when William Rainey Harper first coined the phrases "junior college" and "senior college." Harper advocated that universities focus on research rather than teaching, since he and other university presidents of the time believed that the first 2 years of college are not required to be part of university-level education. In 1892, the University of Chicago was divided into a junior college and a senior college division (Drury, 2003).

In the beginning years, the junior colleges concentrated more on liberal arts courses that could be transferred to the universities; college preparatory courses were the main focus, and little attention was given to occupational training. By the 1920s, competing visions of junior colleges emerged and led to the founding of the American Association of Junior Colleges, presently named the American Association of Community Colleges. The early leaders of the American Association of Junior Colleges, Leonard Koos and Walter Eells, promoted the development of terminal vocational education, labeled by Koos as "semi-professional training" (Drury, 2003, p. 4). After World War II, the term "junior college" was replaced with "community college," and vocational, technical, and adult education needs emerged with the ever-increasing demand for college education (Drury, 2003). The graduates and attendees of community colleges provided skilled labor for many industries. At that time community colleges became more heavily linked to the economy.

The early 20th-century funding for community colleges came from local resources, but by 2000, state funding and tuition made up 65% of the support in the majority of the states (Boswell & Wilson, 2004). This increasing dependence on state funding has placed the
community college in competition with other state funding priorities. The community colleges simultaneously were facing growing enrollment and increased student demands.

In the 1980s, external concerns moved from economy to quality. Two-thirds of the 50 states mandated by legislation that public colleges and universities adopt plans for assessing student learning. By the 1990s, higher education faced increasing demands for efficiency and effectiveness (Burke, 2005a). Congress conducted a series of hearings on scientific fraud, publishers churned out multitudes of books about the political conformity of universities, and news articles focused on the rapid rise of tuition (Bok, 1992). In 2005, Bailey gave a presentation at the 21st Century Workforce Conference advocating the need for public engagement:

The evidence of effectiveness and need can be the basis for public engagement and understanding. The public needs to know that higher education resources are being used efficiently. The public needs to have a better sense of the types and skills needed and occupations needed in the modern economy. The public needs a better understanding of the role in economic growth and prosperity and the potential payoffs to their investments in higher education. (p. 35)

With increasing voices, public stakeholders continue to draw attention to concerns about the state of higher education in America. These matters resulted in increased government attention, new policy goals, and modifications of the more traditional funding models for higher education. The concepts of performance-based budgeting (PBB) and performance-based funding (PBF) began to be incorporated into higher education funding formulas. Performance-based budgeting allows governors, legislators, or boards to consider campus performance as one factor in determining total allocation, but the link often holds great latitude.
Performance-based funding is associated with specific results on each of the designated indicators. The funding is driven by formulas with far less latitude. The concepts behind performance-based budgeting and performance-based funding are designed to reward those schools that achieve the desired outcomes.
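The formula-driven, pro-rata allocation that distinguishes performance-based funding can be sketched as a small calculation. This is a hypothetical illustration only: the function name, the college names, the point totals, and the pool size are all invented here, and the actual Florida formula described later in this chapter spans multiple measures, outputs, and special categories.

```python
def allocate(points, appropriation):
    """Distribute a fixed appropriation in proportion to each college's
    earned performance points (a pro-rata share)."""
    total = sum(points.values())
    return {college: appropriation * p / total
            for college, p in points.items()}

# Three hypothetical colleges splitting a $1,000,000 incentive pool:
shares = allocate({"College A": 500, "College B": 300, "College C": 200},
                  1_000_000)
# College A earned half the points, so it receives half the pool.
```

The key design point is that the formula leaves no discretion: once the indicator points are tallied, each college's share follows mechanically, which is what makes performance-based funding a tighter link than performance-based budgeting.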
This dissertation focuses on the performance-based funding outcomes in Florida community colleges. Under Title XVI, Chapter 240, Section 240.324, Florida Statutes, the State Board of Community Colleges and the community college boards of trustees were required to develop accountability plans addressing efficiency and effectiveness. The law mandated that the plans address the following issues:

Graduation rates of AA (associate of arts) and AS (associate of science) degree-seeking students compared to first-time enrolled students seeking the associate degree

Minority student enrollment and retention rates

Student performance, including student performance rates on college-level academic skills tests, mean grade point averages for community college AA transfer students, and community college student performance on state licensure examinations

Job placement rates of community college vocational students

Student progress by admission status and program

Community colleges were targeted for the 1996-1997 fiscal year as the first educational agency in the state to adopt performance-based budgeting initiatives. Mercer (2002) defined a performance budget as an integrated annual performance plan and annual budget that shows the relationship between program funding levels and expected results. The performance budget indicates that a goal or a set of goals should be achieved at a given level of spending. The state of Florida uses qualitative or quantitative indicators to assess state agency performance. For the Florida Community College System, appropriation is based upon a point system for allocation, driven by the outputs, special categories, and outcomes.

Purpose of the Study

In Florida for 2006-2007, 61% of the $1.463 billion budget came from general revenue, with a concerted effort under way to modify the overall funding formula. The modified
funding formula was to decrease the gap between need and funds available and improve the horizontal equity among the community colleges relative to college size. As in other states, Florida has started to link outputs and outcomes with funding. The Florida Department of Education reported that from academic years 1995-1996 to 2004-2005, the cumulative increase in full-time equivalent (FTE) enrollments was 21% while degrees/certificates increased by 37%. The emphasis on performance funding has had measurable impact in terms of increasing the graduates in Florida community colleges.

The purpose of this study is to conduct a descriptive analysis of trends in the Florida community college system relative to performance-based budgeting funding, focusing on the associate of arts measures. The associate of arts components under review are the following three measures:

AA Program Measure I
Number of AA degrees
Number of dual enrollment credit hours

AA Program Measure II
Pro rata share of the number of AA graduates who
o required remediation based upon CPT scores
o qualified as economically disadvantaged
o were reported as disabled
o are black males
o tested into English for academic purposes

AA Program Measure III
Pro rata share of the number of completers or partial completers who were placed in jobs or transferred to the State University System.
o Full-time job earning $10/hour
o Graduate transfer to the State University System
o Non-graduate transfer to the State University System with 30 to 45 credits
o Non-graduate transfer to the State University System with 45 to 59 credits
o Non-graduate transfer to the State University System with 60 or more credits

The associate of arts indicator has been the most consistent indicator since the funding began. From 2000-2001 to 2005-2006, the components for the AA measures included the number of AA degrees,
the number of dual enrollment credit hours, the pro rata share of AA graduates who were college preparatory, economically disadvantaged, disabled, black males, and tested into English, and the pro rata share of completers or partial completers who were placed in jobs or transferred to the State University System. From 1996-1997 to 1999-2000, the components did not include dual enrollment; the pro rata share of AA completers for black males did not begin until 1998-1999; and placements in jobs at least at $10 per hour did not begin until 1999-2000. Excess hours for Measure III existed only from 1996 to 1998. Partial completion points existed only in 1998-1999.

Other research studies have focused on faculty awareness of performance-based funding, equity among colleges, or inferential analysis for future outcomes and outputs. This study's focus is a descriptive statistical analysis using a secondary data set from the Florida Department of Education. The conceptual framework is the premise that performance-based funding will lead to increased performance.

Research Questions

What has been the overall change to the percentage of funding by category as a result of the implementation of Florida community college performance-based budgeting?

Utilizing full-time equivalent (FTE) as a leveling approach, what are the differences in performance-based budgeting trends by individual Florida community colleges?

Does the Florida community college performance-based budgeting result in measurable improvement for selected associate of arts measures?

How do Florida community colleges use performance-based budgeting funds?

Definitions of Terms

Performance-based budgeting (PBB) is designed as a way to focus government on results, with monetary incentives for agencies that meet their performance goals. It is a mechanism that allows governors, legislators, or boards to consider campus performance as one
factor in determining total allocation. The link, however, is loose and discretionary (Burke, Modarresi, & Serban, 1999).

Performance-based funding (PBF) is a method tying performance indicators to funding. The budget varies according to the performance of the entity in a previous period, based on indicators of performance. Performance funding links specific dollar allocations to measured institutional results (Burke, Modarresi, & Serban, 1999).

Performance indicators reference outcomes as the quality of the benefit or impact of programs, activities, and services (Burke, 2002), such as test scores and job placement. Outputs involve the quantity of products produced, such as the number of graduates, the process, and the methods used to deliver programs, activities, or services, including an assessment of learning, use of technology, and the evaluation of faculty.

Full-time Equivalent (FTE), in the case of students, is student semester hours divided by 30 for advanced and professional, postsecondary vocational instruction, and college preparatory instruction. For all other instruction, it is instructional clock hours divided by 900 hours (Florida Fact Book, 2006).

College Level Academic Skills Test (CLAST) is an achievement test to measure students' attainment of college-level communication and mathematics skills. The CLAST consists of four subtests: essay, English language skills (ELS), reading, and mathematics.

The common placement test for graduates entering Florida colleges is a way to verify that high school graduates have certain basic skills before beginning college-level work. The Florida College Level Placement Test (CPT) is used to determine appropriate placement for college-level work.
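The FTE definition above is simple arithmetic and can be sketched as follows. This is a minimal illustration: the function name, the category flag, and the example hour totals are invented here, while the divisors of 30 semester hours and 900 clock hours come from the Fact Book definition.

```python
def fte(hours, credit_based):
    """Convert reported instructional hours to full-time equivalents.

    Credit instruction (advanced and professional, postsecondary vocational,
    and college preparatory) divides student semester hours by 30; all other
    instruction divides instructional clock hours by 900.
    """
    return hours / 30 if credit_based else hours / 900

# A college reporting 45,000 student semester hours of credit instruction
# and 90,000 clock hours of other instruction:
credit_fte = fte(45_000, credit_based=True)    # 1500.0 FTE
other_fte = fte(90_000, credit_based=False)    # 100.0 FTE
total_fte = credit_fte + other_fte             # 1600.0 FTE
```

Because the two divisors differ by a factor of 30, the instructional category a college reports under materially affects its funded FTE, which is why FTE serves as the leveling variable in the research questions above.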
The Florida College Entry-level Placement Test (CPT) is the Florida version of the College Board's Accuplacer Common Placement Test, used by all state of Florida community colleges as their common placement test [South Florida Community College].

Incentive funds are defined in terms of either efficiency (time to degree) or college preparatory programs. Efficiency (time to degree) is intended to reward colleges for students who graduate on a timely basis and actually save state funds. The college preparatory measure is based on the pro rata share of the number of students passing the highest level of college preparatory course in each subject area (Florida Department of Education Fact Book, 2006).

Performance funds in the Florida community college system are monetary funding proportional or related to the number of associate of arts degrees awarded plus Workforce Programs/Development. Workforce programs/development refers to education, training, and services for those entering the job market, those upgrading their skills, and those needing retraining. Workforce Programs/Development has the joint goal of preparing individuals for employment as well as providing employers with the skilled labor force needed to compete in a global economy.

Postsecondary Adult Vocational Certificate (PAVC) is described by the Florida Department of Education as covering occupations that generally require completion of career and technical training. Some programs last only a few weeks while others last more than a year. In some occupations, a license is needed that requires an examination after completion of training.

College Credit Certificate, Applied Technology Diploma, Associate of Applied Science degree, or Associate of Arts degree applies to those occupations that generally require the completion of college credit and sometimes a credential. In the state of Florida, College Credit
certificates and Applied Technology diplomas are specialized college credit credentials that vary in the number of credits required. Associate in Science programs consist of college-level courses that prepare students for entry into employment; degrees generally require at least 2 years of full-time-equivalent academic work.

Significance of the Study

The Florida Community College System is a major component of the economy in Florida, and it is a major player in the preparation of the workforce for the future. Florida labor market statistics highlight that the fastest growing occupations from 2006 to 2014 will require postsecondary education (Table 1-1). In addition, the 50 Florida declining or slow-growth occupations for the 2006-2014 time period reflect the greatest decline for individuals with less than a high school diploma, a high school diploma, or GED (Table 1-2). The decade of the 1990s began with a statute mandating that both the community college system and the state university system report performance. A second law required both systems to submit annual budget requests based on performance. The appropriation bill provided limited incentive funds for performance. At the end of the 1990s, a third statute tied the largest percentage of incentive funds to the performance of community colleges and school districts in workforce development programs (Wright, Dallet, & Copa, 2002). In return for public money, the public demanded accountability and effectiveness from higher education (McClenney, 2004). Cumulatively, during the past 11 years, Florida performance-based budgeting for community colleges totaled $112,042,342 NIR (Florida Department of Education Fact Book, 2006). For fiscal year 2006-2007, a proposal was made to increase PBB by $10 million, bringing the total allocation to more than $28 million, but the additional funding was not allocated. It would be
beneficial to review what the impact has been to date, now that several years of data can be reviewed. Performance funding is in place in forty percent of the states, but while some states have joined the bandwagon, other states have abandoned their programs (Burke et al., 2002). For example, Arkansas repealed a law mandating that agencies use a performance-based model because the model did not meet business goals (Songini, 2005).

Limitations of the Study

One limitation of this study is that the review is restricted to those indicators reported by the community colleges to the state of Florida. This study analyzes the information reported during a certain period of time, but not the accuracy of the information submitted to the state. The other limitation of the study is the changes in the components of the AA measures; however,
Table 1-1. The Florida Community College System (2006-2007) system profile

System Profile
  Community Colleges: 28
  Capital Assets: $5.2 billion
Staff Resources
  All Employees: 43,379
  Faculty: 22,749
Funding
  General Revenue: $972 million
  Student Fees: $477 million
  Lottery Funding: $115 million
Enrollment
  Total Unduplicated Annual Headcount (2005-2006): 793,517
Degrees/Certificates Awarded
  Total Degrees/Certificates Awarded: 66,431
  Bachelor's Degree Programs: 398
  AA Degrees: 33,398
  AS Degrees: 11,596
  Vocational and College Credit Certificates: 21,039

(Florida Department of Education, Facts at a Glance, n.d.)
Figure 1-1. Florida 100 fastest growing occupations by occupational education requirement, 2006-2014
Figure 1-2. Florida 50 declining or slow growth occupations by occupational education requirements, 2006-2014
CHAPTER 2
REVIEW OF LITERATURE

Brief History of Community Colleges

A review of community college development usually begins with William Rainey Harper, first president (1891-1906) of the University of Chicago. Harper championed the cause of a postgraduate high school program that would alleviate the need for universities to teach basic skills during the freshman and sophomore years. Joliet Junior College, America's oldest public community college, began in 1901 as an experimental postgraduate high school program; by December 1902, the courses were available tuition free. The American Association of Community Colleges website (2007) stated:

Great challenges faced the United States in the early 20th century, including global economic competition. National and local leaders realized that a more skilled workforce was key to the country's continued economic strength, a need that called for a dramatic increase in college attendance; yet three quarters of high school graduates were choosing not to further their education, in part because they were reluctant to leave home for a distant college.

Growth of the community colleges is closely linked to the educational growth of the United States. In response to the Depression of the 1930s, community colleges served the needs of the community by offering job training. The Servicemen's Readjustment Act (GI Bill of Rights) in 1944 provided financial assistance for World War II veterans seeking postsecondary education. This congressional act began a series of events that bridged the gap for individuals who previously could not afford to attend college. The next milestone in education was the 1947 report Higher Education for American Democracy, also known as the Truman Commission Report.
The report called for, among many other recommendations, the establishment of a network of public community colleges that would charge little or no tuition, serve as cultural centers, be comprehensive in their program offerings with emphasis on civic responsibilities, and serve the
area in which they were located. This directive for a network, coupled with economic factors, resulted in a major expansion of the junior/community college.

History of the Florida Community College System

Based upon information from the Florida Association of Community Colleges, St. Petersburg Junior College was founded in 1927 as a private, 2-year college to meet the needs of its community. Other private junior colleges were formed, but they were unable to sustain their existence. In 1933, Palm Beach Junior College was established as the first public junior college in Florida. In 1947, Howell Watkins was charged with making a report to the Florida Legislature about the junior college section of the Florida Citizens Committee on Education. Watkins assigned the task to James Wattenbarger, a graduate student at the University of Florida. As a result of the report, the 1947 Florida Legislature revised the 1931 Public Free School Funds and provided for the State Minimum Foundation Program Fund. Included in the law was a provision that junior colleges become operational components of the local school systems, provided that the County Boards of Public Instruction received approval from the State Board of Education to operate junior colleges. In 1951, the Junior College Steering Committee of the State Advisory Council on Education presented a study conducted by C. C. Colvert and James W. Reynolds, which recommended the establishment of an unspecified number of new junior colleges. In 1953, two key actions facilitated the further development of the Florida community college system: the Board of Control established the Council for the Study of Higher Education, and Wattenbarger published A State Plan for Public Junior Colleges. In 1955, the Florida Legislature created the Community College Council to develop a long-range plan for the establishment and coordination of community colleges, and Wattenbarger was placed in charge of formulating the study.
For his vision and leadership, Wattenbarger is often referred to as the
father of the Florida community college system. In 1957, the Community College Council issued its report and recommended a state plan that would lead to the establishment of 28 junior colleges located within commuting distance of 99% of the population (Table 2-1). The report also included the recommendation for the general education agreement that established the two-plus-two system, which would guarantee the transfer of all general education credits from public community colleges to the State University System. In 1957, the Florida Legislature separated the junior colleges from the K-12 programs, and the Division of Community Colleges was created. The Florida Legislature did not release colleges from the jurisdiction of local boards of public instruction until the academic year 1967-1968. This change led to the establishment of locally autonomous district board community/junior colleges. In 1979, the State Board of Community Colleges replaced the Community College Coordinating Board and assumed statewide leadership in overseeing and coordinating performance-based budgeting. Two major changes to the community college system occurred in 1997. First, Senate Bill 1688 permitted all school boards and community colleges to offer workforce development programs. Second, a performance-based funding model for workforce programs was established. The 1998 legislative action clarified the role of the Division of Workforce Development and revised the funding formula to one based on 85% of prior-year funding; the remaining 15% was to be awarded based upon performance. In two other actions, the Florida Legislature eliminated the Cabinet's oversight role and appointed a new board to oversee education in the state. By 1999, access issues were again in the spotlight. The Florida
Legislature passed a bill to encourage articulation agreements between associate of science (AS) degree programs and the State University System, and also to push the offering of baccalaureate degrees on community college campuses by inducing partnerships between community colleges and universities. The Education Governance Reorganization Act created the Florida Board of Education to oversee the Division of Workforce Development along with the Division of Community Colleges, and created the role of a Chancellor of Community Colleges and Career Preparation, appointed by the Commissioner of Education. Effective July 1, 2001, the State Board of Community Colleges and the Board of Regents were abolished, and the state's educational needs now came under one overseeing board, the Florida Board of Education.

Funding

Initially, community colleges across the nation were funded primarily locally. But as of 2007, the funding for public colleges overall breaks down as follows: 42% from state funds; 18% from tuition and fees; 24% from local funds; 6% from federal funds; and 10% from other sources. Historically, budgets were based upon current costs, student enrollments, and inflationary increases. Burke (2002) of the Rockefeller Institute of Government stated that this method encouraged growth in expenditures and programs, despite declining enrollment. According to the Florida State Higher Education Finance Fiscal Year 2005 report, higher education represents a substantial commitment on the part of state and local governments. In economic downturns and recoveries during the past 25 years, constant-dollar state and local support per student has declined and then recovered, often exceeding previous levels of support. However, increases in enrollment and inflation have exceeded state and local support, and a higher share of the cost has been shifted to the students. At the height of
community college expansion (the 1970s), college prices were stable, but soon after this peak, increases in tuition and fees became larger than inflation and the growth of family income.

Impetus for Changes in Funding

The 1983 report titled A Nation at Risk, published by the U.S. Department of Education's National Commission on Excellence in Education, is often cited as the origin of higher education reform efforts. The commission advanced the following recommendations:

- Graduation requirements should be strengthened so that all students establish a foundation in five new basics: English, mathematics, science, social studies, and computer science.
- Schools and colleges should adopt higher and measurable standards for academic performance.
- The amount of time students spend engaged in learning should be significantly increased.
- The teaching profession should be strengthened through higher standards for preparation and professional growth.

Two important factors have resulted in private sector and governmental leaders taking a different focus and interest in higher education funding. First, in the early 1990s, economic fluctuations led to decreased funding of higher education. Higher education was receiving a decreasing percentage of state expenditures and was in competition for funds with health care, K-12 education, corrections, and other state demands. At the federal level, higher education received a decreasing portion of the federal budget. Second, the previously untouchable colleges and universities were now under scrutiny. The government was moving from the traditional higher education self-regulatory processes to a plan to pressure institutions to become more accountable, more efficient, and more productive (Alexander, 2000).
DePalma (1992), in an article titled Critics Say Institutions Spend Carelessly, Teach Poorly and Plan Myopically, outlined some grounds for the criticism: athletic scandals, antitrust suits against Ivy League schools, and questionable use of research dollars for personal items. In
1992, Thomas Kean, then president of Drew University, declared, "Our ivory tower is under siege. People are questioning our mission and questioning who we are. They claim we cost too much, spend carelessly, teach poorly, plan myopically, and when we are questioned, we act defensively" (DePalma, 1992, p. 1). The 1990s showed an ever-increasing public distrust of higher education and a growing questioning of its value. To address this concern, state leaders began to incorporate performance budgeting, wherein agencies receive funding on the basis of their achievements (Carter, 1994). When California adopted its plan for performance-based budgeting, it was patterned after the state of Texas and the San Francisco Bay area city of Sunnyvale. The plan provided more freedom in the running of departments but held the departments more accountable for their performance (Weintraub, 1993). Other industries and organizations have used performance indicators to promote a degree of excellence, effectiveness, and a measure of accountability. Simultaneously, accrediting associations have placed a higher priority on measures of operational effectiveness (Cleary, 2001). The Education Commission of the States in 2001 published a review of master/strategic plans and found that of the 31 states with current master/strategic plans, 11 included performance indicators (Education Commission of the States, 2001). Burke and Minassians (2001), in their sixth annual report on performance reporting, found that when surveyed, the jury is still out regarding the degree of impact of performance budgeting on campus performance.
The League for Innovation white paper titled An Assessment Framework for the Community College: Measuring Student Learning and Achievement as a Means of Demonstrating Institutional Effectiveness (2004) identified community college stakeholders as students, administrators, trustees, faculty, staff, parents, colleges, accreditation boards, businesses, and the community. How do community colleges
quantify their value to this wide array of stakeholders? Community college leaders called for a systematic or systemic, data-driven, comprehensive approach to understanding the quality of 2-year and 4-year postsecondary education, with direct, valid, and reliable measures of student learning (Dwyer, Millett, & Payne, 2006). By 2000, 37 states (74%) had implemented some form of performance funding, performance budgeting, or some format borrowing from both approaches. Performance funding and budgeting depart from more traditional funding that focuses on input factors, and focus instead on achieved results (Burke, Rosen, Minassians, & Lessard, 2000). The 2005 National Commission on Accountability for Higher Education report, titled Accountability for Better Results: A National Imperative for Higher Education, highlighted the need for change in higher education:

For over 50 years, speaker after speaker (university presidents, business leaders, Presidents of the United States) have praised our system of higher education as the finest in the world.

- For the first time in decades, the United States no longer leads the developed world in the rate of college completion. In addition, large developing economies, especially China and India, are successfully educating thousands of scientists and engineers in order to compete in the global economy.
- Four out of ten students in colleges and universities fail to graduate within 6 years; one of those four is still enrolled. One fourth of low-income students in the top quartile of academic ability and preparation fail to enroll in college within 2 years of high school graduation. While more minorities and low-income students are enrolling, the majority of minority students do not graduate.
- Both the price students pay and higher education costs have grown persistently faster than the consumer price index. State support and federal programs like Pell Grants are increasingly falling behind enrollment demand and inflation.
- A large percentage of our workforce in science and technology comes from highly motivated and able international students. Other nations are competing more successfully for scientific talent; we cannot rely on imported talent to meet future needs. (p. 6)

The National Commission on Accountability for Higher Education believes that the status quo in higher education is unacceptable, citing the failure to develop and implement accountability approaches that help improve performance in a
meaningful way. This concern is not unique to the United States, but is present in Canada and European governments. Two key points shared by the United States and other governments are an increasing strain between policymakers and higher education because of differing missions, and a change in the role of government in higher education. Should the government have more oversight or be more involved in operational oversight? Can higher education meet the economic and community demands that only higher education can provide? In 2006, Secretary of Education Margaret Spellings announced plans to address the recommendations from the Commission on the Future of Higher Education. Spellings proclaimed an urgent need for change in America's higher education system, since higher education is the key to our children's future and the American dream, yet it is becoming less affordable and less attainable. Spellings further stated that while our universities are known as the best in the world, 90% of the fastest growing jobs require postsecondary education and only one third of Americans have such a degree. During her speech in September 2006 at the National Press Club in Washington, Spellings stated:

Over the years, we've invested tens of billions of dollars in taxpayer money and just hoped for the best; we deserve better. To remain competitive in the 21st century global economy, we must act now and continue the national dialogue and work together to find the right solutions. Believe it or not, we can't answer the most critical and basic questions about student performance and learning at colleges, and that's unacceptable. Information will not only help with decision making; it will also hold schools accountable for quality.

In 2006, John V. Lombardi, president of the University of Massachusetts Amherst (2002-2007), voiced the opinion that the primary value of federal commission reports on higher education is that they provoke an extensive discussion of almost everything.
Lombardi stated: The American model has the peculiarity that it permits anyone to attempt higher education at some level and in some place. The commitment to opportunity pushes the competency filter to the educational institution, where in other countries the filter takes place much
earlier in the process, and eventually to post-graduation testing (for professional programs) or employment criteria in various industries. This commitment to preserving opportunity creates the highly varied nature of higher education in America.

Professionals have argued that higher education is different and not easily quantified. The argument results in a growing cry for accountability and proof of the value of higher education.

Accountability

Utter the word accountability in higher education and a spectrum of reactions quickly unfolds. The 1998 United Nations Educational, Scientific and Cultural Organization (UNESCO) statement from its World Conference on Higher Education included the following assertion:

The academic freedom of higher education institutions and their wide autonomy, which have to be strengthened and protected, are essential if these institutions are to carry out their mission. Autonomy presupposes accountability to society. (Leveille, 2005, p. 3)

At one point, accountability referred to the design of statewide governance structures capable of accommodating the simultaneous need for institutional autonomy and external oversight of campus decision making. McLendon, Hearn, and Deaton (2006) referred to the question of which decisions of colleges and universities (e.g., academic programs, budgets, tuition setting, and so forth) should be dictated by the state and which should be left to the discretion of campuses (p. 1). During the past 20 years, the new definition of accountability reflects the refocusing of attention on outcomes. The focus has shifted from inputs, such as the number of students enrolled, to outputs, such as the number of graduates. By the fall of 2004, 44 states had initiated some form of accountability. For some states, no specific mandates were identified; rather, the institutions of higher learning identified what outcomes to measure. While autonomy was granted to institutions of higher learning, that autonomy is impacted by who controls the purse strings (Leveille, 2005).
A balance should exist between autonomy and accountability. Without that balance, higher education will face additional scrutiny by political
interests and accrediting agencies as a reaction to public outcries for validation of the value of education. In the fall of 2006, Secretary Spellings stated at the National Press Club in Washington: "No current ranking system of colleges and universities directly measures the most critical point: student performance and learning." So what do you measure in higher education, and how do you do the measuring? How do you quantify the value of higher education? Whom do you hold accountable? When discussing accountability, part of the problem is gaining a consensus as to who should define accountability and who should be held accountable. Community College Week annually publishes the top 100 associate degree producers, but many question whether or not graduation rates really are the best measurement of community colleges. Bradley (2007) compared and contrasted the conflicting views regarding the value of judging community colleges based upon graduation rates. Fewer than one in four students graduate from public community colleges, but many students who enter a community college do not intend to graduate. Community college students may be part-time, taking one or two courses, taking courses beyond the 3-year time period counted in government statistics, or transferring to a 4-year institution prior to graduating from the community college. The questioning of the value of ranking colleges is not unique to the United States. In 2006, the president of the University of Toronto, David Naylor, stated that rankings suit "success in things like sports and sales, where winning generally comes down to a single number. But no single measure can accurately reflect even a mid-sized university" (p. 1). Linda S. Hagedorn, Chair of the Department of Educational Administration and Policy at the University of Florida, stated, "Community colleges are open access, and we are being measured the same way as traditional 4-year schools."
The graduation rates, to a large extent, measure what you put in. Why are we surprised when a college like Harvard has a 99 percent graduation rate?
grape. (Hagedorn, as cited in Bradley, 2007, p. 8)

A more accurate picture of student outcomes would allow colleges to evaluate program efficiency and better explain themselves to stakeholders. State departments of education collect massive amounts of data and produce reams of reports, but who analyzes the data, and what actions are taken to effect change in the future? Headlines in newspapers declare that the public and lawmakers are holding postsecondary institutions more accountable, more efficient, and more productive. But how do you hold higher education accountable? How do you know that higher education is accountable? Is the ultimate goal of accountability systems lost in the maze of defining performance standards and specifying rewards and consequences for high or low performance, as suggested by Lingenfelter (2003)? We do not have a shortage of accountability systems in the United States, and over the years their popularity waxes and wanes. Which systems are effective? How do we know that answer? Does one system fit all players in postsecondary education? Accountability has three factors: a performance standard must be set for each program; program performance must be monitored against that standard; and sanctions must be applied when standards are not met (Grizzle & Pettijohn, 2002). What is accountability? As a general policy construct, accountability refers to higher education providing quantifiable evidence that it has fulfilled its duty. Leveille (2005) stated that accountability systems for higher education are the systematic collection of input, process, and outcome data, and their analysis and dissemination, contributing to internal and external decision making by policymakers, educational leaders, and other stakeholders. For this discussion, accountability has the following intended definition: Accountability is a systematic method to
assure those inside and outside the higher education system that colleges and universities and students are moving toward desired goals. (p. 10)

What are the characteristics of an effective accountability system? Is it based upon laws and regulations? Is it based upon structures put into place for K-12? Is it left in the hands of regional accrediting organizations to determine accountability? Past models of accountability in higher education include bureaucratic, professional, political, managerial, market, and managed market (Burke, 2002). The crux of the issue is whether a connection should exist between accountability and budgeting.

History of Performance Measures and Performance-based Budgeting

The concept of performance-based budgeting has been incorporated into numerous commission reports, acts, and proposals. Government has tried to report objectively on performance for decades. Planning/programming/budgeting, management by objectives, and zero-based budgeting techniques were tried in the 1960s and 1970s (Theurer, 1998). The Government Performance and Results Act (1993) mandated for the first time that federal agencies become specifically results oriented, including the requirement to specify measurable performance goals for all program activities in their budgets. The required annual reports changed the focus from process to accountability for results or outcomes. In the 1990s, policymakers were generally less likely to accept the voluntary institutional improvement of the 1980s, and they looked for more focus on a system of mandated public accountability (Gaither, Nedwek, & Neal, 1995, p. 1). The 1990s ushered in the requirement for reporting graduation and job placement rates. In 2003, the Office of Management and Budget highlighted the importance of using performance measurements in the budget process by requiring agencies to show that measurable results are being achieved in
the planning, performance measurement, and budgeting process. According to the Government Accounting Standards Board (GASB):

The assessment of services rendered by a governmental entity requires more than information about the acquisition and use of resources. It also needs information about the outputs and outcomes of the services provided and the relationship between the use of resources and those outputs and outcomes. Employing a variety of measures of inputs, outputs, and outcomes (measures that relate efforts to accomplishments), and additional explanatory materials, will more fully assist users of general purpose external financial reports (GPEFR) in assessing governmental performance. (GASB Concepts Statement No. 1, 1987, p. 1)

In April 2007, the Governmental Accounting Standards Board announced that it had added a performance measurement project to its current agenda. Service Efforts and Accomplishments (SEA) Reporting will help governments report the measures they develop and use to gauge achieved outcomes in pursuit of their public policy goals. Performance-based budgeting has been described as a way to strengthen the link between budget decisions and government performance. Performance funding, according to Burke (2002), closely ties specific resources to institutional results on each of the designated indicators. If a campus achieves a set target on a designated indicator, it receives a specific amount of performance money. Performance budgeting allows governors, legislatures, or coordinating or system boards to consider campus performance on the indicators collectively as merely one factor in determining the total allocation. John G.
Morgan, controller of the Treasury for Tennessee in 2003, outlined the benefits of performance-based budgeting: increased accountability; increased efficiency; increased knowledge about state services and programs on the part of policymakers; improved public management; enhanced program evaluation; identification of opportunities for multi-agency coordination; and improved communication with citizens.
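Performance funding, as described above, is mechanically simple: each designated indicator carries a target and a fixed dollar award, and a campus receives the award for every indicator on which it meets the target. A minimal sketch follows; the indicator names, targets, and dollar amounts are entirely hypothetical, chosen only to illustrate the mechanism:

```python
# Hypothetical sketch of performance funding in the sense Burke (2002)
# describes: a fixed dollar award is earned for each indicator whose
# target is met or exceeded. All names and figures are illustrative.

# (indicator, target, award) triples
INDICATORS = [
    ("aa_degrees_awarded", 1200, 250_000),
    ("job_placement_rate", 0.80, 150_000),
    ("timely_graduation_rate", 0.60, 100_000),
]

def performance_award(results):
    """Sum the fixed awards for each indicator whose target is met."""
    total = 0
    for name, target, award in INDICATORS:
        if results.get(name, 0) >= target:
            total += award
    return total

campus = {"aa_degrees_awarded": 1350,
          "job_placement_rate": 0.78,
          "timely_graduation_rate": 0.65}
print(performance_award(campus))  # 350000: only the placement target is missed
```

The contrast with performance budgeting falls out of this sketch: under performance funding the dollar consequence of each indicator is fixed in advance, whereas under performance budgeting the same indicator results would merely inform a discretionary allocation decision.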
Poister and Strieb (as cited in Kong, 2005) aptly stated that the measurement of worker efficiency was clearly part of the scientific management approach at the turn of the century (p. 92). Kong (2005) suggested that performance-based budgeting is a continuation of these earlier reform efforts. Performance reporting focuses attention on important outcomes as a means of motivating and facilitating improvement efforts. Performance funding goes the next step by connecting explicit monetary incentives with performance in order to get better results (Lingenfelter, 2003). Burke (2000) found that 37 states considered performance in budgeting. In performance budgeting, the link to funding is loose, discretionary, and uncertain; in performance funding, the tie of resources to results is fixed. While performance funding has many strong advocates, typically only a small percentage of the higher education budget is distributed according to performance measures. It has proven difficult, and it may be undesirable, to make performance funding a high-stakes game, because states have large political and financial investments in institutions. This situation creates a Catch-22 for the states: it is hard to create a powerful incentive for action if the stakes are low, but if the stakes are too high, the impact on the budget may create a totally chaotic system. The 2005 report by the National Commission on Accountability in Higher Education, titled Accountability for Better Results: A National Imperative for Higher Education, stated that budgeting lies at the core of accountability: money affects behavior, and programs need financial resources to be effective. Institutional, state, and federal policymakers must send consistent signals about priorities by responding to results and budgeting resources to achieve public goals (p. 30). The report further asserted that budget allocations should reflect what is necessary for continuity and predictability, as well as what is required for improvement and change.
Since the second half of the 20th century, various budgeting strategies have been implemented, notably
zero-based budgeting, performance funding, and management by objectives. Performance budgeting expanded from less than a third of the states in 1997 to more than half in 2001, nearly a 70% increase in 3 years (Burke & Minassians, 2001). Performance-based budgeting systems are often similar to the continuous quality improvement tool of PDCA: P for Plan, or identification of what you want to improve; D for Do, or implement a change; C for Check, or study the results and what you have learned; and A for Act, which is to accept the change or abandon it. The PDCA cycle was originally developed by Walter Shewhart in the 1930s and later adopted and tweaked by W. Edwards Deming.

Pay for performance is not a concept unique to higher education. For example, more than half of commercial health maintenance organizations use pay for performance. A $4.9 million grant program called Rewarding Results, administered by the National Health Care Purchasing Institute, rewards physicians and hospitals for higher quality. Under the program, providers become eligible for financial or nonfinancial incentives after meeting specific quality goals tied to medical outcomes and clinical performance. Rosenthal and Dudley (2007) identified five key elements for pay-for-performance programs: (a) individual versus group incentives; (b) paying the right amount; (c) selecting high-impact performance measures; (d) making payment reward all high-quality care; and (e) prioritizing quality improvement for underserved populations. The authors stated: To date, widespread experimentation has yielded important lessons and highlighted critical challenges to paying for performance.
Several recently published evaluations have demonstrated both the potential of pay for performance and the need for careful design of programs to ensure their effectiveness. Despite purchasers' enthusiasm for pay for performance, it has become clear that it should not be a foregone conclusion that these programs will benefit patients or even significantly assist providers who want to improve care. (p. 1)

In August 2007, the San Francisco Business Times reported that Blue Shield of California was again awarding $31 million in pay-for-performance bonuses to California medical groups: by giving care providers financial incentives to increase performance, the insurer hopes over time to make an impact on how medicine is practiced in California (Rauber, 2007, p. 4).

This relationship between performance and funding is also an issue in Canada. In 2004, Peter George chaired the Council of Ontario Universities Quality and Financing Task Force (George, 2004). The task force tackled the following major challenges facing universities in Canada and elsewhere in the world: What is quality in higher education? How can it be sustained and enhanced? Who should pay for it? How should it be paid for? Snowdon's (2005) report, Without a Roadmap, pointed out that the variation among Canadian post-secondary systems within the country makes comparisons at the national level problematic. Community colleges in the province of Alberta are more similar to those in Florida than to colleges in the province of Ontario (Snowdon, 2005, p. 21). Starting in the 1990s, significant changes occurred in funding mechanisms in many Canadian provinces. Under accountability came more targeted funding categories and additional reporting. These funding categories, referred to as funding envelopes, were used to encourage and recognize performance differences and improvements in areas such as employment rates and graduation rates. These performance funds represented 1% to 2% of the total funding and were the only new government monies available. The province of Alberta developed its own performance envelope with indicators, for example, enrollment growth, satisfaction of recent graduates, employment of recent graduates, and sponsored research awards.
Indicators and Performance

A review of the literature reflects hundreds of performance indicators in use in higher education and the mass media in the United States, Canada, and Western Europe for stakeholders such as government, higher education trustees and administration, faculty, students, and their parents. The Canadian Council on Learning (2006) conducted a scan to provide a summary of relevant research documents related to measures of quality in post-secondary institutions. The report reflects upon the rationale, relevance, and implications of the indicators, and tends to question the qualitative dimensions of an indicator. The Canadian Council on Learning noted that measuring quality can be difficult since it may simultaneously mean efficiency, effectiveness, diversity, and access. Can externally driven outcomes be relevant, practical, or fair? In a Policy and Issue paper entitled Performance Reporting in Higher Education in the Nation and in New York State, the Office of Research and Information Systems (ORIS) of the New York State Education Department's Office of Higher Education analyzed performance indicators in use and concluded that they tend to cluster around broad state policy goals (ORIS, 1996). Common indicators nationwide often include items such as admission standards, remediation activities, retention and graduation rates, number of degrees awarded, and graduate placement data. Unfortunately, even with the best of intentions, the result has not always met the expected achievement of goals. Often the reaction to the results is to collect more data, but more data collection does not necessarily mean better accountability or performance. The National Center for Public Policy and Higher Education's four reports (2000, 2002, 2004, 2006) entitled Measuring Up compare and evaluate the performance of each state on six dimensions to assess national performance in higher education:
Preparation for college: How well are young people in high school being prepared to enroll and succeed in college-level work?

Participation: Do young people and working-age adults have access to education and training beyond high school?

Completion: Do students persist in and complete certificate and degree programs?

Affordability: How difficult is it to pay for college in each state when family income, the cost of attending college, and student financial assistance are taken into account?

Benefits: How do workforce-trained and college-educated residents contribute to the economic and civic well-being of each state?

Learning: How do college-educated residents perform on a variety of measures of knowledge and skills?

Also at the national level, the purpose of the multi-year national initiative titled Achieving the Dream: Community Colleges Count is to help more community college students succeed. Each institution identifies student populations that experience low rates of success, develops interventions to improve student outcomes, and measures changes in student success. The initiative is funded by Lumina Foundation for Education, KnowledgeWorks Foundation, Nellie Mae Education Foundation, Houston Endowment Inc., College Spark, and a Heinz endowment. The first group of 27 colleges in five states joined the initiative in 2004. The term of the grant runs from October 1, 2005, through September 30, 2009, and the initiative involves 15 states. Participating colleges enroll high percentages of low-income students and students of color, who are less likely to attain their educational goals. The goal is for more community college students to succeed by completing courses, earning certificates, and getting degrees. To achieve significant improvements in student success rates, institutional change must occur. The data analysis focuses on student outcomes, such as completion of developmental courses and earning a certificate or degree.
It is not easy to substantiate that states using performance funding have better outcomes, have lower costs, are more effective, or are more accountable. The difficulty is in defining the appropriate outcomes: How do you define effective, and how do you define accountable? But is performance funding affecting behavior rather than the budget? The State Higher Education Executive Officers (SHEEO) 2005 survey asked respondents to assess the impact of performance reporting on bringing about greater effectiveness, productivity, and quality. Half the respondents reported that performance measures had made a major positive impact or some positive impact; the other half noted that it is too early to assess their effects. No respondents indicated that performance measures had shown no impact or a negative impact.

Performance indicators fall into four types: inputs, processes, outputs, and outcomes (Burke, 2002). Inputs are resources received to support programs, such as people, funds, and space. Processes are the methods and procedures used to deliver services, such as the use of technology. Outputs focus on the quantity of measurable items, such as the number of graduates. Outcomes represent the quality or value of programs or activities.

With the discussion of performance indicators comes the related discussion about effectiveness. Alfred, Ewell, Hudgins, and McClenney (1999), in their book titled Core Indicators of Effectiveness for Community Colleges, stated: The heart of any definition of institutional effectiveness remains the ability of an institution to match its performance to its established purpose as stated in its mission. There are two caveats with this definition. First, both purposes and results must be consistent with a growing variety of stakeholder needs. Second, results must be produced efficiently within the constraints of available resources. (p. 6)

As higher education struggles to agree upon the right measures, the larger question is what do we do with the collected data?
Florida Community College Experience

State spending in Florida was growing at an alarming rate, but policymakers were at a loss to determine how the money was being used. Florida was recognized nationally for its budget reforms. Led by then-Governor Lawton Chiles (1991-1998), the legislature retooled the state budget system, adopting a version of performance-based budgeting (Hosansky, 1994). The 1990s was a decade of changing directions, new measures, new priorities, and a computer system not totally prepared to meet the expectations. The 1996-1997 General Appropriations Act allocated $12 million for community colleges tied to pre-set measures. The Florida system now included two forms of performance funding, performance-based program budgeting and incentive funding, but the key question still remained: How closely should the allocation of funds be tied to performance? Subsequent funding and indicators varied, and the importance of workforce development in the state economy brought new measures that were added to the formula. Two strategic imperatives of the Florida Department of Education Strategic Plan (as cited in the Department of Education Fact Book, 2005) can be linked directly or indirectly with the data captured by the performance-based funding measures:

Strategic Imperative 6: Align workforce education programs with skill requirements of the new economy.

Strategic Imperative 7: Align financial resources with performance.

By the academic year 2006-2007, the incentive funds budget was $2,068,409, and the performance funds budget was $16,007,587. For the performance funds, 40% was allocated for outputs, 20% for special populations, and 40% for outcomes. Wright et al. (2002) also described Florida's performance funding (p. 137). Literature
specifically on the Florida performance-based budgeting experience is fairly limited. Each year the Florida Community College System prepares a report for each college outlining the funding and allocation by performance measure, but the format is more reporting than analysis. Phillips's study, The Effectiveness of Performance-based Outcome Measures in a Community College System, demonstrated a varying impact on performance measures related to the number of degrees and certificates granted by the community colleges. Phillips asked this research question: Does implementing performance-based funding cause a discrete change in institutional performance? The conclusion was that it did, but the change was statistically insignificant. It also was not conclusive whether the change was due to the implementation of performance-based budgeting or to other legislative actions. Yancey examined the Florida community college system during the first 5 years after implementation of performance funding and concluded that while performance funding is popular, insufficient evidence existed to conclude whether or not it is effective. A 2004 report to the Florida Board of Governors also reviewed performance funding in the state (Florida Board of Governors, 2004).
Figure 2-1. Growth of community colleges. Based on material from National Profile of Community Colleges: Trends & Statistics (as cited in Phillippe & Patton, 2000).
CHAPTER 3
METHODS

Introduction

The Florida Community College System encompasses 28 colleges with multiple campuses and centers with varying FTE enrollment. As summarized in Table 3-1, the expansion of the system was very methodical, with the system in 2005-2006 serving 793,517 unduplicated student headcount. For the 1980s and most of the 1990s, the base-plus funding approach for community colleges was essentially unchanged except for the transformation in 1996 from full-time equivalent (FTE) to performance. In 1994, the Florida Legislature directed higher education systems to allocate a portion of new funds using performance-based incentives and performance-based program budgeting (Community College Funding Model, 2006), as summarized in Table 3-2. The other major change occurred in 1997, when the Florida Legislature created separate workforce development funds.

The Performance Challenge

In November 1998, the State of Florida Division of Community Colleges published the Agency Strategic and Accountability Plan for the Community College System: Strategic Issue II, The Performance Challenge, Goal I, Monitor student performance and evaluate student achievement as it relates to system-wide accountability goals (p. iii). The community college system was the first sector of Florida higher education for which such a plan was developed, as an incentive process for community colleges to receive additional funds for students who were not part of the base funding. A relationship now existed between funding and performance. The mission of the Florida Department of Education (2006) is the following:
Increase the proficiency of all students within one seamless, efficient system by providing them with the opportunity to expand their knowledge and skills through learning opportunities and research valued by students, parents, and communities, and to maintain an accountability system that measures student progress toward the following goals:

Highest student achievement

Seamless articulation and maximum access

Skilled workforce and economic development

Quality efficient services

Purpose of the Study

This dissertation presents a descriptive analysis of trends among the Florida Community College System performance-based budgeting measures. The study focuses on the performance funds related to the associate of arts (AA) degree program portion of the performance-based and incentive-based allocation funding for the 28 Florida community colleges from 1996-1997 to 2005-2006. This dissertation was a secondary data analysis of district data from the Community College Office of Budget and Financial Services and from the Florida Department of Education (FLDOE). The dissertation focuses on the following four research questions:

What has been the overall change to percentage of funding by category as a result of the implementation of Florida community college performance-based budgeting?

Utilizing full-time equivalent (FTE) as a leveling approach, what are the differences in performance-based budgeting trends by individual Florida community colleges?

Is there a measurable improvement in the Florida community colleges resulting from performance-based budgeting for selected associate of arts measures?

How do Florida community colleges use performance-based budgeting funds?

Specifically, strategic imperative 7 of the plan is to align financial resources with performance. The study focused on the impact of the new funds for associate of arts degree measures, as defined by the Florida Community College System:
AA Program Measure I: (a) the number of AA degree graduates for the reporting year and (b) the number of dual enrollment credit hours generated for the reporting year divided by 60 (the credit-hour requirement for an AA degree).

AA Program Measure II: the number of AA graduates who (a) required remediation based on College Placement Test results (one point for each subject area requiring remediation); (b) qualified as economically disadvantaged under federal guidelines; (c) were reported as disabled using federal guidelines; (d) were Black males; or (e) tested into English for Academic Purposes (EAP). Colleges receive one point for each special category of a student.

AA Program Measure III: the number of completers or partial completers who were placed in jobs or transferred to the State University System (SUS). The AA degree represents 60 credit hours. The points associated with this measure are calculated as follows:

One point for each completer placed in a full-time job earning at least $10 per hour

One point for each AA graduate who transferred to the State University System

0.50 point for each student (not counted as a graduate) who transferred to the State University System with 30 credit hours or greater and less than 45 credit hours

0.75 point for each student (not counted as a graduate) who transferred to the State University System with 45 to 59 hours of college credit

One point for each student (not counted as a graduate) who transferred to the State University System with 60 or more hours of college credit

The performance funds for associate of arts degree measures were selected since historically their allocations are more than 50% of the performance funds allocation. The AA measures have also been the most consistent over the time period.
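The AA Measure III point rules above can be expressed as a short calculation. The sketch below is illustrative only: the record field names (graduate, placed_full_time, transferred_sus, sus_transfer_hours) are hypothetical and are not part of the CCTCMIS data layout.

```python
def measure_iii_points(students):
    """Compute AA Program Measure III points for a list of student records.

    Hypothetical record fields (illustrative only):
      graduate (bool), placed_full_time (bool, full-time job at >= $10/hour),
      transferred_sus (bool), sus_transfer_hours (int, credit hours at transfer).
    """
    points = 0.0
    for s in students:
        if s.get("placed_full_time"):
            points += 1.0           # completer placed in a full-time job at >= $10/hour
        if s.get("transferred_sus"):
            if s.get("graduate"):
                points += 1.0       # AA graduate who transferred to the SUS
            else:
                h = s.get("sus_transfer_hours", 0)
                if 30 <= h < 45:
                    points += 0.50  # non-graduate transfer with 30-44 credit hours
                elif 45 <= h <= 59:
                    points += 0.75  # non-graduate transfer with 45-59 credit hours
                elif h >= 60:
                    points += 1.0   # non-graduate transfer with 60+ credit hours
    return points
```

For example, a graduate who was both placed and transferred earns 2.0 points, while two non-graduate transfers with 50 and 30 credit hours add 0.75 and 0.50 points, respectively.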
The incentive funds (efficiency, defined as time to degree, and college preparatory completion) and the remaining performance funds measures for associate of science degree programs, apprenticeship programs, post-secondary vocational programs, the adult high school program, the general equivalency diploma, and the adult literacy program were not used for the trend analysis by individual community colleges. This study includes the full population. It used data from all 28 Florida community colleges; hence, no sampling was required. Because the full population is included, the study provides a true picture without the need for inferential statistics, and thus includes no statistical testing. It is noted that Miami Dade College and St. Petersburg College offer 4-year programs, but they are included in the distribution of the performance-based budgeting funds and therefore were included in the study. References to the Florida community colleges include Miami Dade College and St. Petersburg College.

Data Submission Guidelines

The Florida Department of Education website provides the guidelines and procedures that promote coordination and consistency among community colleges in the submission of data (Florida Department of Education, Office of Accountability, Research, and Measurement, 2007). The procedures provide a consistent set of processes within the Community College Division relative to college reports, requests, and college audits. Changes to the manual are recommended through the Division of Community Colleges. All Florida community colleges are required to submit data to the Florida Department of Education. At the Florida Department of Education's Office of Accountability, Research, and Measurement website (under community colleges and technical centers) are the Community College & Technical Center Management Information Systems (CCTCMIS) Community College Data Dictionaries. The data dictionaries, in Word or PDF format, include the chart of reports, data submission procedures, personnel database, annual personnel reports database,
facilities and capital outlay database, integrated database, FTE database, and admissions database. As outlined in the Florida Community College System documentation, Chapter 2, titled Data Submission Procedures (2007), three elements make up the process by which colleges submit data to CCTCMIS:

A period of time during which the CCTCMIS is prepared to receive data for a particular submission

A process by which colleges submit and verify their data through verification/exception reports. If data are unreasonable, colleges can resubmit and clean up all errors found during verification

A cut-off date

At the beginning of the reporting year, the colleges receive a time line indicating the start dates and cut-off dates for all CCTCMIS data submission periods. In addition, the colleges are notified in writing approximately one month before the due date for each data submission coordinated by the CCTCMIS. The CCTCMIS establishes the mechanisms by which colleges submit data for a particular submission. The mechanisms include: (a) instructions for using the Florida Information Resource Network (FIRN) for data transmission; (b) machine record formats that specify the order of data in the data submission package; (c) programs for the colleges to run to initiate any trigger files required to process college data; (d) programs that generate reports (if applicable) for the colleges to use in correcting data that failed one or more edit criteria; and (e) programs that generate appropriate verification reports (if applicable) when the data pass the edit criteria, to help colleges analyze the accuracy of their data. After the Submission Period Start Date, colleges begin sending their data for processing.
Full-time Equivalent

The full-time equivalent (FTE) data are calculated using the Student Semester Hours (SSH) and Instructional Clock Hours (ICH). Prior to 1991-1992, the colleges submitted the data as SSH or Credit Hour Equivalents (CHE) on paper forms (FA2, FA3, and FA4). Beginning with 1991-1992, the SSH and ICH are reported in the Student Data Base. The reporting year for FTE is summer, fall, and winter/spring. All FTE data are non-weighted. The number of FTE students for the community college program fund is the college credits for which students register divided by 30, plus the hours of instruction for which students register in other instruction divided by 900. The FTE enrollments are counted and reported in the term in which the course begins (when students register). If a course begins in one reporting term or year and ends in another, student semester hours or credit hour equivalents are reported in the term and year in which the course begins. It is noted that the community college funding model uses data from the previous 2 years in the distribution of the performance-based funds. This 2-year lag time allows the community colleges to make adjustments to ensure a more accurate data submission.

Funding Categories

The funding categories are defined as general revenue (community college program fund, CCPF), lottery (CCPF), and performance-based budgeting. Performance-based budgeting is further divided into incentive funds (time to degree and college preparatory completion) and performance funds (associate of arts degrees, post-secondary adult vocational certificates, apprenticeship occupational completion points, adult high school diplomas, and adult literacy completion points). This study focused on AA measures, which are aggregated as either AA programs or college preparatory completion.
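The FTE formula in the Full-time Equivalent section (college credit hours divided by 30, plus clock hours of other instruction divided by 900) can be sketched as a one-line calculation; the function name is illustrative, not part of the CCTCMIS specification.

```python
def fte(college_credit_hours, instructional_clock_hours):
    # One FTE = 30 college credit hours per year, or 900 clock hours
    # of other (non-credit) instruction per year.
    return college_credit_hours / 30 + instructional_clock_hours / 900
```

For example, a student registering for 15 college credits and 450 clock hours of other instruction counts as 1.0 FTE (0.5 + 0.5).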
Acquisition of Study Data

Each year the Florida Community College System generates a Performance Funding Report, which is sent to community college presidents and chief financial officers. The report includes an explanation of the performance-based budgeting measures used and how the funds were allocated based upon data from the previous year. Edward L. Cisek, vice chancellor for Financial Policy, Division of Community Colleges at the Florida Department of Education, was contacted about receiving copies of the Florida Community College System performance funding reports, and he was queried regarding the availability of data in electronic format. Patricia Windham, associate vice chancellor for Evaluation for the Division of Community Colleges, Florida Department of Education, was contacted via telephone and asked if performance-based budgeting data, from its inception in 1996-1997 through 2005-2006, could be released for use in a proposed dissertation project at the University of Florida. The files, in a mixture of Lotus and Excel formats, were sent by mail to the investigator on a CD-R. The data are available by fiscal year as a system, by individual college, and by performance-based budgeting measures and sub-components. The data files included the performance-based budget distribution from 1996-1997 through 2005-2006, the appropriation history from 1980-1981 through 2005-2006, the FTE enrollment (funded) from 1981-1982 through 2004-2005, and the operating budget from 1981-1982 through 2004-2005. The other main source of data is the Department of Education Fact Book, a report on the Florida community college system, which is available to the public via the Department of Education website. The information format is such that individuals cannot be directly or indirectly identified. The University of Florida Institutional Review Board has reviewed and approved this proposal.
Because only existing data will be used, this proposal is exempt from further review by the Institutional Review Board.
Data Analysis

The focus of the study was a series of trend analyses comparing inputs, such as the overall budget and the performance-based budget, with outputs, such as the number of AA degrees, for 1996-1997 to 2005-2006. For each section of the review, the approach began with the overall system, progressing to analyses of specific college groups, and then to the specific colleges.

To address research Question 1 (What has been the overall change to percentage of funding by category as a result of the implementation of Florida community college performance-based budgeting?), the study began with a graph of the Florida Community College System's total operating budget (including the amount from the general revenue, the Florida Lottery, performance-based budgeting, and workforce funding), as compared to the amount of the performance-based incentive funding from 1996-1997 through 2005-2006, using a double Y-axis scale. Specifically, as the total operating budget changed by year, the question is: What was the relationship between the amount of the performance-based incentive budgeting from 1996 to 2005 and the total operating budget? Then for each year the question is: What was the performance-based incentive budgeting amount as a percentage of the total operating budget from 1996-1997 through 2004-2005? Was the percentage of performance-based incentive funding consistent, or did variations exist?
Then, using the operating budget, which includes the general revenue, the Florida Lottery funds, and the student fees, a stacked column graph for the system was generated depicting the amount of general revenue, Florida Lottery funds, and student fees as part of the operating budget for the time period 1997-1998 through 2005-2006. A graph was constructed for 1997-1998 through
2005-2006, looking for any relationship among the four lines of the general revenue per FTE, the Florida Lottery funds per FTE, the student fees per FTE, and the total fees per FTE. The final assessment at the overall level was a comparison of the total operating budget (including the amount from the general revenue, the Florida Lottery, performance-based budgeting, and workforce funding) for 1996-1997, converted to 2004 dollars. The budget allocation was evaluated first by U.S. retail price inflation and then by U.S. wages inflation, using a free inflation calculator available on the Internet (Halfhill, 2007). With inflation, the question is: Will the budgetary allocations maintain the purchasing power for the colleges?

To address research Question 2 (Utilizing full-time equivalent [FTE] as a leveling approach, what are the differences in performance-based budgeting trends by individual Florida community colleges?), a graph was generated comparing the system's total operating budget with the student FTE enrollment (funded) for 1996-1997 through 2005-2006, using a double Y-axis scale. As the enrollment fluctuated, the question is: Did the overall budget fluctuate with the changes in enrollment? As FTE enrollment increased, the question is: How did the performance-based incentive budget compare with the changes in FTE enrollment? A graph was generated comparing the performance-based incentive budget with the FTE enrollment (funded). An analysis was done comparing the FTE enrollment (funded) with the unduplicated headcount enrollment for the time period 1996-1997 through 2004-2005, with a follow-up graph comparing the total operating budget versus the FTE enrollment (funded) versus the unduplicated student headcount for the time period 1996-1997 through 2004-2005.
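Restating a 1996-1997 budget in 2004 dollars amounts to scaling by a price-index ratio. The sketch below uses made-up index values purely for illustration; the study itself relied on Halfhill's (2007) online calculator, not on these numbers.

```python
def adjust_for_inflation(amount, index_base_year, index_target_year):
    # Restate an amount from the base year in target-year dollars
    # by the ratio of the two price-index values.
    return amount * (index_target_year / index_base_year)

# Hypothetical index values (not actual CPI or wage data):
budget_1996 = 100_000_000.0
in_2004_dollars = adjust_for_inflation(budget_1996,
                                       index_base_year=100.0,
                                       index_target_year=121.0)
```

If the nominal 2004 allocation falls below the adjusted figure, purchasing power has declined even though the nominal budget may have grown.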
Using the FTE enrollment (funded) by college for the time period 1995-1996 through 2005-2006, the community colleges were ranked from lowest to highest in cumulative FTE enrollment (funded). The community colleges were then ranked in order for 2002, 2003, and 2004 to ascertain if more recent activity would change the ranking order of the colleges. The 28 colleges were divided into groups defined by natural breaks in the enrollment. Group 1 includes those community colleges whose annual FTE enrollment is less than 4,500. Group 2 includes those community colleges whose annual FTE enrollment is between 4,501 and 10,000. Group 3 includes those community colleges whose annual FTE enrollment is between 10,001 and 15,000. Group 4 includes those community colleges whose annual FTE enrollment is between 15,001 and 20,000. Group 5 includes the one community college whose annual FTE enrollment is so much greater than any other community college that it is in a grouping by itself.

A graph was generated showing the cumulative performance-based budget funds by college from 1996-1997 through 2004-2005. As a follow-up, the data were then graphed by sorting the community colleges from lowest to highest based upon the FTE enrollment (funded). For each of the colleges in the FTE enrollment (funded) groupings, for the time period 2000-2001 through 2004-2005, a graph depicted the distribution of the performance-based funds. The review focused on any similarities or differences, with changes over time of the percentage categories, relative to the FTE enrollment (funded) groupings of the community colleges. The data were then graphed comparing the performance-based incentive funding per FTE enrollment (funded) versus the FTE enrollment (funded) over time for the total system for the time period 1996-1997 through 2005-2006. The data were then analyzed by looking at each college within its FTE enrollment
(funded) group, focusing on similarities and differences of the performance-based budget per FTE enrollment for 2000-2001 through 2005-2006.

To address research Question 3 (Is there a measurable improvement in the Florida community colleges resulting from performance-based budgeting for selected associate of arts measures?), a series of analyses was completed for each of the three AA measures. For the time period 1996-1997 through 2004-2005, a bar graph was constructed showing the total AA measure points. A line graph was constructed comparing the system's total points for AA Measure I, AA Measure II, and AA Measure III with the FTE enrollment (funded). Specifically, as enrollment increased, the question is: Did the number of points increase? As a follow-up, a graph depicted the total AA measure completion points for Measure I, Measure II, and Measure III for the time period 1996-1997 through 2005-2006. As the enrollment fluctuated, the question is: Did the AA measure total points generated follow a similar fluctuation? If the performance challenge of the accountability plan for the Florida Community College System (Strategic Issue II) was to increase the number of AA completers, the question is: Has a change occurred, relative to the AA measure points, with the implementation of performance-based budgeting? Using the ranking of community colleges, a graph for each of the three AA measures for the time period 2000-2001 through 2005-2006 was produced, focusing on any trends over time by college and by grouping of colleges. Grouping the community colleges by FTE enrollment (funded) from lowest to highest, the information was used to determine if community college size, defined by FTE enrollment, showed any patterns relative to the total AA measure points. Specifically, has there been a change in the pattern by individual community college, or are there similarities or differences by FTE enrollment groupings?
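The five FTE enrollment groupings defined earlier in this chapter can be sketched as a simple banding function. Group boundaries come from the natural breaks described above; the example college figures are taken from Table 3-1.

```python
def enrollment_group(annual_fte):
    """Assign a college to one of the five enrollment groups used in the study."""
    if annual_fte <= 4_500:
        return 1   # smallest colleges
    elif annual_fte <= 10_000:
        return 2
    elif annual_fte <= 15_000:
        return 3
    elif annual_fte <= 20_000:
        return 4
    else:
        return 5   # the single college far larger than any other

# 2005-2006 FTE enrollment (funded) figures from Table 3-1:
colleges = {"Chipola": 1_661.6, "Santa Fe": 11_514.7, "Miami Dade": 50_447.4}
groups = {name: enrollment_group(fte) for name, fte in colleges.items()}
# Chipola falls in Group 1, Santa Fe in Group 3, Miami Dade in Group 5
```

Banding the colleges this way makes within-group comparisons of performance-based budget per FTE straightforward, since each group contains colleges of broadly similar size.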
The last review for this section was the analysis of the unduplicated student headcount per FTE enrollment (funded) ratio for 2005-2006.

To address research Question 4 (How do Florida community colleges use performance based budgeting funds?), the final phase of the study focused on how the performance based incentives are utilized. Random sampling was used to select representative individuals and then generalize from these individuals to the population. Due to the geographic distance of the community colleges, a telephone interview was set up via email with the college president or chief financial officer relative to how the performance based budgeting funds are utilized. The specific questions were how the funds are used and whether, in their view, the percentage of overall funding should be more closely linked to outcomes. The rationale for the interviews was to gather firsthand perspectives on performance based budgeting and to gain insight into how the funds are used. The performance based budgeting funds are not reported separately in the budget data; these funds are included in the general revenue category. Currently, no stipulation indicates that the funds must be used in a certain manner.

The scope of this study was limited to the AA measures since this category comprises 52% of the cumulative performance based budgeting funding from 1996 to 2005. Specifically, the analysis examined whether the pattern by college, when ranked by FTE enrollment (funded), was significantly different when looking at a portion of the funds allocation (the AA measures) versus the total allocation. For the total relative index data, 2005-2006 data were used since they most closely relate to the current community college experience.
Table 3-1. Florida Community Colleges

College | Main location | Campuses and centers | Year established | FTE enrollment (funded) 2005-2006 | Annual unduplicated student headcount 2005-2006
Brevard | Cocoa | 6 | 1960 | 10,036.2 | 25,713
Broward | Ft. Lauderdale | 11 | 1960 | 22,219.5 | 52,684
Central Florida | Ocala | 5 | 1957 | 4,578.2 | 18,520
Chipola | Marianna | 1 | 1948 | 1,661.6 | 5,209
Daytona Beach | Daytona Beach | 6 | 1958 | 11,794.5 | 27,911
Edison | Ft. Myers | 4 | 1962 | 7,090.7 | 17,111
Florida Community College at Jacksonville | Jacksonville | 12 | 1966 | 19,618.5 | 64,493
Florida Keys | Key West | 1 | 1966 | 771.5 | 2,814
Gulf Coast | Panama City | 5 | 1957 | 4,722.9 | 22,140
Hillsborough | Tampa | 4 | 1968 | 16,395.0 | 43,915
Indian River | Fort Pierce | 10 | | 11,968.2 | 35,928
Lake City | Lake City | 1 | 1932 | 2,381.2 | 7,198
Lake Sumter | Leesburg | 3 | 1962 | 2,312.4 | 6,581
Manatee | Bradenton | 3 | 1958 | 6,629.1 | 20,036
Miami Dade | Miami | 8 | 1960 | 50,447.4 | 132,060
North Florida | Madison | 1 | 1958 | 1,009.6 | 3,175
Okaloosa Walton | Niceville | 6 | 1964 | 4,738.1 | 12,841
Palm Beach | Lake Worth | 5 | 1933 | 15,405.6 | 47,572
Pasco Hernando | New Port Richey | 4 | 1972 | 5,282.1 | 13,209
Pensacola | Pensacola | 3 | 1947 | 7,932.5 | 20,288
Polk | Winter Haven | 3 | 1965 | 4,636.1 | 18,471
St. Johns River | Palatka | 6 | 1958 | 3,687.0 | 9,296
St. Petersburg | St. Petersburg | 12 | 1947 | 15,304.2 | 47,694
Santa Fe | Gainesville | 6 | 1966 | 11,514.7 | 22,897
Seminole | Sanford | 5 | 1966 | 10,646.1 | 30,374
South Florida | Avon Park | 4 | 1966 | 3,046.0 | 7,624
Tallahassee | Tallahassee | 6 | 1967 | 11,012.6 | 27,381
Valencia | Orlando | 7 | 1967 | 20,872.4 | 50,382
TOTAL | | | | 287,713.9 | 793,517

Note: The data in columns 3 and 4 are from individual community college websites. The data in columns 5 and 6 are from the Florida Community College System January 2007 Fact Book.
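Two quantities used throughout the analysis can be recomputed directly from Table 3-1: the FTE enrollment grouping and the unduplicated headcount per FTE ratio. A minimal sketch; the group thresholds are those stated in the methods, and the dissertation's "natural breaks" place a few colleges just above a boundary into the lower group, which a strict threshold test will not reproduce:

```python
def fte_group(fte):
    """Enrollment group by the thresholds stated in the methods chapter.
    The stated boundaries are approximate 'natural breaks', so a few
    borderline colleges are grouped differently in the dissertation."""
    if fte < 4_500:
        return 1
    elif fte <= 10_000:
        return 2
    elif fte <= 15_000:
        return 3
    elif fte <= 20_000:
        return 4
    else:
        return 5  # the single outlier college

def headcount_per_fte(headcount, fte):
    """Unduplicated headcount per funded FTE; a higher ratio implies
    a larger share of part time students."""
    return round(headcount / fte, 2)

# 2005-2006 values from Table 3-1:
print(fte_group(1_661.6))                   # Chipola -> 1
print(fte_group(50_447.4))                  # Miami Dade -> 5
print(headcount_per_fte(22_897, 11_514.7))  # Santa Fe -> 1.99
print(headcount_per_fte(22_140, 4_722.9))   # Gulf Coast -> 4.69
```

The two ratios printed here match the extremes (1.99 and 4.69) reported later in the analysis of Table 4-2.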
Table 3-2. Florida Community College Overall Funding Formula

Funding | Purpose
Direct instructional funding | Instructional faculty funding and the instructional support funding
plus Academic support funding | Services to help support and supplement the instructional programs provided by the college, such as computer labs, academic administration, curriculum development, and support
plus Library funding | Includes library materials, library technology, library staffing, and library operational expenses
plus Student services funding | Student services to assist students in pursuit of their educational goals and objectives. These support services include registration and record keeping, counseling and advising, administration of financial aid, assistance to the disabled, and placement services
plus Special projects funding | Historical appropriations for unique services provided at eight of the colleges
plus Technology funding | Internet bandwidth, computer labs, disk space for student coursework, classroom media technology, wireless networks, and administrative systems
plus Institutional support funding | Functions or services that support basic operations, such as human resources, accounting and finance, and purchasing
plus Physical plant operations & maintenance funding | Building and equipment maintenance, police and campus security services, grounds operations and maintenance, utilities, facilities planning, and custodial services
plus District cost differential (DCD) funding | The district cost differential factor is an effort to equalize funding based on differing costs of living for employees
equals Total calculated funding |
minus Standard fee revenues |
minus Projected public education capital outlay (PECO) maintenance |
equals Funding Formula State Support |

Note: Additional funding from performance based incentives and performance based program funding are defined as incentive measures (efficiency in terms of time to degree and college preparatory programs) and performance funds. The measures and amounts allocated vary by year.

Note: The explanation of the Community College Funding Model is from the Community College Funding Model prepared by the Community College Budget Office, Florida Department of Education.
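The formula in Table 3-2 is additive: sum the component funds, apply the district cost differential, then subtract fee revenues and projected PECO maintenance. A sketch with hypothetical dollar amounts; applying the DCD as a simple multiplier is an assumption about the model's mechanics, and every figure below is invented for illustration:

```python
def funding_formula_state_support(component_funds, dcd_factor,
                                  standard_fees, projected_peco):
    """State support per the structure of Table 3-2: total calculated
    funding minus fee revenues and PECO maintenance. Treating the DCD
    as a multiplier on the component total is an assumption."""
    total_calculated = sum(component_funds.values()) * dcd_factor
    return total_calculated - standard_fees - projected_peco

components = {  # hypothetical amounts, in dollars
    "direct_instruction":    40_000_000,
    "academic_support":       6_000_000,
    "library":                2_000_000,
    "student_services":       5_000_000,
    "special_projects":               0,
    "technology":             3_000_000,
    "institutional_support":  7_000_000,
    "plant_om":               6_000_000,
}
print(funding_formula_state_support(components, dcd_factor=1.02,
                                    standard_fees=18_000_000,
                                    projected_peco=1_500_000))
```

Performance based incentive and program funds sit outside this calculation, which is why the dissertation treats them as an addition to, rather than part of, the base allocation.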
CHAPTER 4
ANALYSIS OF DATA

Overview

This dissertation studied the trends in Florida community colleges relative to performance based budgeting funding, focusing on the associate of arts measures. This chapter presents a summary of the findings related to the four research questions:

1. What has been the overall change to the percentage of funding by category by college as a result of the implementation of Florida community college performance based budgeting?
2. Utilizing full time equivalent (FTE) enrollment as a leveling approach, what are the differences in performance based budgeting trends by individual Florida community colleges?
3. Does Florida community college performance based budgeting result in measurable improvement for selected associate of arts measures?
4. How do Florida community colleges use performance based budgeting funds?

As shown in Table 4-1, the Florida total calculated funding model includes a series of funds to be added and subtracted in order to determine the total calculated funding. The performance based budgeting funding is comprised of different components that can vary by year. For example, from academic years 2000-2001 through 2004-2005, the programs and incentives included the associate of arts, college prep, and time to degree. Starting in academic year 2005-2006, the components included the associate of arts, college prep, time to degree, and workforce. College credit performance measures include degrees and certificates, and they target student populations (economically disadvantaged, disabled, Black males, and English as a second language (ESL)/English for Nonnative Speakers (ENS)). For the analyses, the focus was on total operating budget, FTE enrollment, and unduplicated student headcount.
Research Question 1

Total Operating Budget vs. the Performance based Budget

In reviewing the performance based data, the operating budget (the sum of the general revenue, Florida Lottery funds, and student fees) was used in the analysis. As Figure 4-1 shows, the total operating budget from 1996 through 2005 had no relationship with the amount allocated for the performance based budget. With the performance based budget remaining constant at $7,674,374 from 2001-2002 through 2004-2005, the end result for a college could be achieving more points but receiving fewer dollars. Since the performance based budget appropriation is decided each year, other economic factors and priorities often are the deciding criteria in the budgetary amount. In order for the measures to be a priority, the funding attached to them must be large enough to generate a meaningful incentive.

As shown in Figure 4-2, the percentage of performance based budgeting, as part of the total operating budget, fluctuated from .47% to 1.32%, a very small component of the total operating budget. From 1999-2000 to 2004-2005, the percentage of the performance based budget as part of the total operating budget declined. Depending on one's perspective, the value of a million dollars is subjective. The question is: How much effort will a community college place on such a small portion of its budget? With performance based budgeting amounts remaining stagnant for several years and having no relationship with the total operating budget growth, what priority is placed on the measures associated with the performance based budget?

Components of the Operating Budget

Figure 4-3 shows how student fees, Florida Lottery funds, and general revenue from 1997 through 2005 have changed as components of the operating budget. Analyzing the trend from 1997 through 2005, the student fee component occupies an increasing percentage of the
operating budget, starting at 24% in 1997 and ending at 32% in 2005. Viewing this figure in conjunction with Figure 4-4 for the same time period, total fees per FTE and general revenue per FTE have similar patterns: an initial increase, a decrease in 2000-2001, a period of almost a flat line, and then, in 2004-2005, the beginning of an elevation. The Florida Lottery funds per FTE have remained almost constant, but the student fees per FTE have maintained an upward slope from 1997-1998 to 2005-2006. The general revenue per FTE began at $2,657 in 1997 and reached $3,077 in 2005. Florida Lottery funds per FTE in 1997 were $462.34 and declined to $345.43 in 2005. Student fees per FTE were $1,009 in 1997 and reached $1,594 in 2005.

Comparisons of budget increases can be misleading if inflation factors are not taken into consideration. Using the 1996 total operating budget of $911,970,588, an inflation calculator was used to determine if the community college system has maintained its purchasing power by converting that amount into 2005 dollars. In 1996, the funding per FTE was $3,921; in 2005, the funding per FTE was $4,923. The 1996 figure converted to 2005 dollars would be $4,707 using the U.S. wage inflation index and $4,730 using the U.S. retail inflation index. By either index, purchasing power was maintained.

Research Question 2

FTE Enrollment

Students are the heart of community colleges, and the Florida community college system counts students in two ways: FTE enrollment (funded) and unduplicated headcount enrollment. According to the explanations in the Florida Department of Education Florida Fact Book (2006), FTE enrollment (funded) refers to the student semester hours divided by 30 for advanced and
professional, postsecondary vocational instruction, and college prep. For all other instruction, instructional clock hours are divided by 900 hours. Adults with disabilities are usually grant funded, so their funding can fluctuate from year to year, and this designation impacts how the subpopulation is accounted for in the reports. For academic years 1996-1997 and 1997-1998, the fund distribution used data from the previous year; from 1999 forward, the fund distribution used data from the previous 2 years of data collection.

Figure 4-5 compares the trend from 1996 to 2005 of the operating budget versus FTE enrollment (funded). For the 10-year period, the total operating budget increased, but FTE enrollment started with a rather flat slope, had a sudden increase, and is now leveling off. Figure 4-6 compares the changes from 1996-1997 to 2005-2006 of the performance based incentive funding with student FTE enrollment. Performance based budgeting is used to add funds to the system without changing the base allocation. Looking at the graph, the funding amount does not appear to have any relationship with FTE enrollment for the system, anticipated or actual.

The unduplicated headcount enrollment refers to the unduplicated count of students served by each college, excluding recreation and leisure students. Figure 4-7 shows the comparison of FTE enrollment (funded) versus the unduplicated headcount enrollment, dramatizing the difference in the populations served depending on how the student count is assessed. The greater the difference between the unduplicated student headcount and the FTE enrollment (funded), the more part time students a college serves. Part time students may be pursuing a degree or certificate but may take double the amount of time to complete the program as compared with a full time student. Some students also attend community colleges without the intent of completing a degree, but take a small number of courses to achieve their very specific goals
(Bailey, Leinbach, & Jenkins, 2006). Community colleges have an open door enrollment policy and have limited control relative to the number of classes a student might take. While this serves the colleges' open access mission, it creates a challenge for the performance based budget.

Figure 4-8 shows the comparison between the total operating budget, the unduplicated headcount, and FTE enrollment (funded). This graphic display shows that while FTE enrollment (funded) remained almost constant in the late 1990s, the unduplicated student count decreased. When FTE enrollment (funded) started to increase in 2000-2001, the unduplicated headcount increased.

Cumulative Performance based Budget Funds by College

Figures 4-9 and 4-10 show the cumulative performance based budget funds by college from 1996 to 2005, first in alphabetical order and then by FTE enrollment (funded) groupings. Figure 4-9 presents the wide fluctuations by community college relative to the cumulative distribution of the funds. Figure 4-10 uses the same data and shows the relationship among college size, FTE enrollment (funded), and the cumulative distribution of funds. As a general trend, the greater the FTE enrollment (funded), the greater the performance based funds allocated. The cumulative performance based budget funds from 1996 to 2005 for seven community colleges (Florida Community College at Jacksonville, South Florida, Seminole, Indian River, Daytona Beach, Palm Beach, and Hillsborough) were below the levels of their counterparts in the FTE enrollment (funded) groups. Valencia Community College was above the level of cumulative funds within its group, and Miami Dade College received the most funds during the time period.

Performance based Incentive Funding by College

Figures 4-11 through 4-15 show the percentage of performance based incentive funding from 2000 to 2005 for each college within the five FTE enrollment (funded) groups. Figure 4-11
focuses on those in Group 1 and generally shows that the greater the FTE enrollment (funded), the more funds the college receives. Looking at the trends by college for the time period 2000 through 2005, 7 of the 10 colleges experienced a decrease in their percentage of the total performance based budget allocation. Florida Keys, Lake City, and South Florida community colleges showed an increase in 2005; however, each of these colleges receives less than 1% of the total amount. As a group, these 10 community colleges collectively received 10.4% to 11.5% of the allocation.

Group 2, in Figure 4-12, also shows a trend that the greater the FTE enrollment (funded), the greater the percentage of the college's share of the performance based budget. Seminole is the exception to the trend, with a lower percentage of funds than those community colleges with similar FTE enrollment (funded). Seminole Community College showed an increase in the percentage of funds, going from just over 2% in 2000 to not quite 4% in 2005. Gulf Coast Community College experienced an almost flat percentage from 2000 to 2005, with a range of 1.8% to 1.7%. Pasco Hernando, Pensacola, and Seminole community colleges showed an increase in the percentage of funds for 2005, but the remaining colleges showed a decrease. Brevard Community College remained at 4.5% from 2002 to 2005, the most consistent of all 28 community colleges. As a group, these nine community colleges collectively received 28.4% to 29.6% of the allocation.

Figure 4-13, or Group 3 by FTE enrollment (funded), reflected a different pattern for each of the five colleges in the group. This grouping did not follow the pattern of the greater the FTE enrollment (funded), the greater the percentage of funds. Indian River Community College from 2000 to 2004 experienced an almost constant percentage but, in 2005, had an increase, going from 2.5% in 2004 to 3.9% in 2005.
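The group shares reported in these figures are each college's allocation divided by the statewide performance based budget total. A sketch with invented college allocations against the constant 2001-2004 statewide total of $7,676,372:

```python
def allocation_shares(college_funds, total_allocation):
    """Each college's share of the statewide performance allocation,
    expressed as a percent. College dollar amounts here are invented."""
    return {college: round(100 * amount / total_allocation, 1)
            for college, amount in college_funds.items()}

# Hypothetical allocations against the constant 2001-2004 statewide total:
shares = allocation_shares(
    {"College A": 345_000, "College B": 76_000},
    total_allocation=7_676_372)
print(shares)
```

Because the shares must sum to 100%, one college's gain is necessarily another college's loss, which is why a college can improve its performance yet still see its percentage fall.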
Daytona Beach Community College showed a slight increase and then a decrease, but in 2005 it started to show an increase in percentage. The range
of percentages was 2.9% to 3.8%. St. Petersburg College showed a slight increase from 2000 to 2001 and then experienced a declining percentage for 2003, 2004, and 2005, with the percentage at 7.0% in 2001 but declining to 5.5% in 2005. Palm Beach Community College showed an increase in the percentage from 2000 to 2002, a decrease in 2003 followed by an increase in 2004, and then its lowest percentage of the 5 years in 2005. Hillsborough Community College had a decrease in percentage in 2001, a gradual increase from 2001 through 2004, and a decline again in 2005. As a group, these five community colleges collectively received 22.5% to 23.6% of the allocation.

Figure 4-14, or Group 4 by FTE enrollment (funded), shows a different pattern for each of the three colleges in the group. Valencia Community College from 2000 to 2003 showed a steady increase in the percentage of funds but saw a decline in the percentage for 2004 and 2005. For 2003, the percentage was 11.2% with a dollar value of $869,568; for 2005, the percentage was 9.2% with a dollar value of $1,671,197. The percentage decreased but the dollars increased due to the change in the performance based budgeting allocation from 2003 ($7,676,374) to 2005 ($18,078,001). Florida Community College at Jacksonville experienced a slight increase in percentage of funds from 2000 to 2001, followed by a decrease in 2002 and a plateau for 2003 and 2004. The 2005 data reflect the largest percentage of funds during the 6-year time period. Broward Community College experienced an increasing percentage of the funds from 2000 to 2005. As a group, these three community colleges collectively received 21.8% to 25.1% of the allocation.

Figure 4-15, or Group 5, shows that Miami Dade College continued the trend that the greater the FTE enrollment (funded), the greater the percentage of the allocation, but not a consistent pattern of percentage. Miami Dade College received 12.8% to 14.8% of the allocation.
Figures 4-12 through 4-15 do not show if an increase or decrease is due to a change
at the community college reflecting a change in performance, or if the other community colleges are taking a different share of the distribution.

Figure 4-16 compares the performance based incentive funds per FTE enrollment (funded) versus the FTE enrollment (funded). From 1996 through 1999, no discernible relationship can be seen between the two figures. From 2000 to 2004, the FTE enrollment increased and the performance based funding per FTE continued to decrease. In 2005, an increase occurred in the funding per FTE, but the FTE enrollment started to decline. This graph suggests that the performance based funding allocation for the system is decided upon without using FTE enrollment (funded) as a consideration. If the amount of performance based budget funding remains constant but the number of completion points increases, the value per point declines. If the community colleges strive to increase the number of points possible from the AA measures, the colleges could actually see less funding. It can be seen that the funding per FTE enrollment (funded) is impacted by the performance based budget allocation. In 2005-2006, the allocation changed from $7,676,371 to $18,075,996 (Table 5-1).

Performance based Budget per FTE

Figures 4-17 through 4-21 depict the performance based budget per FTE by college within the FTE enrollment (funded) groupings for the time period of 2000-2001 through 2005-2006. The updated FTE enrollment (funded) data from the Florida Department of Education Fact Book (2006 and 2007) were used in the calculation of the ratio since these data were more accurate than the initial performance based budget information received.
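The dilution just described is a simple division: with the allocation held constant, every additional completion point lowers the dollar value of each point. A sketch using the systemwide point totals cited in this chapter:

```python
def dollars_per_unit(allocation, units):
    """Dollar value per completion point (or per FTE) for a fixed allocation."""
    return round(allocation / units, 2)

ALLOCATION = 7_676_372  # constant statewide allocation, 2001-2004

# As systemwide completion points grow, each point is worth less:
print(dollars_per_unit(ALLOCATION, 90_286))   # ~2002 point total -> 85.02
print(dollars_per_unit(ALLOCATION, 102_793))  # ~2005 point total -> 74.68
```

The same division against FTE enrollment yields the performance based budget per FTE ratios examined in Figures 4-17 through 4-21.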
For all community colleges, 2001 reflected a decrease in the performance based budget per FTE, which may be related to the decrease in funding from $8,320,831 in 2000 to $7,674,372. In 2005, all colleges reflected an increase in the performance based budget per FTE, but that is more related to the increase in funding from $7,676,372 for 2001 to 2004 to $18,078,001 for 2005. For 2001
through 2004, the budget amount remained constant at $7,676,372.

In Figure 4-17, the predominant pattern for those community colleges in Group 1 is a steady decrease from 2000 to 2004: the FTE enrollment (funded) increased while the performance based budget was constant. North Florida Community College experienced the smallest range for the group, $27.76 to $31.77. Three of the community colleges (North Florida, Lake Sumter, and Okaloosa Walton) showed an increase in the performance based budget per FTE in 2004. Due to the large increase in funds available in 2005-2006, all colleges reflected an increase in the budget per FTE.

In Figure 4-18, or Group 2, the overall pattern showed a decline from 2000 to 2001, but then the dollar value appeared to level off. Gulf Coast, Pasco Hernando, and Tallahassee community colleges have similar patterns in that they experienced a decline during the 2000 to 2004 time period. Manatee Community College declined in the dollar value from 2000 to 2004, but its pattern is a decline, holding that level for 2 years, and then a decline again. Edison Community College's dollars per FTE are very erratic from 2000 to 2002, followed by a decline in 2003 and 2004. Four of the community colleges (Pensacola, Brevard, Santa Fe, and Seminole) experienced a decline in the dollars per FTE from 2000 to 2001, but the remaining years have remained almost constant.

Figure 4-19, or Group 3, reflects an overall decrease in the performance based budget per FTE from 2000 to 2001, but the patterns vary from 2001 to 2004. Indian River Community College's pattern was similar to Manatee Community College's: a decline in the dollars per FTE from 2000 to 2004, with the pattern being a decline, holding that level for 2 years, and then a decline again. Daytona Beach Community College and St. Petersburg College both experienced a decline in the dollars per FTE from 2000 to 2004.
St. Petersburg's range is from $46.07 in 2000 to $29.08 in 2004, whereas Daytona Beach Community College's decline was from $21.28 in 2000 to $17.28 in 2004. Palm Beach and Hillsborough community colleges experienced a decline in performance based budget per FTE from 2000 to 2001, but from 2002 to 2004, the dollar amount remained almost constant. For Palm Beach Community College, the ratio was $29.74 for 2002, $27.65 for 2003, and $29.31 for 2004. For Hillsborough Community College, the ratio was $24.27 for 2002, $23.12 for 2003, and $23.84 for 2004.

Figure 4-20, or Group 4, depicts a different pattern for each college for the performance based budget per FTE. Valencia Community College experienced a decline from 2000 to 2001, but it is the only college in all of the groupings that has shown an increase in the dollar value from 2001 to 2004. The performance based budget per FTE was $35.19 for 2001, $38.10 for 2002, $40.42 for 2003, and $41.07 for 2004. Florida Community College at Jacksonville experienced a decline in the dollar value from 2000 to 2001, but the value is relatively constant from 2002 to 2004: the ratio was $21.92 for 2002, $21.13 for 2003, and $22.36 for 2004. Broward Community College experienced a decline from 2000 to 2002 and then an increase from 2003 to 2004. For 2000 to 2004, the performance based budget per FTE at Broward Community College remained fairly level, with ratios of $28.82 in 2000, $26.05 in 2001, $24.32 in 2002, $25.65 in 2003, and $26.95 in 2004.

Figure 4-21, or Group 5, shows Miami Dade College with a pattern very similar to the Indian River and Manatee community college patterns: a decline in the dollars per FTE from 2000 to 2004, holding a level for 2 years, and then declining again. The ratio was $30.60 for 2000, $21.59 for 2001, $20.89 for 2002, $18.00 for 2003, and $18.45 for
2004. Overall, by year, as the FTE enrollment (funded) increased, no consistent pattern occurred for the performance based budget per FTE.

Research Question 3

The column graph in Figure 4-22 shows the trend during the past 10 years of the total AA measure points for the system. From 1996 to 1997, a 30% increase occurred in completion points, the highest increase in the 10-year period, but from 1997 to 2001, the overall percentage of increase was 1%. The increase from 2001 to 2002 was 3%, from 2002 to 2003 was 11%, from 2003 to 2004 was 7%, and from 2004 to 2005 was 11%. It is important to note that the graph reflects the completion points by year, but the funding allocation is based upon completion points from the previous 2 years. For example, 2004 completion points are used to determine the performance based budget allocation for 2006-2007. Also, this study focused on performance funds; AA Measure IV was excluded since it is categorized as an incentive fund and not a performance fund.

Figure 4-23 compares AA measure completion points versus FTE enrollment (funded) for the time period 1996-1997 through 2005-2006. The FTE enrollment declined from 188,415 in 1996 to 185,773 in 1998 and then started to increase, reaching 297,795 in 2003. The FTE enrollment (funded) then started to decline, falling to 287,714 in 2005. The AA measure completion points increased from 41,793 in 1996 to 89,730 in 1999. A decrease occurred in completion points for 2000 and 2001, and then the completion points increased to 90,286 in 2002. From 2002 to 2005, the completion points increased from 90,286 to 102,793. As FTE enrollment increased, no corresponding increase occurred in completion points. A lag time occurred between enrollment and completion, but from 1999 to 2003, the degree of increase for FTE enrollment was not matched by the AA measure completion points.
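The growth rates and the two-year funding lag described above can be made explicit; a minimal sketch using point totals reported with Figure 4-23:

```python
def growth_pct(prev, curr):
    """Percentage growth between two annual completion point totals."""
    return round(100 * (curr - prev) / prev)

def funding_year_for(points_year):
    """Completion points earned in year t drive the allocation two years
    later; e.g., 2004 points determine the 2006-2007 allocation."""
    return points_year + 2

# Systemwide AA completion point totals cited with Figure 4-23:
points = {1996: 41_793, 1999: 89_730, 2002: 90_286, 2005: 102_793}

print(growth_pct(points[2002], points[2005]))  # cumulative growth, 2002 to 2005
print(funding_year_for(2004))                  # -> 2006 (the 2006-2007 allocation)
```

The lag means a college's budget reward always reflects performance two years in the past, one reason enrollment surges and point totals appear out of step in the figures.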
As a follow-up, Figure 4-24 compared AA Measures I, II, and III from 1996 through 2005. Measure I (completers) started to include dual enrollment in 2000-2001 and had a steady increase in points. Measure II (special categories) started to include Black males as a special category in 1998-1999 and showed slight fluctuation, but had a steady increase in points. Measure III had a change in measure components in 1998-1999 with the inclusion of partial completion points. In 1999-2000, partial completion points were deleted from Measure III, and job placements greater than $10 per hour were included. There was an increase in Measure III points from 1997-1998 to 1999-2000, with the peak in 2001-2003, a period of decline, and then an increase again in 2005-2006.

Figures 4-25 through 4-29 focus on the total AA measures for each college, with the colleges divided into five groups based upon reported FTE enrollment (funded) from 2000 through 2005. The comparison by FTE enrollment was done to level the playing field between colleges, to see if size was a factor in the generation of completion points. Figure 4-25 compares the completion points for those colleges in Group 1, with FTE enrollment of less than 4,500. For six of the community colleges (Florida Keys, North Florida, Chipola, St. Johns River, Polk, and Okaloosa Walton), as the FTE enrollment (funded) increased, the AA measure completion points increased. Four community colleges (Lake Sumter, Lake City, South Florida, and Central Florida) showed an increase, but not to a level congruent with the other community colleges in the same group. Three community colleges reported a decrease in completion points from 2004 to 2005: North Florida, St. Johns River, and South Florida. Figure 4-26 compares the completion points for those colleges in Group 2, with FTE enrollment between 4,501 and 10,000.
The graph shows that FTE enrollment (funded) increases with an increase in the number of AA completion points, with the exception of Seminole
Community College. While Seminole Community College generally increased in completion points, it was not at the same level as other community colleges closer to its FTE enrollment size. This relationship between FTE enrollment and AA measure completion points was not the same as identified in Group 1. Tallahassee, Pasco Hernando, Pensacola, Santa Fe, and Seminole community colleges all experienced a continued increase in the number of completion points. One college in the group was notable for its fluctuation in measure points: 3,119 points for 2000, 3,261 for 2001, 2,424.65 for 2003, 2,719.75 for 2004, and 3,144 for 2005.

Figure 4-27 compares the completion points of those community colleges in Group 3, with FTE enrollment (funded) between 10,001 and 15,000. While all community colleges showed an increase in completion points since 2000, they did not follow the same levels of increase as FTE enrollment increased, and they did not exhibit the same pattern. For all five community colleges, 2005 reflected an increase in the measure points. Two community colleges (Indian River and Daytona Beach) had completion points lower for all 6 years than the remaining colleges in the group, as well as lower than half of the colleges in Group 2, which have a lower FTE enrollment (funded). St. Petersburg College and Palm Beach Community College had an increase in completion points for 2002 through 2005.

Figure 4-28 compares the completion points of those colleges in Group 4, with FTE enrollment (funded) between 15,001 and 20,000. All three community colleges have a different pattern, and the grouping does not follow the pattern of the more FTE enrollment (funded), the more completion points generated. Valencia Community College has been the most consistent, with completion points ranging from 9,788 in 2000 to 10,907 in 2005. Florida Community College at Jacksonville showed growth in completion points from 2000 to 2003 with
a difference of 802 points, but saw more of an increase from 2003 to 2005, a difference of 1,111 points. Broward Community College showed a continued increase in AA measure points, with 7,216 points in 2000 and 11,034 in 2005, a difference of 3,818.

Figure 4-29 focuses on Miami Dade College, in Group 5 by itself since its FTE enrollment is so much greater than that of the other Florida community colleges. Miami Dade generated 13,169 points in 2000 and 18,726 by 2005. Miami Dade's points are 68% greater than the total AA measure points for all of Group 1, encompassing 10 colleges.

The last analysis for this section was a review of Table 4-2, comparing the ratio of unduplicated student headcount per FTE enrollment (funded). The current performance based budgeting measures are the same for all community colleges, and the analysis so far has revealed that community colleges are not all the same. Table 4-2 reveals that the unduplicated student headcount per FTE enrollment (funded) ratio ranges from 1.99 for Santa Fe Community College to 4.69 for Gulf Coast. The lower the ratio, the higher the proportion of full time students relative to part time students. Table 4-3 compares the two extremes of the ratio and shows that, for the cumulative performance based funding from 1996 to 2005, Santa Fe Community College received $4,663,741. As for measure points, from 2000-2001 to 2004-2005, Santa Fe Community College accumulated 23,502 and Gulf Coast accumulated 7,940, a difference of 15,562 points. This difference suggests that the more full time students, the more completers for the system, but that was not a consistent finding for all community colleges.

Research Question 4

Utilization of Performance based Budget Funds

An initial email was sent to the president or vice president of finance for all 28 community colleges, requesting a telephone interview to discuss how the performance based
budgeting funds are utilized. Of the 11 community college representatives who agreed to the interview, five were presidents, four were the equivalent of chief financial officers, and two held other titles, but all were members of the senior leadership. In order to encourage candor in their responses, the investigator agreed that no comments would be attributed to any individual or community college. The common theme in the responses was that the performance-based funds were included in Fund 1 accounts, also known as current unrestricted funds or operating budget, and the college can use its discretion as to how the funding is spent. Responders indicated the performance-based budget funds are used to fund salaries, pay for utilities, or simply pay the cost to continue to operate. One responder stated that while the college's share of the performance-based budget was not great, the college used the funds as seed money to establish new programs to meet the needs of its service area. Many responders voiced a similar comment: since the performance-based budget fund was such a low percentage of the total, it often was overlooked.

Viewpoints

One responder felt the measures did not address the issue of credit versus non-credit. If a community college is meeting the needs of its community, that need does not always equate to a credit course. Another responder pointed out that the concept of performance-based budgeting was supposed to be implemented at the university level, but that has not taken place. One responder expressed concern that changes to the current system might come at the expense of something else or place the college at more risk for its funding. Several responders reflected that the current measures judge community colleges on factors that the colleges have little control over relative to their students, yet those factors drive each college's distribution of the funds. One responder commented that he felt his college did extremely well
with the distribution of funding and acknowledged that his college focuses on areas that generate more funding, but it is careful not to lose sight of its mission. One responder indicated that after the college looks at the data, it is concerned with the outcomes, but not a great deal of interest was shown due to the small amount of the funds. One responder felt some of the variances in the data might be attributed more to the accuracy of reporting than to actual performance. One responder commented that the concept of performance-based budgeting was sold as extra money, but that really is not the case. Several responders also made vague references to the games some community colleges play, alluding to the fact that community colleges are supposed to have open enrollment, but some colleges do well because they have placed limits on enrollment. One college responder indicated that his college incorporated the performance measures into its strategic plan. Another responder shared that when his college tried to discuss the measures with the faculty, the faculty, through misunderstanding, grew fearful. One college responder indicated that program mix has a part in the distribution of the funds. One responder indicated that the current measures are good, but the community college has limited control over the outcomes. One responder stated there needs to be a greater incentive in order for more attention to be focused on the measures. When asked if they were familiar with the annual Performance Funding Report prepared by the Community College Office of Budget and Financial Services, their response was affirmative, and many responders commented on the importance of the report. However, the majority of responders did not articulate how the college actually used the report. One responder indicated that his college does not reference the report and does not have the space to store the
report. One responder commented that it would be beneficial to have an executive summary of the Performance Funding Report.

The study focused only on the AA measures and did not look at the other measures in performance-based budgeting. While not the focus of the study, Figure 4-30 ranked the community colleges by FTE enrollment (funded) groups and compared the total cumulative AA measure funding from 1996 to 2005 against the total relative index for 2005-2006 of all the performance-based budget allocations. With the exception of St. Petersburg College, the two lines are almost identical, suggesting that the study is reflective of the performance-based funding allocation by community college.

Limitations of the Study

The limitations of this study include that the review is restricted to those indicators reported by the community colleges to the state of Florida. This study analyzes the information reported during a certain period but not the accuracy of the information submitted to the state. Another limitation is that the components of the AA measures have changed over time, although the AA measures have been the most consistent of the performance-based measures. In the process of analyzing the data, differences in the data occurred based upon when the source reports were generated or which definition was used for the report components. For example, according to the explanations in the Florida Department of Education Florida Fact Book (2006), FTE enrollment (funded) refers to the FTE for advanced and professional, college preparatory, post-secondary vocational, apprentice, continuing workforce education, vocational preparatory, adult basic education, adult/GED preparation, and adults with disabilities. The Florida Community College System procedures and definitions exclude adults with disabilities from the definition of funded FTE. Of the 294,818 FTE enrollment reported in 2006 for the
academic year 2004-2005, only 594, or 0.20%, were counted as adults with disabilities, so the study findings had limited impact. Another example of discrepancies between reports centered on FTE enrollment (funded). As a system, for 2000-2001 data, the initial FTE enrollment (funded) was 193,477, as compared to the 2006 updated figure of 267,344, a difference of 73,866. Some reports updated the values, and other reports were snapshots as of that moment. For the analyses, when available, the updated information was used.
Table 4-1. Florida Community College Overall Funding Formula

        Direct Instructional Funding: instructional faculty funding and the instructional support funding
plus    Academic Support Funding: services to help support and supplement the instructional programs provided by the college, such as computer labs, academic administration, curriculum development, and support
plus    Library Funding: library materials, library technology, library staffing, and library operational expenses
plus    Student Services Funding: services to assist students in pursuit of their educational goals and objectives, including registration and record keeping, counseling and advising, administration of financial aid, assistance to the disabled, and placement services
plus    Special Projects Funding: historical appropriations for unique services provided at eight of the colleges
plus    Technology Funding: Internet bandwidth, computer labs, disk space for student coursework, classroom media technology, wireless networks, and administrative systems
plus    Institutional Support Funding: functions or services that support basic operations, such as human resources, accounting and finance, and purchasing
plus    Physical Plant Operations & Maintenance Funding: building and equipment maintenance, police and campus security services, grounds operations and maintenance, utilities, facilities planning, and custodial services
plus    District Cost Differential (DCD) Funding: a factor that seeks to equalize funding based on differing costs of living for employees
equals  Total Calculated Funding
minus   Standard Fee Revenues
minus   Projected Public Education Capital Outlay (PECO) Maintenance
equals  Funding Formula State Support

Note: Additional funding comes from performance-based incentives and performance-based program funding, defined as incentive measures (efficiency in terms of time to degree and college preparatory programs) and performance funds. When the program started in 1996-1997, it included the associate of arts (AA), associate of science (AS), vocational certificates (PSAV), college prep, and time to degree. Each fiscal year the programs and incentives included in the performance-based budget are reviewed and changes can be made.
Note: The explanation of the Community College Funding Model is from the Community College Funding Model prepared by the Community College Budget Office, Florida Department of Education.
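The formula in Table 4-1 is purely additive: the nine funding components sum to the Total Calculated Funding, from which standard fee revenues and projected PECO maintenance are subtracted to yield the state support figure. The arithmetic can be sketched as follows; the dollar amounts below are entirely hypothetical and serve only to illustrate the structure of the calculation.

```python
# Sketch of the Table 4-1 funding formula. Component names follow the
# table; every dollar figure here is illustrative, not actual data.
components = {
    "direct_instructional": 40_000_000,
    "academic_support": 6_000_000,
    "library": 2_000_000,
    "student_services": 5_000_000,
    "special_projects": 500_000,
    "technology": 3_000_000,
    "institutional_support": 7_000_000,
    "physical_plant_om": 6_500_000,
    "district_cost_differential": 1_000_000,
}

# The components sum to the Total Calculated Funding.
total_calculated_funding = sum(components.values())

standard_fee_revenues = 18_000_000       # hypothetical
projected_peco_maintenance = 1_500_000   # hypothetical

# State support is what remains after fees and PECO maintenance.
state_support = (total_calculated_funding
                 - standard_fee_revenues
                 - projected_peco_maintenance)

print(f"Total calculated funding: ${total_calculated_funding:,}")
print(f"Funding formula state support: ${state_support:,}")
```

Note that the performance-based incentive funds described in the table's first note sit outside this formula entirely, which is consistent with the finding that they are absorbed into Fund 1 rather than appearing as a separate line item.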
Table 4-2. Florida Community College Unduplicated Student Headcount per FTE Enrollment (Funded), 2005-2006

College                                     Campuses/   FTE enrollment      Annual unduplicated   Headcount/
                                            centers     (funded) 2005-2006  headcount 2005-2006   FTE
Brevard                                        6          10,036.2             25,713               2.56
Broward                                       11          22,219.5             52,684               2.37
Central Florida                                5           4,578.2             18,520               4.04
Chipola                                        1           1,661.6              5,209               3.13
Daytona Beach                                  6          11,794.5             27,911               2.37
Edison                                         4           7,090.7             17,111               2.41
Florida Community College at Jacksonville     12          19,618.5             64,493               3.29
Florida Keys                                   1             771.5              2,814               3.65
Gulf Coast                                     5           4,722.9             22,140               4.69
Hillsborough                                   4          16,395.0             43,915               2.68
Indian River                                  10          11,968.2             35,928               3.00
Lake City                                      1           2,381.2              7,198               3.02
Lake-Sumter                                    3           2,312.4              6,581               2.85
Manatee                                        3           6,629.1             20,036               3.02
Miami Dade                                     8          50,447.4            132,060               2.62
North Florida                                  1           1,009.6              3,175               3.14
Okaloosa-Walton                                6           4,738.1             12,841               2.71
Palm Beach                                     5          15,405.6             47,572               3.09
Pasco-Hernando                                 4           5,282.1             13,209               2.50
Pensacola                                      3           7,932.5             20,288               2.56
Polk                                           3           4,636.1             18,471               3.98
St. Johns River                                6           3,687.0              9,296               2.52
St. Petersburg                                12          15,304.2             47,694               3.12
Santa Fe                                       6          11,514.7             22,897               1.99
Seminole                                       5          10,646.1             30,374               2.85
South Florida                                  4           3,046.0              7,624               2.50
Tallahassee                                    6          11,012.6             27,381               2.49
Valencia                                       7          20,872.4             50,382               2.41
TOTAL                                                    287,713.9            793,517               2.76

Note: The data in columns 3 and 4 are from the Florida Department of Education Fact Book (2007).
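The ratio in the last column of Table 4-2 is simply the annual unduplicated student headcount divided by the funded FTE, rounded to two decimal places. A short sketch reproduces a few of the table's values, including the system-wide figure computed from the totals row:

```python
# Reproduce the headcount-per-FTE ratio from Table 4-2 for selected
# colleges; FTE and headcount values are copied from the table.
colleges = {
    "Santa Fe":   (11_514.7, 22_897),
    "Gulf Coast": (4_722.9, 22_140),
    "Miami Dade": (50_447.4, 132_060),
}

def headcount_per_fte(fte: float, headcount: int) -> float:
    """Unduplicated student headcount per funded FTE, as in Table 4-2."""
    return round(headcount / fte, 2)

for name, (fte, headcount) in colleges.items():
    print(f"{name}: {headcount_per_fte(fte, headcount)}")

# System-wide ratio from the totals row of the table:
system_ratio = headcount_per_fte(287_713.9, 793_517)
print(f"System: {system_ratio}")
```

Because a part-time student contributes a full unit of headcount but only a fraction of an FTE, a higher ratio signals a more part-time student body, which is why the Santa Fe (1.99) and Gulf Coast (4.69) extremes anchor the comparison in Table 4-3.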
Table 4-3. Comparison of Gulf Coast Community College and Santa Fe Community College

                                                      Gulf Coast      Santa Fe
Ratio of unduplicated student headcount
  per FTE enrollment (funded)                               4.69          1.99
Cumulative performance-based funding, 1996-2005       $1,619,739    $4,663,741
AA measure points
  2000-2001                                                1,653         4,312
  2001-2002                                                1,562         4,493
  2002-2003                                                1,550         4,822
  2003-2004                                                1,531         4,822
  2004-2005                                                1,645         5,054
  Total                                                    7,940        23,502

Note: The data in this table were generated from the Florida Community College System Fact Book (Florida Department of Education, 2006) and the performance-based data provided by the Florida Community College System (Community College Office of Budget and Financial Services, 2006).
Figure 4-1. Total operating budget versus performance-based budget, 1996 through 2005
Figure 4-2. Percentage of performance-based budget of the total operating budget, 1996 through 2005
Figure 4-3. Components of the operating budget, 1997 through 2005
Figure 4-4. Comparison of the types of fees per full-time equivalent, 1997 through 2005
Figure 4-5. Total operating budget versus FTE enrollment (funded), 1996 through 2005
Figure 4-6. Performance-based incentive funding versus student FTE enrollment (funded), 1996 through 2005
Figure 4-7. Full-time equivalent enrollment (funded) versus unduplicated headcount enrollment, 1996 through 2005
Figure 4-8. Total operating budget versus full-time equivalent enrollment (funded) versus unduplicated student headcount, 1996 through 2004
Figure 4-9. Cumulative performance-based budget funds per college, 1996 through 2005
Figure 4-10. Cumulative performance-based budget funds per college, ranked by FTE enrollment (funded) group, 1996 through 2005
Figure 4-11. Percentage of performance-based incentive funds by college, for FTE Group 1, 2000 through 2005
Figure 4-12. Percentage of performance-based incentive funds by college, for FTE Group 2, 2000 through 2005
Figure 4-13. Percentage of performance-based incentive funds by college, for FTE Group 3, 2000 through 2005
Figure 4-14. Percentage of performance-based incentive funds by college, for FTE Group 4, 2000 through 2005
Figure 4-15. Percentage of performance-based incentive funds by college, for FTE Group 5, 2000 through 2005
Figure 4-16. Performance-based incentive funds per FTE enrollment (funded) versus the FTE enrollment (funded), 1996 through 2005
Figure 4-17. Performance-based incentive funding per FTE enrollment, for FTE Group 1, 2000 through 2005
Figure 4-18. Performance-based incentive funding per FTE enrollment, for FTE Group 2, 2000 through 2005
Figure 4-19. Performance-based incentive funding per FTE enrollment, for FTE Group 3, 2000 through 2005
Figure 4-20. Performance-based incentive funding per FTE enrollment, for FTE Group 4, 2000 through 2005
Figure 4-21. Performance-based incentive funding per FTE enrollment, for FTE Group 5, 2000 through 2005
Figure 4-22. Total AA measure points I, II, and III, 1996 through 2005
Figure 4-23. AA measure completion points versus FTE enrollment (funded), 1996 through 2005
Figure 4-24. AA measure completion points, by Measure I, Measure II, and Measure III, 1996 through 2005
Figure 4-25. Total AA measure points I, II, and III by community college Group 1, 2000 through 2004
Figure 4-26. Total AA measure points I, II, and III by community college Group 2, 2000 through 2004
Figure 4-27. Total AA measure points I, II, and III by community college Group 3, 2000 through 2004
Figure 4-28. Total AA measure points I, II, and III by community college Group 4, 2000 through 2004
Figure 4-29. Total AA measure points I, II, and III by community college Group 5, 2000 through 2004
Figure 4-30. Total cumulative AA measure funding 1996-2005 versus all performance-based funding index 2005-2006
CHAPTER 5
DISCUSSION

The Florida Community College System is a major component of Florida's economy, and it is a major player in the preparation of the workforce for the future. The purpose of this study was to conduct a descriptive analysis of trends in Florida community colleges relative to performance-based budgeting funding, focusing on the associate of arts measures. Other research studies have targeted faculty awareness of performance-based funding, equity among colleges, or inferential analysis for future outcomes; this study employed a descriptive statistical analysis using a secondary data set from the Florida Department of Education.

Florida Performance-based Budgeting Overview

The concept behind performance-based budgeting and funding is rewarding those who achieve the desired outcomes. From a community college perspective, two questions remain: (a) how should the indicators that reflect the outcomes be determined, and (b) how can incentives be provided to change the paradigm college-wide to focus on the outcomes and not the inputs or processes?

The Florida Legislature, during its 1996-1997 session, instituted an appropriation for the creation of the performance funding program for community colleges. The program began with workforce education, consisting of associate of science, post-secondary vocational, and post-secondary adult vocational programs. Specific workforce funding ended in 2004-2005, when workforce education was melded into the community college program fund. Wright, Dallet, and Copa (2002) described the state's performance-based program for community colleges as "Ready, Fire, Aim":

Performance funding in Florida is a story of scattered shots of legislative policies that tried to tie state funding to campus performance. The state legislature fired several shots in the direction of performance funding, but its aim often appeared random and erratic (p. 138)
The Performance Funding Report for 2006-2007 stated that "the emphasis on performance funding has had measurable impact in terms of increasing the graduates in the Florida Community College System" (p. 3). Wright, Dallet, and Copa showed that, with the indicators in place since the beginning of performance-based budgeting, community colleges have maintained or improved performance. But it is not clear whether the increases can be traced to performance funding, better data, institutional policy modifications, articulation agreements and time-to-degree legislation, or all of these factors.

According to a Florida Department of Education Office of Program Policy and Analysis report titled Performance-based Program Budgeting (no date), during the early years of the performance-based program, the Florida Legislature was greatly involved in the selection and monitoring of the performance measures and results. In 2006, the Florida Legislature passed Chapter 2006-122, Laws of Florida, which created Section 216.1827, Florida Statutes, to disjoin the approval of performance measures and standards from the legislative appropriation process. Agencies now provide information on their legislatively approved performance measures and standards in their long-range program plans.

Introspective Review

While the current funding model may not be ideal, the process has begun to link funding with performance. Under the tutelage of the Florida Community College System staff, the Florida Association of Community Colleges Funding Formula Committee created a subcommittee to review the performance-based budgeting measures. At its September 2006 meeting, the subcommittee discussed the following questions:

Are the current performance-based budgeting measures appropriate?
Are the values (points) for each measure valid?
What measures and values should be established for Educator Preparation Institutes (EPIs)?
Is the allocation of funds appropriate for each category?
What measures should be added?
What measures should be deleted?
What other considerations/issues need to be addressed?

The self-review of the current model of measures and points is a practice repeated elsewhere in the nation. Alfred, Shults, and Seybert (2007) recently released the third edition of Core Indicators of Effectiveness for Community Colleges:

The rapid evolution of e-business, the emergence of new forms of competition, the movement towards a global economy, the tragic events of September 11, and the war on terrorism have occurred since the second edition of Core Indicators of Effectiveness for Community Colleges was published. These events have forced most organizations to reexamine their priorities and to place more emphasis on providing value to stakeholders in an environment in which change is the only constant (p. v)

An Assessment Framework for the Community College: Measuring Student Learning and Achievement as a Means of Demonstrating Institutional Effectiveness (2004) identified community college stakeholders as students, administrators, trustees, faculty, staff, parents, colleges, accreditation boards, businesses, and the community. How do community colleges quantify their value to this wide array of stakeholders? A review of the literature reflects hundreds of performance indicators in use in higher education and the mass media in the United States, Canada, and Western Europe. The third edition of Core Indicators of Effectiveness for Community Colleges recommends the following core indicators:

Mission: Student Progress
Core Indicator 1: Student goal attainment
Core Indicator 2: Persistence
Core Indicator 3: Graduation rates
Core Indicator 4: Student satisfaction
Mission: General Education
Core Indicator 5: Success in subsequent and related coursework
Core Indicator 6: Program learning outcomes and mastery of discipline
Core Indicator 7: Demonstration of general education competencies

Mission: Outreach
Core Indicator 8: Regional market penetration rates
Core Indicator 9: Responsiveness to community needs

Mission: Workforce Development
Core Indicator 10: Placement rates
Core Indicator 11: Licensure and certification pass rates
Core Indicator 12: Employer satisfaction with graduates
Core Indicator 13: Client satisfaction with programs and services

Mission: Contribution to the Public Good
Core Indicator 14: Value added to the community

Mission: Transfer Preparation
Core Indicator 15: Transfer rates
Core Indicator 16: Performance after transfer

The Florida Community College System

The Florida Community College System comprises 28 community colleges with a student profile of 37% full-time and 63% part-time; the average age of students is 25 years, and 70% of Florida students in public higher education attend a community college, as reported in the community colleges' Facts at a Glance (2006). The Florida Community College System is heralded nationally and internationally for many reasons:

An integrated and cohesive community college system
Robust data systems in which procedures are accessible and updated, terms are defined in relation to specific reports, and the data collected are outcomes
Articulation agreements between high schools and community colleges and between community colleges and state universities
A statewide course numbering system
Under Title XVI, Chapter 240, Section 240.324, Florida Statutes, the State Board of Community Colleges was directed to develop and implement a plan to improve and evaluate the instructional and administrative efficiency and effectiveness of the system (Florida Department of Education, 2004, p. 1). This dissertation focused on the AA measures component of the performance-based funding outcomes in Florida community colleges.

Findings

Research question 1: What has been the overall change to the percentage of funding by category as a result of the implementation of Florida community college performance-based budgeting?

The total operating budget consists of general revenue, Florida Lottery funds, and student fees. The performance-based incentive funding is a way to add funds to the system, so the performance-based incentive funds are placed in the general revenue Fund 1 category and not as a separate line item. Analyzing the total operating budget and the performance-based budget by year from 1996 to 2005, no identifiable relationship was discerned between the two budget figures. Using the 1996 total operating budget of $911,970,588, an inflation calculator was used to determine whether the community college system has maintained its purchasing power. In 1996, the funding per FTE was $3,921; in 2005, the funding per FTE was $4,923. The 1996 figure converted to 2005 dollars would be $4,707 using the U.S. wage inflation index and $4,730 using the U.S. retail inflation index; by either conversion, the purchasing power was maintained.

Analysis of the components of the total operating budget revealed that student fees occupy an increasing percentage of the total operating budget. In the academic year 1999
2000, student fees represented 22% of the budget; in 2005-2006, student fees represented 33% of the budget. The Florida Lottery funds component and the general revenue component experienced a slight decrease in percentage of the total operating budget from 1997 through 2001, but they remained rather constant as a percentage from 2002 through 2005.

Research question 2: Utilizing full-time equivalent (FTE) as a leveling approach, what are the differences in performance-based budgeting trends by individual Florida community colleges?

The study focused on the AA measure component of the performance-based budgeting allocation from the perspective of FTE. (The AA measure was selected because it traditionally has been the largest portion of the performance-based budget allocation and the most consistent over time.) Using FTE enrollment (funded) to level the field, each of the 28 community colleges was analyzed to find an association between community college size, as defined by FTE enrollment (funded), and percentage of the fund allocation or measure points. Community college enrollment is affected by the economic factors of its service area: in general, during times of prosperity enrollment decreases, and when the region's economy weakens, enrollment increases.

When FTE enrollment (funded) was used as a way of ranking the community colleges, some natural breaks occurred that grouped the colleges. Group 1 included those with an FTE enrollment (funded) of less than 4,500; Group 2, between 4,501 and 10,000; Group 3, between 10,001 and 15,000; Group 4, between 15,001 and 20,000; and Group 5, greater than 20,000.
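The grouping just described can be sketched as a simple binning function. The break points are taken from the text; group assignments in the study would of course use each year's funded FTE, so the example values below (drawn from Table 4-2's 2005-2006 figures) are illustrative only.

```python
# Assign a college to one of the five FTE enrollment (funded) groups
# using the natural breaks described in the text.
def fte_group(fte: float) -> int:
    """Return the group number (1-5) for a funded-FTE value."""
    if fte <= 4_500:
        return 1        # Group 1: less than 4,500
    elif fte <= 10_000:
        return 2        # Group 2: 4,501 - 10,000
    elif fte <= 15_000:
        return 3        # Group 3: 10,001 - 15,000
    elif fte <= 20_000:
        return 4        # Group 4: 15,001 - 20,000
    return 5            # Group 5: greater than 20,000

# Examples using 2005-2006 funded FTE from Table 4-2:
print(fte_group(1_661.6))    # Chipola    -> 1
print(fte_group(11_514.7))   # Santa Fe   -> 3
print(fte_group(50_447.4))   # Miami Dade -> 5
```

Binning on natural breaks, rather than fixed-width intervals, keeps colleges of comparable size together, which is what allows the within-group comparisons of completion points reported in Chapter 4.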
In general, the larger the FTE enrollment (funded), the greater the cumulative performance-based budgeting funds from 1996 to 2005, but several colleges did not fit that pattern. The cumulative performance-based budget funds from 1996 to 2005 for seven community colleges (Florida Community College at Jacksonville, South Florida, Seminole,
Indian River, Daytona Beach, Palm Beach, and Hillsborough) were below the levels of their counterparts in the FTE enrollment (funded) groups. Valencia Community College was above the level of cumulative funds within its group.

Analyzing the percentage of performance-based budgeting funds allocated by FTE enrollment group from 2000 to 2005, differences occurred in the percentages among the groups. Group 1, which accounts for 10 of the community colleges, collectively received 10.4% to 11.5% of the allocation from 2000 to 2005. Group 2, which accounts for nine of the community colleges, collectively received between 28.4% and 29.6% of the allocation. Group 3, which accounts for five community colleges, collectively accounted for 22.5% to 23.6% of the allocation. Group 4, which accounts for three colleges, collectively received 21.8% to 25.1% of the allocation. Group 5, which consists only of Miami Dade, received 12.8% to 14.8% of the allocation. Miami Dade College's percentage was greater than the collective percentage of the 10 community colleges in Group 1.

The other component is the relationship between the unduplicated student headcount and FTE enrollment (funded). The individual student does not necessarily enroll as a full-time student or with the intent of completing a degree or certificate. The lower the ratio of unduplicated student headcount per FTE enrollment (funded), the more full-time students. As a community college system, that ratio has consistently decreased, from 3.296 in 1996-1997 to 2.714 in 2004-2005. However, looking at the data by individual community college, wide variations occur in the ratio; the range in 2005-2006 ran from 1.99 for Santa Fe Community College to 4.69 for Gulf Coast Community College. This ratio is a reflection of how the community college is serving its current service area and meeting its mission.
Students returning to college may seek a degree or certificate, but some are just taking courses to further their skills for the job market.
A single set of measures for all community colleges is where the one-size-fits-all concept breaks down: it does not work well for all colleges being true to their mission of serving the needs of their communities. The current model favors the larger FTE enrollment (funded) community colleges, which receive a larger share of the allocation. The range of the unduplicated student headcount per FTE enrollment (funded) ratio runs from 1.99 for Santa Fe Community College to 4.69 for Gulf Coast. The lower the ratio, the more full-time students versus part-time students. Table 4-3 compares the two extremes of the ratio and shows that in cumulative performance-based funding from 1996 to 2005, Santa Fe Community College received $4,663,741 while Gulf Coast received $1,619,739; in AA measure points from 2000-2001 to 2004-2005, Santa Fe accumulated 23,502 and Gulf Coast accumulated 7,940, a difference of 15,562. So the more full-time students, the more completers for the system, but that was not a consistent finding for all community colleges. The current measures do not take into consideration the students who attend community colleges without the intent of completing a program.

Research question 3: Is there a measurable improvement in the Florida community colleges resulting from performance-based budgeting for selected associate of arts measures?

The study reviewed the history of AA measure points for each of the community colleges from 1996 to 2005. From 1996 to 1997, a 30% increase occurred in completion points, but from 1997 to 2001 the increase was 1%. Completion points increased by 11% from 2002 to 2003, 7% from 2003 to 2004, and 11% from 2004 to 2005. From 2002 to 2005, the completion points increased from 90,286 to 102,793. As FTE enrollment increased, no corresponding increase occurred in completion points. This showed a lag time between enrollment and completion, but from 1999 to 2003, the degree of increase for FTE enrollment was not the same as for AA measure completion points.
In order to level the playing field between colleges, to see if size was a factor in the generation of completion points, the community colleges were divided into five groups based upon natural breaks in the number of FTE enrollment (funded). In general, as the FTE enrollment increased, the total AA measures increased. For 57% of the time, the greater the FTE enrollment (funded), the greater the number of completion points. The remaining 43% (Lake-Sumter, Lake City, South Florida, Gulf Coast, Pasco-Hernando, Seminole, Indian River, Daytona Beach, Hillsborough, Florida Community College at Jacksonville, and Broward) did not have a number of completion points similar to the colleges closest to them in FTE enrollment (funded). Three community colleges (North Florida, St. Johns River, and South Florida) reported a decrease in completion points from 2004 to 2005. Miami Dade's points are 68% greater than the total AA measure points for all of Group 1, encompassing 10 colleges. No study to date has been able to attribute the increase in measure points solely to the implementation of performance-based budgeting.

The current performance-based budgeting measures are the same for all community colleges, and the analysis so far has revealed that community colleges are not all the same. Table 4-2 reveals that the unduplicated student headcount per FTE enrollment (funded) ratio ranges from 1.99 for Santa Fe Community College to 4.69 for Gulf Coast. The lower the ratio, the more full-time students versus part-time students. Table 4-3 compares the two extremes of the ratio and reveals that for AA measure points from 2000-2001 to 2004-2005, Santa Fe Community College accumulated 23,502 and Gulf Coast accumulated 7,940, a difference of 15,562.

Research question 4: How do Florida community colleges use performance-based budgeting funds?

The last part of the study explored how the community colleges used the funds from the performance-based budget allocation.
In order to encourage candor in their responses, the
investigator agreed that no comments would be attributed to any individual or community college. Most responders indicated that since the funds are included in the general revenue, it is difficult to track their use. In preparing the community college budget, the funds often are used for paying other components of Fund 1 accounts. Fund 1 accounts, also known as current unrestricted funds or operating budget, are used to accomplish the primary and supporting objectives of the college. Some of the uses include direct instruction, academic support, student services, institutional support, and physical plant operation, covering salary and benefits, travel, materials and supplies, and furniture and equipment. One college responder indicated that his college incorporated the performance measures into its strategic plan. Another responder shared that when his college tried to discuss the measures with the faculty, the faculty, through misunderstanding, grew fearful. A third responder stated that while the college's share of the performance-based budget was not great, the college used the funds as seed money to establish new programs to meet the needs of its service area. From the sampling of responders, no groundswell of activity occurred trying to affect those components of the college that would generate more points and thus reap the corresponding monetary allocation. Many responders voiced similar comments: since the performance-based budget fund was such a low percentage of the total, it often was overlooked. While it is true that the percentages by college were low, Table 5-1 reflects how many millions are set aside for performance-based budgeting.

Conclusions

Performance-based budgeting in Florida has been in existence for over 10 years for the community college system. It provides a method for the distribution of additional funds each year, without modifying the basic funding methodology for the system.
It also is a way to meet the strategic imperative of aligning financial resources with performance. The measures developed to improve performance have been tweaked many times, but the AA measures have been the most consistent. Points are awarded for the number of AA degrees, the number of dual enrollment credit hours, a pro rata share of the number of AA graduates who are part of subpopulations, and the number of completers or partial completers who were placed in jobs or transferred to the State University System.

The relationship between the enrollment and mission of the community colleges and performance-based budgeting is loose and flexible. When FTE enrollment or completion points increased, the performance-based budget amount was not affected; other criteria determined the budget value. The performance-based budget amount was not sensitive to FTE enrollment (funded) or to the unduplicated student headcount. Because the system shares the funds, if the budget remained the same while more points were generated as a system, the value per point would decrease. As one college responder observed, it is disheartening to increase your measure points and receive less money per point.

Another component is the relationship between the unduplicated student headcount and FTE enrollment (funded). The lower the ratio of unduplicated student headcount per FTE enrollment (funded), the more full-time students, and the current model favors the community colleges with the larger FTE enrollment (funded). The more full-time students, the more completers for the system, although that was not a consistent finding for all community colleges. The most common pattern was that the greater the FTE enrollment (funded), the greater the share of the performance-based funding for the AA measures.

The current performance measures are not sensitive to whether a student attends part time or full time. An individual student does not necessarily enroll as a full-time student or with the intent of completing a degree or certificate. In addition, completion rates are based upon a set period and do not take into consideration students whose completion time exceeds that period. For example, students who complete a degree or certificate in 1996-1997 are tracked from 1992-1993 through 1996-1997. Some students returning to college seek a degree or certificate, but others are simply taking courses to further their skills in the job market. This is where the one-size-fits-all concept of a single set of measures for all community colleges breaks down. The data show wide variations in enrollment and in the percentage of full-time students at Florida community colleges, consistent with each college's mission of meeting the needs of its community.

From 1996 to 1997, the first year of performance-based budgeting, completion points increased 30%, the highest increase in the 10-year period. From 1997 to 2001, the overall increase was 1%; the increase from 2001 to 2002 was 3%, from 2002 to 2003 was 11%, from 2003 to 2004 was 7%, and from 2004 to 2005 was 11%. What cannot be ascertained from the performance-based data is whether the increase in measure points is due to increased enrollment, an increase in full-time students, changes to the criteria for the AA measures, increased accuracy of the data submissions, or a measurable impact of performance-based budgeting on the number of graduates in Florida community colleges. For example, measure points increased even while performance-based budgeting funds remained constant from 2001-2002 to 2004-2005. Given the differences among Florida community colleges, the growth of the system's measure points may be driven more by the actions of a subset of the system than by the performance of the system as a whole. For example, the measure points of a single college exceeded the combined points for all 10 community colleges in Group 1.
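The zero-sum mechanics described above, a flat appropriation divided by a growing number of system-wide points, can be sketched as follows. The appropriation is the flat 2001-02 through 2004-05 amount reported in Table 5-1; the point totals are hypothetical, chosen only to illustrate an 11% gain.

```python
# Sketch of the zero-sum mechanics of a fixed performance pool: when the
# appropriation is flat and system-wide points grow, each point is worth less.
# The pool is the flat 2001-02 to 2004-05 amount from Table 5-1; the point
# totals below are illustrative, not actual system data.

def value_per_point(appropriation: float, total_points: float) -> float:
    """Dollars distributed per measure point under a shared fixed pool."""
    return appropriation / total_points

pool = 7_674_371.0  # flat appropriation, 2001-02 through 2004-05 (Table 5-1)

before = value_per_point(pool, 100_000)  # hypothetical system point total
after = value_per_point(pool, 111_000)   # same pool after an 11% gain in points

print(round(before, 2))  # 76.74 dollars per point
print(round(after, 2))   # 69.14 dollars per point
```

A college could thus raise its own point count and still receive fewer dollars per point, which is the frustration responders voiced.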
The strategic plan includes aligning financial resources with performance. However, the current model does not require that funds received from performance-based budgeting revert to the programs that generated the measure points, nor that the funds be used to generate more points in general. Performance-based budgeting funds are received in the unrestricted Fund 1 account, and no mechanism is in place to track how they are used. Each college can use its discretion as to how the funding is spent. Responders indicated that the performance-based budget funds are used to fund salaries, pay for utilities, or simply cover the cost of continuing to operate. Because the performance-based budget is such a low percentage of the total budget, it often is overlooked. The smaller colleges appeared more sensitive to the potential influx of funds than the larger colleges. However, if the percentage of performance-based budgeting increased and placed more of the total operating budget in jeopardy, colleges might lose funds based on factors over which they have little control.

Community colleges are designed to be short-term education suppliers, yet the examination of the data has a long lag time. By the time the data are gathered, submitted, and analyzed and policy changes are implemented, new groups of students are attending the community colleges, and the changes may be addressing problems from 5 years earlier. Higher education has a long history of demands for significant change to improve quality, efficiency, and accountability, but despite the reports, studies, and recommendations, progress is not sufficient if the United States is to retain its international leadership role in innovation and higher education (Diamond, 2006).
The systems are in place at the state level to generate useful information, but the value of that information depends on its accuracy and on whether it is used to focus more on outcomes than on inputs. The crux of the issue is how to motivate every community college to improve its performance on the measures without compromising its mission of serving the needs of the community. Additionally, what will induce the individual community colleges to examine their data longitudinally and act upon the analysis?

Implications of the Findings

The Florida Department of Education database of information about higher education is heralded by many for the depth of information collected. However, the data are only as good as the accuracy of the submissions and the extent to which the collected information is utilized.

Consistency

In the research for this study, several different sources were used to define the terms used in the source reports. Similar terms with differing definitions made the analysis at times difficult to follow and often left the investigator unsure whether the same or similar information was being compared. The Community College & Technical Center MIS (CCTCMIS) assumed the combined functions of two previous entities: the Bureau of Research and Information Systems (BRIS) for community colleges and the Workforce Education Information Systems (WEIS) for the technical centers. These data are then used to produce a variety of reports, including the yearly Fact Book and the Performance Funding Report. Because reports are generated by different divisions within the Department of Education, the definitions sometimes differed while remaining accurate for their specific use. For example, the Florida Department of Education Fact Book (2006) included adults with disabilities in FTE enrollment (funded), while the Florida Community College System procedures and definitions exclude adults with disabilities from funded FTE. For 2004-2005, the 294,818 FTE enrollment stated in one report included 594 (0.20%) students with disabilities, but another report excluded those students.

Data for this study were prepared from the information submitted by the community colleges. Looking at the data longitudinally, the values changed as other reports were generated. For example, the definition of how to compute FTE enrollment (funded) is well defined, and it would be assumed that the data submitted would be accurate. Yet for the system, the initial 2000-2001 FTE enrollment (funded) was 193,477, compared to the 2006 updated figure of 267,344, a difference of 73,866. Some reports updated the values, while other reports were snapshots as of a particular moment.

Use of the Data

One theme was identified: responders were aware of the performance-based budgeting funding model and the Performance Funding Report generated by the Community College Office of Budget and Financial Services. However, in most cases the responder could not articulate how the college used the data, and little time was spent examining the details of the report. Several college senior management leaders stated that they received the report but did not use it and did not have the space to store it. The Division of Community Colleges strives to impress upon the colleges the importance of verifying the accuracy of the data. At one particular community college, a faculty member identified an error in the Performance Funding Report. Going through the chain of command at the community college, it took almost 18 months before the identified error was acknowledged and corrected.
Selection of Measures

Each community has different needs. One set of measures does not fit all needs because of the differences among community colleges relative to location, population, community economic factors, and competition. As identified in this study, the ratio of unduplicated student headcount per FTE enrollment (funded) ranges from 1.99 to 4.69. The difference in the proportion of part-time versus full-time students has implications for how courses are scheduled and delivered. Many students return to college to improve their skills, not necessarily to complete a degree or certificate (Bailey et al., 2006). The community college adds value to the community, but the college may not reap any rewards for meeting this community need. Measures should be in place to address the larger colleges as well as the smaller colleges and the college subpopulations. There are significant differences in upbringing, family and job responsibilities, and academic outlook, especially when contrasting students under 21 with those considered nontraditional students (Adelman, 2005). Students attend more than one institution and, with online courses, may simultaneously attend more than one institution. The true outcome for a student may not be attained until several years beyond the current tracking system. National longitudinal studies reflect that students change institutions not only within a state but also across state lines, yet current tracking mechanisms do not capture this information (Callan, Doyle & Finney, 2000). This gap in the data skews our understanding of the current state of education.

The conceptual framework for the study was the proposition that performance-based budgeting will lead to increased performance. Resources (inputs) are related to activities (structure) and results (outcomes) (Goldstein, 2005). One of the criticisms of performance-based budgeting is that performance measures at the state level are not easily linked with the individual college mission. This leads to the question: is the mission set by the state, or is the mission set by the community college?

Policy Recommendations Specific to the Florida Community College System

Review of core measures: Alfred, Shults, and Seybert (2007) succinctly pointed out that organizations need to place more emphasis on providing value to stakeholders. This value may not be the same throughout the Florida Community College System, and the current measures of the performance-based incentive funding program may not be the best reflection of the value added to the community and the advancement of education in the state of Florida. This investigator recommends that the current measures be reviewed, perhaps using the list of core measures outlined in the third edition of the Core Indicators of Effectiveness for Community Colleges as a starting point. Conceivably, the Chancellor of the Community College division could appoint a task force to analyze the data in the Performance Funding Report not just from a system perspective but also at the community college level. The task force could explore the use of the Tennessee model (Burke, 2005b), in which a college selects an indicator that simultaneously meets state and institutional objectives. The fungibility of the new system would simultaneously meet the objectives of the Florida Community College strategic plan and be congruent with the individual community college mission of serving the needs of the community.

Establish a link between measures and funding: Establish a reporting mechanism whereby the community college reports how the performance-based funds were utilized to enhance its performance as defined by the measures.
Establishment of performance-based budgeting based upon specific performance expectations: The Oregon Business Council developed a sample unified performance-based budget for preschool through grade 20 (as cited in Callan, Finney, Kirst, Usdan, & Venezia, 2006) that allocated funds relative to specific performance expectations, for example, XX% of entering AA students complete their degree.

Updating of databases: This investigator recommends that a schedule be developed whereby the Division of Community Colleges units that receive data from the Community College & Technical Center MIS (CCTCMIS) update their databases and reports. In addition, one glossary of terms and definitions should be developed and utilized throughout the Division of Florida Community Colleges.

Availability of the Performance Funding Report: This investigator recommends that the Community College Office of Budget and Financial Services' Performance Funding Report be posted on the Florida Community College website along with the annual Fact Book. With broader distribution, the data would be reviewed and utilized by deans, chairs, and program coordinators.
National Policy Recommendations to Community College Systems

This was a population-based study focusing on the Florida Community College System. This investigator poses the question: although it was not the initial intent of the study, could the findings and recommendations be generalized to other state systems? The specific findings probably cannot be generalized to other community college systems for two reasons: (a) the core measures and assignment of point values are unique to the state of Florida, and (b) the student demographic data are not the same. For example, Table 5-2 compares specific data elements for Florida and Illinois. Illinois was selected for the comparison because the 2000 population census ranked Florida fourth and Illinois fifth. Table 5-2 shows the differences in student headcount, FTE enrollment, the ratio of student headcount per FTE enrollment, and especially the percentages of full-time and part-time students.

Another example of the differences is in the measures adopted by each state or commonwealth. For example, the Virginia Community College System measures fall into three broad categories: student experience; system-wide measures, facilities, and operations; and faculty productivity. The student experience measures include the student achievement rate, which looks longitudinally at how students enroll, transfer, complete, and graduate from a program; the number of transfer students to a 4-year institution in Virginia; the percentage of undergraduate courses with fewer than 20 students; the percentage of undergraduate courses with 50 or more students; the percentage of lower division courses taught by full-time faculty; and the first-time, full-time graduation rate after 3 years.
The system-wide measures, facilities, and operations category includes classroom and laboratory space utilization (how many hours per week an institution offers courses in its classrooms and laboratories, and the occupancy rate); the funds spent on instruction and academic support, reflecting the focus an institution places on instruction versus other activities such as administration, departmental research, and public service; the percentage of management standards met; and the debt service to expenditure ratio. Faculty productivity focuses on the credit hours taught per FTE faculty (McHewitt, 2004).

States offer a range of efforts toward improved accountability, critical achievement measures, and evidence-based progress toward reaching their objectives (Broom, 1995). It is also recognized that while there is considerable momentum for K-12 accountability, there are differences between K-12 and higher education. K-12 accountability systems are based on a common set of learning standards and statewide assessments, while in higher education there is a diversity of learning goals, institutional missions, and no single curriculum (Wellman, 2003). The uniqueness of each state's performance-based budgeting is such that these findings cannot be generalized; however, there are some advisory comments that other systems may consider:

Establishment of core measures: Community colleges serve the needs of their stakeholders, and one set of measures may not reflect the total value added by the community college. This investigator recommends that the current measures be reviewed, perhaps using the list of core measures outlined in the third edition of the Core Indicators of Effectiveness for Community Colleges as a starting point, or perhaps using Tennessee as a model (Burke, 2005b), in which a college selects an indicator or indicators that simultaneously meet state and institutional objectives.

Establishment of performance-based budgeting based upon specific performance expectations: This would set thresholds that the measures must reach in order for the funds to be obtained, similar to the performance-based budget developed by the Oregon Business Council (as cited in Callan et al., 2006), for example, XX% second-year retention of incoming freshmen.

Establishment of a link between measures and funding: The investigator recommends evidence-based documentation that demonstrates how a community college is using the funds to improve the measure outcomes.
Best practices could be identified and utilized by other community colleges. Mercer (2003) recommends a cascade performance budgeting framework that integrates budget and performance information and links long-term goals to day-to-day activities. Lingenfelter (2003) believes performance funding should connect specific monetary incentives with performance.

Expanded access to college-specific outcomes data: The value of data is defined by accuracy, construct validity, and eventual use. The greater the number of individuals who have access to the information, the greater the probability of the data being viewed and used. The more exposure the data receive, the more they are used, and ultimately the more focused the effort to improve the outcomes.

Recommendations for Further Research

Part of the Performance Funding Report generated by the Community College Office of Budget and Financial Services consists of a series of tables that first summarize the number of degrees by program title and then break down the information by community college. For example, for associate of science degrees, a table includes the following information for the system: the program title; the system standard degree length in hours; the number of degrees awarded for the past 2 years; the annual change in degrees awarded; and the annual percent change in degrees awarded. For each community college, the information is then presented as follows: the program number; the program title; the number enrolled; the number of degrees awarded for the past year; the system standard degree length in hours; the college associate of science degree cost; the program cost factor relative to the AA degree; and the weighted completers relative to the AA degrees.

A study could examine the proliferation of programs throughout the system whose enrollment and graduation trends may not reflect efficient use of funds. A follow-up study could explore the variances in associate of science degree cost per program across the system, with Table 5-3 showing a range of associate of science degree costs. Stakeholders have questioned the value of higher education and have challenged how colleges are managed. Alternatively, the study could review the efficiency of maintaining programs where enrollment is less than 10 per year or the number of completers is less than 5 per year. While the blueprint for community colleges in Florida built an easily accessible network for post-secondary education, with the infusion of online courses and a more mobile population, perhaps some adjustments need to be made for the 21st century.

This study summarized the number of campuses and centers per community college. A follow-up study could investigate the start-up and recurring costs of maintaining these additional sites. In particular, investigators need to look at class size and the utilization rate of the facilities, and perhaps conduct a longitudinal study of the students who start their education at a non-primary campus or center. As noted in Frequently Asked Questions About College Costs, such costs represent a significant proportion of the overall budget pie. The culture of community colleges is such that while new programs are reviewed and approved, little effort goes into assessing whether existing programs should continue.

Investigators could conduct a study of the accuracy of the data provided to the Florida Department of Education Community College Division. Some responders commented on the accuracy of the information provided for the generation of the reports. The Office of Accountability, Research, and Measurement of the Florida Department of Education provides a consistent set of processes, but it is up to the community college to follow the procedures and verify the college submission. In the process of this study, a program code at a particular college showed no graduates for a 2-year period, even though that college had graduates for that time period.
Table 5-1. Performance-based budgeting appropriations, 1996-1997 through 2006-2007

Fiscal year    Amount
1996-97        $12,000,000
1997-98        $12,000,000
1998-99        $4,800,000
1999-00        $8,074,032
2000-01        $8,318,834
2001-02        $7,674,371
2002-03        $7,674,371
2003-04        $7,674,371
2004-05        $7,674,371
2005-06        $18,075,996
2006-07        $18,075,996

Source: Community College Office of Budget and Financial Services (2007).

Table 5-2. Comparison of key variables between community college systems (Illinois and Florida)

                                   Florida       Illinois
Rank, population (2000 census)     4             5
Population (2000 census)           15,982,378    12,419,293
Year of data                       2005          2004
Student headcount                  793,517       363,204
FTE enrollment                     394,818       218,891
Headcount/FTE enrollment ratio     2.01          1.66
Average age                        25            28.81
Full-time                          37%           36%
Part-time                          63%           64%

Sources: Illinois Community College Board (July 2005); Florida Community College Facts at a Glance (2006-2007); the Florida Fact Book (2006); Demographia (n.d.).

Table 5-3. Variances among college associate of science degree costs for a selected program, 2004-2005

           # Enrolled   # Degrees awarded   College AS degree cost
College 1  55           10                  $148,931
College 2  54            3                  $ 44,679
College 3  41           15                  $223,396
College 4  24           13                  $193,610
College 5  55           12                  $178,717

Source: Community College Office of Budget and Financial Services (2006).
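The ratio column of Table 5-2 follows directly from dividing the reported headcount by the reported FTE enrollment. The quick check below uses only the figures in the table; the dictionary structure is illustrative.

```python
# Verify the headcount/FTE enrollment ratio column of Table 5-2.
# (headcount, FTE enrollment) pairs as reported in the table.
systems = {
    "Florida":  (793_517, 394_818),  # 2005 data
    "Illinois": (363_204, 218_891),  # 2004 data
}

for name, (headcount, fte) in systems.items():
    ratio = round(headcount / fte, 2)
    print(name, ratio)
# Florida 2.01
# Illinois 1.66
```

Both ratios match the table, confirming that the reported figures are internally consistent.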
REFERENCES

Achieving the Dream. (n.d.). Retrieved January 15, 2007, from http://www.achievingthedream.org/default.tp

Adelman, C. (2005). Moving into town and moving on: The community college in the lives of traditional-age students. Washington, DC: U.S. Department of Education. Retrieved October 26, 2007, from http://www.eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/28/07/dd.pdf

Alexander, F. K. (2000). The changing face of accountability. Journal of Higher Education, 71(4), 411-429.

Alfred, R., Ewell, P., Hudgins, J., & McClenney, K. (1999). Core indicators of effectiveness for community colleges (2nd ed.). Washington, DC: Community College Press.

Alfred, R., Shults, C., & Seybert, J. (2007). Core indicators of effectiveness for community colleges (3rd ed.). Washington, DC: Community College Press.

American Association of Community Colleges. (n.d.). Performance-based funding: A review of five states. Retrieved November 11, 2005, from http://www.aacc.nche.edu/Content/NavigationMenu/AboutCommunityColleges/WhoAreYou/Researchers/Performance_Based_Funding__A_Review_of_Five_States__South_Carolina.htm

American Association of Community Colleges. (2007). Fast facts. Retrieved October 21, 2007, from http://www2.aacc.nche.edu/research/index.htm

Bailey, T. (2005, July). The new economy workforce: Meeting industry demand with higher education reforms in Washington State. Paper presented at the Education & Productivity, 21st Century Workforce Conference, Seattle, WA. Retrieved from http://ccrc.tc.columbia.edu/Publication.asp?uid=333

Bailey, T., Leinbach, D., & Jenkins, D. (2006). Is student success labeled institutional failure? Student goals and graduation rates in the accountability debate at community colleges (CCRC Working Paper No. 1). Retrieved August 24, 2007, from http://www.unt.edu/transferinstitute/content/articles_essays/Bailey_StudentSuccess.pdf

Beer, M., & Cannon, M. D. (2004). Promise and peril in implementing pay-for-performance. Human Resource Management, 43(1), 3-48.

Bok, D. (1992). Reclaiming the public trust. Change, 24(4), 13-20.

Boswell, K., & Wilson, C. (Eds.). (2004). the community college. Denver, CO: Education Commission of the States.
Bradley, P. (2007, June 18). Measuring performance [Electronic version]. Community College Week, 19(21), 6-8.

Broom, C. A. (1995). Performance-based government models: Building a track record. Public Budgeting & Finance, Winter, 3-17.

Bureau of Labor Statistics. (2003). Occupational outlook handbook. Washington, DC: U.S. Department of Labor. Retrieved November 12, 2006, from http://www.bls.gov/oco/home.htm

Burke, J. C. (Ed.), and Associates. (2002). Funding public colleges and universities for performance: Popularity, problems, and prospects. Albany, NY: Rockefeller Institute Press.

Burke, J. C. (Ed.). (2005a). Achieving accountability in higher education. San Francisco: Jossey-Bass.

Burke, J. (2005b). Reinventing accountability. In J. Burke and Associates (Eds.), Funding public colleges and universities for performance (pp. 216-245). Albany, NY: Rockefeller Press.

Burke, J. C., & Minassians, H. (2001). accountability program (Sixth Annual Report, Nelson A. Rockefeller Institute of Government, Call No: GOV 012.1 4 Perre 203 2382). Retrieved June 17, 2007, from http://www.nysl.nysed.gov/scandoclinks/ocm51638216.htm

Burke, J. C., Modarresi, S., & Serban, A. (1999, November/December). it count for something in state budgeting? Change, 455-461.

Burke, J. C., Rosen, J., Minassians, H., & Lessard, T. (2000). Performance funding and budgeting: An emerging merger? The fourth annual survey (2000). New York: Nelson A. Rockefeller Institute of Government, State University of New York.

Callan, P., Doyle, W., & Finney, J. (2000). Evaluating state higher education performance: Measuring up 2000. San Jose, CA: Center for Public Policy and Higher Education. Retrieved October 18, 2006, from http://web.ebscohost.com/ehost/pdf?vid=2&hid=112&sid=4a1ee323-c902-44d0-82df-55cb16bb7b84%40sessionmgr106

Callan, P. M., Finney, J. E., Kirst, M. W., Usdan, M. D., & Venezia, A. (2006). Claiming common ground: State policymaking for improving college readiness and success. Stanford, CA: The Stanford Institute for Higher Education Research. Retrieved October 21, 2007, from http://www.stanford.edu/group/bridgeproject/Claim%20Comm%20Grnd%20Rpt%20FINAL%2003%2029%2006.pdf

Canadian Council on Learning. (2006). How is quality post-secondary education measured? Retrieved August 4, 2007, from http://search.ccl-cca.ca/NR/rdonlyres/BE1A9DDD-E099-4FC7-B5B5-C31B9800C537/0/22QualityMeasures.pdf
Carter, K. (1994, December). Performance budgets: Here by popular demand [Electronic version]. State Legislatures, 20(12), 22.

Cleary, T. S. (2001). Indicators of quality [Electronic version]. Planning for Higher Education, 29(3), 19-28.

Community College Data Dictionary. (2005). Accountability, Research and Measurement, Florida Department of Education Strategic Plan (2005). Retrieved September 12, 2007, from http://www.fldoe.org/arm/cctcmis/pubs/ccdictionary/dictionary_main.asp

Community College Budget Office. (2006, June 9). Community college funding model. Tallahassee: Florida Department of Education.

Community College Office of Budget and Financial Services. (2006, April). Performance funding report for 2006-2007. Tallahassee: Florida Department of Education.

Community College Policy Center, Education Commission for the States. (2000, November). State funding for community colleges: A 50-state survey. Retrieved December 14, 2005, from http://www.communitycollegepolicy.org/pdf/CC%20Finance%20Survey.pdf

Council of Regional Accrediting Commissions. (2004). Regional accreditation and student learning: Improving institutional practice [Brochure]. Retrieved October 14, 2007, from http://www.utexas.edu/provost/sacs/pdf/ImprovingPractice.pdf

Demographia. (n.d.). Retrieved October 21, 2007, from http://www.demographia.com/db-2000stater.htm

DePalma, A. (1992, April 5). Critics say institutions spend carelessly, teach poorly and plan myopically. The New York Times. Retrieved October 18, 2006, from http://proquest.umi.com

Diamond, R. M. (2006, September 8). Why colleges are so hard to change. Inside Higher Ed. Retrieved October 18, 2006, from http://insidehighered.com/layout/set/print/views/2006/09/08/diamond

Dickeson, R. C. (2006). Frequently asked questions about college costs [Issue paper]. Retrieved September 9, 2007, from http://ed.gov/about/bdscomm/list/hiedfuture/reports/dickeson2.pdf

Drury, R. L. (2003). Community colleges in America: A historical perspective. Inquiry, 8(1). Retrieved June 18, 2007, from http://www.vccaedu.org/inquiry/inquiry-spring2003/i-81-drury.html

Dwyer, C., Millett, C., & Payne, D. (2006). A culture of evidence: Postsecondary assessment and learning outcomes: Recommendations to policymakers and the higher education community. Princeton, NJ: Educational Testing Service.
Education Commission of the States. (2001). ECS state notes, governance: Postsecondary, state master/strategic plans for postsecondary education. Retrieved December 16, 2005, from http://www.ecs.org/clearinghouse/31/14/3114.doc

Florida Board of Governors. (2004, July 22). Florida Board of Governors Performance & Accountability Committee meeting presentation handout. Retrieved November 22, 2005, from http://www.flbog.org/BOG/meetings/2004_07_22/PerfBasFun.pdf

Florida Department of Education. (n.d.). A comparison of community college student demographics by program areas. Retrieved October 20, 2005, from http://www.fldoe.org/cc/OSAS/DataTrendsResearch/rn2.asp

Florida Department of Education. (n.d.). Facts at a glance report for the Florida community college system. Retrieved June 30, 2007, from http://www.fldoe.org/cc/facts_glance.asp

Florida Department of Education. (1998). Agency strategic and accountability plan, Florida community college system, fiscal years 1999-2000 through 2003-2004: A strategic plan for the millennium. Tallahassee, FL: Author.

Florida Department of Education. (2002). Florida community college system explanation of the performance-based budgeting measures used in Specific Appropriation 177. Tallahassee: Author. Retrieved November 12, 2005, from http://www.flboe.org/K20AccAdvCounc/Apr_8_02/09_HighestStuAchieve.pdf

Florida Department of Education. (2004). Internal institutional processes for Level II program review. Office of Student & Academic Success. Retrieved July 11, 2006, from http://www.fldoe.org/cc/Vision/IAAProgRevL2_full.pdf

Florida Department of Education. (2005a). Instructions for completion of performance-based deliverables and invoice schedule. Retrieved December 6, 2005, from http://www.fldoe.org/bii/curriculum/pdf/performancepf.pdf

Florida Department of Education. (2005b). The fact book: Report for the Florida community college system. Tallahassee, FL: Author. Retrieved from http://www.fldoe.org/arm/cctcmis/pubs/factbook/fb2005/factbk05.pdf

Florida Department of Education. (2006). The fact book: Report for the Florida community college system. Tallahassee, FL: Author. Retrieved July 27, 2007, from http://www.fldoe.org/arm/cctcmis/pubs/factbook/fb2006/graphic.pdf

Florida Department of Education. (2007). The fact book: Report for the Florida community college system. Tallahassee, FL: Author. Retrieved September 22, 2007, from http://www.fldoe.org/arm/cctcmis/pubs/factbook/fb2007/fb2007.pdf

Florida Department of Education, Office of Accountability, Research, and Measurement. (2007). Retrieved August 13, 2007, from http://www.fldoe.org/arm/cctcmis/pubs/ccdictionary/dictionary_main.asp
Gaither, G., Nedwek, B. P., & Neal, J. E. (1995). Measuring up: The promises and pitfalls of performance indicators in higher education. Higher Education Report Series, 23(5), 1.

George, P. (2004, February 13). Financing quality in Ontario universities. Speech to the Higher Education in Canada Conference, John Deutsch Institute for the Study of Economic Policy, Kingston, Ontario.

Goldstein, L. (2005). Approaches to budgeting. In College and university budgeting. Washington, DC: NACUBO. Retrieved October 22, 2007, from http://www.iastate.edu/~budgetmodel/readings/ApproachesToBudgeting.pdf

Grizzle, G. A., & Pettijohn, C. D. (2002). Implementing performance-based program budgeting: A system dynamics perspective. Public Administration Review, 62, 51-62.

Governmental Accounting Standards Board (GASB). (1987). Concepts statement (No. 1, p. 1). Retrieved June 14, 2007, from http://www.gasb.org

Halfhill, T. (2007). Conversion calculations. Retrieved May 31, 2006, from http://www.halfhill.com/inflation.html

(Reform effort) [Electronic version]. State Legislatures, 20(12), 26.

Illinois Community College Board. (2005). Data and characteristics of the Illinois public community college system. Retrieved October 21, 2007, from http://www.iccb.state.il.us/pdf/reports/databook2005.pdf

Jenkins, D., & Fitzgerald, J. (1998, September). Community colleges: Connecting the poor to good jobs [Electronic version]. Policy Paper.

Kong, D. (2005). Performance-based budgeting: The U.S. experience. Public Organization Review: A Global Journal, 5, 91-107.

League for Innovation in the Community College. (2004, August). An assessment framework for the community college: Measuring student learning and achievement as a means of demonstrating institutional effectiveness. Phoenix, AZ: Author.

Leveille, D. E. (2005). An emerging view on accountability in American higher education (Center for Studies in Higher Education Research & Occasional Paper Series CSHE.8.05). Retrieved July 27, 2007, from http://cshe.berkeley.edu

Levine, A. (1992). Why colleges are continuing to lose the public trust. Change, 24(4), 4.

Lingenfelter, P. (2003). Educational accountability. Change, 35(2), 18-24. Retrieved October 18, 2006, from http://ebscohost.com
Lombardi, J. V. (2006, July 18). The value of commission reports. Inside Higher Ed, Reality Check. Retrieved August 20, 2006, from http://www.insidehighered.com/views/2006/07/18/lombardi

McHewitt, E. R. (2004). SCHEV reports of institutional effectiveness: VCCS college performance measures. Retrieved October 21, 2007, from http://system.vccs.edu/vccsasr/Research/ROIE_rrs_May04.pdf

McLendon, K., Hearn, J., & Deaton, R. (2006). Called to account: Analyzing the origins and spread of state performance accountability policies in higher education. Educational Evaluation and Policy Analysis, 28(1), 1-24.

Boswell, K., & Wilson, C. (Eds.). (2004). Keeping America's promise: A report on the future of the community college. Denver, CO: Education Commission of the States.

Mercer, J. (2002). Performance budgeting for federal agencies: A framework. Retrieved December 12, 2005, from http://www.governmentperformance.info/library/Performance_Budgeting_FA.pdf

Mercer, J. (2003). Cascade performance budgeting: A guide to an effective system for integrating budget and performance information and for linking long-term goals to day-to-day activities. Retrieved December 12, 2005, from http://www.governmentperformance.info/library/cascade_pb.pdf

Morgan, J. (2003). Performance-based budgeting: A primer for legislators, May 2003. Nashville: State of Tennessee Office of Research. Retrieved June 17, 2007, from http://www.comptroller1.state.tn.us/repository/RE/pbbfinal.pdf

National Center for Education Statistics. (2005). The condition of education 2005. Washington, DC: U.S. Department of Education. Retrieved December 20, 2005, from http://nces.ed.gov/programs/coe/2005/pdf/33_2005.pdf

National Center for Education Statistics. (2006). Digest of education statistics, 2006. Washington, DC: U.S. Department of Education. Retrieved August 26, 2007, from http://nces.ed.gov/pubs2007/2007006.pdf

National Center for Educational Accountability. (n.d.). Policy implications of state data systems in 2005-06. Retrieved December 14, 2005, from http://www.nc4ea.org/index.cfm?pg=surveyresults&subp=policy

National Center for Educational Accountability. (n.d.). Results of 2005 NCEA survey of state data collection issues related to longitudinal analysis. Retrieved December 14, 2005, from http://www.dataqualitycampaign.org/activities/survey_result_2005.cfm

National Center for Educational Accountability. (n.d.). State of the nation in 2005-06. Retrieved December 14, 2005, from http://www.nc4ea.org/index.cfm?pg=surveyresults&subp=state_of_nation
National Center for Public Policy and Higher Education. (2000). Measuring up. Retrieved October 19, 2006, from http://measuringup.highereducation.org/

National Center for Public Policy and Higher Education. (2002). Measuring up. Retrieved October 19, 2006, from http://measuringup.highereducation.org/

National Center for Public Policy and Higher Education. (2004). Measuring up 2004: The national report card on higher education. San Jose, CA: Author.

National Center for Public Policy and Higher Education. (2006). Measuring up. Retrieved October 19, 2006, from http://measuringup.highereducation.org/

National Commission on Excellence in Education. (1983). A nation at risk: The imperative for educational reform. Washington, DC: U.S. Government Printing Office.

National Commission on Accountability in Higher Education. (2005). Accountability for better results: A national imperative for higher education. Retrieved July 21, 2007, from SHEEO website: http://www.sheeo.org/pubs/pubs_search.asp

Naylor, D. (2006, April 22). The Ottawa Citizen. Retrieved June 4, 2007, from http://www.canada.com/ottawacitizen/news/opinion/story.html?id=adc4a1df-d148-484f-a569-80c18039f7c6

Noland, E., Johnson, B., & Skolits, G. (2004, May 31). Changing perceptions and outcomes: The Tennessee performance funding experience [Electronic version]. Paper presented at the 44th Annual AIR Conference, Boston, MA. Retrieved March 17, 2006, from http://www.state.tn.us/thec/2004web/division_pages/ppr_pages/Research/Papers/2004.5.31.PF%20AIR%20paper.pdf

Office of Research and Information Systems (ORIS), New York State Education Department, Office of Higher Education. (1996). Performance reporting in higher education in the nation and in New York State. Retrieved July 22, 2007, from http://www.highered.nysed.gov/oris/

Performance-based funding and budgeting. (1998, October). NCR Parameters, 1, 3.

Office of Program Policy Analysis and Government Accountability. (n.d.). Performance-based program budgeting. Retrieved November 7, 2005, from http://www.oppaga.state.fl.us/budget/pb2publications.html

Office of Program Policy Analysis and Government Accountability. (n.d.). Performance-based program budgeting. Retrieved November 26, 2006, from http://www.oppaga.state.fl.us/budget/pb2.html

Office of Program Policy Analysis and Government Accountability. (1997, April). Performance-based program budgeting in context: History and comparison (Report No. 96-77A). Tallahassee: Florida Legislature.
Phillips, M. A. (2002). The effectiveness of performance-based outcome measures in a community college system. Doctoral dissertation, University of Florida, Gainesville.

Rauber, C. (2007). Blue Shield of California to pay $31 million in quality bonuses to medical groups. Retrieved August 15, 2007, from http://www.bizjournals.com/eastbay/stories/2007/08/13/daily22.html

Rosenthal, M., & Dudley, R. A. (2007). Pay for performance: Will the latest payment trend improve care? [Electronic version]. The Journal of the American Medical Association, 297, 740-744.

Commission on the Future of Higher Education. (2006, August 9). Draft report of the Commission. Retrieved August 21, 2006, from http://www.ed.gov/about/bdscomm/list/hiedfuture/reports/0809-draft.pdf

Snowdon, K. (2005). Without a roadmap: Government funding and regulation of Canada's universities and colleges. Ottawa, Ontario: Canadian Policy Research Networks Inc. Retrieved July 28, 2007, from http://www.cprn.org/documents/40781_en.pdf

Songini, M. L. (2005, February 14). Arkansas vote calls for end to budgeting scheme. Computerworld, 39, 10.

South Florida Community College. (n.d.). Placement testing. Retrieved from http://www.southflorida.edu/eductnl-stdsrvc/stdservices/services/testing/index_files/page7.html

Spellings, M. (2006, October 2). An action plan for higher education. Speech given at the National Press Club, Washington, DC.

State Higher Education Executive Officers (SHEEO). (2005). State higher education finance FY 2005. Boulder, CO: Author.

Theurer, J. (1998, July). Seven pitfalls to avoid when establishing performance measures [Electronic version]. Public Management, 80(7), 21.

United Nations Educational, Scientific and Cultural Organization. (1998, July). Towards an agenda for the twenty-first century for higher education. World Conference on Higher Education, Paris. Retrieved August 24, 2007, from http://unesdoc.unesco.org/images/0011/001173/117392e.pdf

U.S. Census Bureau. (2007). Retrieved April 22, 2007, from http://quickfacts.census.gov/qfd/states/12000.html

U.S. Department of Education, Office of the Deputy Secretary, Planning and Performance Management Service. (2002). U.S. Department of Education strategic plan 2002-2007 (DOE Publication No. ED000602P). Washington, DC: Author.
Weintraub, D. M. (1993, June). (Performance-based budgeting) [Electronic version]. State Legislatures, 19(6), 19.

Wellman, J. (2003). Statewide higher education accountability: Issues, options and strategies for success. Washington, DC: National Governors Association, Higher Expectations: Essays on the Future of Higher Education, No. 2. Retrieved August 14, 2006, from http://www.nga.org/cda/files/041503HIGHERED.pdf

Wright, D. L., Dallet, P. H., & Copa, J. C. (2002). Ready, fire, aim: Performance funding policies for public postsecondary education in Florida. In J. Burke and Associates (Eds.), Funding public colleges and universities for performance (pp. 138-168). Albany, NY: Rockefeller Press.

Yancey, G. W. (2002). Fiscal equity change in the Florida community college system during the first five years after the implementation of performance funding (Doctoral dissertation, University of Florida, 2002). Dissertation Abstracts International, 64/03, 772.
BIOGRAPHICAL SKETCH

Karen Bakuzonis was born in Lackawanna, New York. She holds a Bachelor of Science degree in medical records administration, with honors, from Daemen College and a Master of Science degree in health administration from Virginia Commonwealth University, Medical College of Virginia. Her career began in Schenectady, New York, in a management role at Ellis Hospital. As director of Medical Information Services at Geisinger Medical Center in Danville, Pennsylvania, she chaired a system-wide committee that oversaw the quality and content of medical records. Ms. Bakuzonis became the director of Medical Records Services at Shands Hospital and later joined the University of Florida Faculty Group Practice. As an administrative director, her duties included the development of more than $20 million of new primary care and specialty care clinical sites, including Pediatrics After Hours and the Eastside Clinics. Her teaching career began in 2001 as an adjunct professor at Santa Fe Community College, and she currently is program coordinator for the Health Information Technology Program at Santa Fe Community College. She received the Institute of Higher Education's L. V. Koos Scholarship Award in 2005.

As a member of the American Health Information Management Association (AHIMA), Ms. Bakuzonis chaired the Council on Certification Subcommittee of the Registered Health Information Technician (RHIT) Construction Committee, and she co-chaired the Florida Health Information Management Association Legislative Committee. She also was a member of the American Health Information Management Association e-HIM Virtual Learning Laboratory Education Advisory Council.
While her children were in school, Ms. Bakuzonis was active in the PTA, band boosters, robotics, Boy Scouts, and the school advisory committee. Ms. Bakuzonis is married to Craig W. Bakuzonis; they have two children, Jason and Eric. The family resides in Gainesville, Florida.