Performance Reporting: "Real" Accountability or Accountability "Lite": Seventh Annual Survey 2003, Joseph C. Burke and Henrik P. Minassians

Performance Reporting: "Real" Accountability or Accountability "Lite" Seventh Annual Survey 2003 Joseph C. Burke and Henrik Minassians Higher Education Program The Nelson A. Rockefeller Institute of Government State University of New York Albany, New York 2003
The Nelson A. Rockefeller Institute of Government, the public policy research arm of the State University of New York, was established in 1982 to bring the resources of the 64-campus SUNY system to bear on public policy issues. The Institute is active nationally in research and special projects on the role of state governments in American federalism and the management and finances of both state and local governments in major areas of domestic public affairs. Copyright © 2003 by The Nelson A. Rockefeller Institute of Government. Address inquiries to: Joseph C. Burke, Director, Higher Education Program, The Nelson A. Rockefeller Institute of Government, 411 State Street, Albany, New York 12203-1003. Tel.: (518) 443-5835 Fax: (518) 443-5845 E-mail: [email protected] Web: http://rockinst.org/higheduc.htm
CONTENTS

Performance Reporting: "Real" Accountability or Accountability "Lite"
Seventh Annual Survey
Joseph C. Burke and Henrik Minassians

Introduction . . . 1
Performance Programs Prevail . . . 2
The Questionnaire . . . 2
Performance Budgeting and Performance Funding . . . 3
Methods of Initiation . . . 4
Performance Funding . . . 5
Performance Budgeting . . . 7
State Report Cards Spur Performance Reporting . . . 11
State Performance Programs and the State Report Card . . . 14
State Scores and Sources of Successes and Shortcomings . . . 16
Impact on Campus Performance . . . 17
A Common and Fatal Flaw . . . 20
Findings . . . 21
Conclusion . . . 21
References . . . 22
Appendix A . . . 23
Appendix B . . . 27
Appendix C . . . 32
Performance Reporting: "Real" Accountability or Accountability "Lite"
Seventh Annual Survey*
Joseph C. Burke and Henrik P. Minassians

§ Introduction

The Seventh Annual Survey of State Higher Education Finance Officers (SHEFOs) shows the continuing triumph of performance reporting and the continuing trials of performance budgeting and funding. Performance reporting spread to all but four states, while performance funding and budgeting experienced deeper declines. The triumph of reporting had three apparent causes. First, the publication of Measuring Up 2000 and 2002 popularized performance reporting (The National Center 2000, 2002). Second, continuing budget shortfalls eroded support for performance funding and budgeting. Third, bad budgets encouraged some state policymakers to see performance reporting as the "no cost" accountability program -- an alternative to performance funding and budgeting.

We began our SHEFO Surveys in 1997 based on the belief that the maxim "what gets measured is what gets valued" was only half right. Only what gets "funded," "budgeted," or possibly "reported" can attract attention on college campuses and in state capitols and affect higher education performance. We considered reporting only a possible inducement, requiring proof, while the attention given to funding or budgeting in state capitols and on college campuses is legendary. This year's survey confirms the conclusion that performance reporting is by far the preferred approach to accountability for higher education. Clearly, it has covered the country, but is its impact on performance, planning, and policymaking in higher education shallow or deep? Performance reports rely on information to improve performance. Information is powerful, but its force depends on its use. Our survey questions SHEFOs on the use of performance reports for planning, policymaking, and budgeting and on their effect on performance.
The SHEFO responses provide no clear conclusions, but they do offer clues on whether performance reporting as currently used represents real accountability or only accountability "lite."

§ Summary of Results

Performance funding and performance budgeting flourished during the 1990s. Our surveys showed that performance funding nearly doubled from 10 programs in 1997 to 19 in 2001. They also revealed that performance budgeting more than doubled from 16 to 28 programs from 1997 to 2000. The recent recession contributed to the decline in both programs. Performance funding fell by only one program last year, but lost three more this year. Performance budgeting slipped by one program in both 2001 and 2002, but dropped a net of five in this year's survey. Meanwhile, performance reporting jumped from 30 to 46 programs in the last three years.

* Study supported by the Ford Foundation.
Higher Education Program -- Rockefeller Institute of Government

Initially, we viewed performance reporting as a "halfway stop" on the road to performance budgeting or funding. Information arouses attention, but money levers action. Some statistics supported this conclusion. More than two-thirds of the states with performance funding and budgeting in 1999 also had performance reporting. Moreover, the adoption of performance reporting preceded the initiation of 44 percent of the performance funding and 49 percent of the performance budgeting programs. This year's survey confirms a conclusion suspected last year (Burke & Minassians 2002a). Far from being a precursor to other performance programs, performance reporting is now perceived as a less controversial and less costly alternative. All three of the states that lost performance funding and four of the five that ended performance budgeting in this year's Survey already had performance reporting. The SHEFO from the fifth state said that the Legislature, after lobbying from the university system, directed the shift from performance budgeting to performance reporting. All eight of the states that ended performance funding or budgeting retained only performance reporting. In a few short years, performance reporting has gone from the precursor to the preferred approach to accountability.

This year, SHEFOs claimed that bad budgets spelled bad news for performance funding, which often relies on additional state allocations, and even for performance budgeting, which merely considers performance in allocating state appropriations. The finance officers attributed most of the losses in both of those programs to fiscal problems in their states. A recent Budget Update by the National Conference of State Legislatures (NCSL) supports their conclusions (2003). Its charts suggest that most of the states dropping performance funding and budgeting this year faced more serious fiscal problems than those that retained them (NCSL, pp. 5-6, 18-19, 21-24).

§ The Questionnaire

Staff of the Higher Education Program at the Rockefeller Institute of Government have conducted telephone surveys of SHEFOs or their designees for the last seven years, with an annual response rate of 100 percent. The polling generally came in May, June, or July, although the Sixth Survey occurred in August. The questions focus on the current status, future prospects, and perceived impact of performance funding, budgeting, and reporting in the fifty states (see Appendix A for the 2003 questionnaire). The interviews begin with definitions that distinguish performance funding from performance budgeting. The questioner then asks whether a state currently has performance funding, budgeting, or reporting. If it has one or more of these programs, the interviewer asks the finance officer to predict whether the program or programs will continue for the next five years. If no program exists, the question changes to the likelihood of adopting the policy. "Highly likely," "likely," "unlikely," "highly unlikely," and "cannot predict" constitute the choices for all of these questions. Interviews also ask whether legislation mandates performance funding, budgeting, or reporting and whether it prescribes their indicators. In addition, respondents identify the primary initiator of these programs, choosing from governor, legislature, coordinating or governing board, university or college systems, or "other." Two years ago, the survey started asking respondents to assess the effect of the three programs on improving campus performance. The options are great, considerable, moderate, minimal, no extent, or cannot judge the extent. This year's Survey added several new questions. The first asks whether state allocations for performance funding have been maintained, reduced, or suspended because of declining state revenues. Two other questions inquired about the extent of consideration given by state
government leaders and by coordinating or governing boards to the performance reports in planning and policymaking.

§ Definitions

Performance funding and budgeting add institutional performance to the traditional considerations in state allocations to public colleges and universities of current costs, student enrollments, and inflationary increases. The latter represent input factors that ignore outputs and outcomes, such as the quantity and quality of graduates and the range and benefits of services to states and society. Some states previously adopted programs that front-ended funding to encourage desired campus activities, which we call initiative funding. Performance funding and budgeting depart from these earlier efforts by allocating resources for achieved rather than promised results (Burke & Serban 1997; Burke & Associates 2002). Our annual surveys distinguish performance funding from performance budgeting by using the following definitions:

· Performance funding ties specified state funding directly and tightly to the performance of public campuses on individual indicators. Performance funding focuses on the distribution phase of the budget process.

· Performance budgeting allows governors, legislators, and coordinating or system boards to consider campus achievement on performance indicators as one factor in determining allocations for public campuses. Performance budgeting concentrates on budget preparation and presentation, and often neglects, or even ignores, the distribution phase of budgeting.

In performance funding, the relationship between funding and performance is tight, automatic, and formulaic. If a public college or university achieves a prescribed target or an improvement level on defined indicators, it receives a designated amount or percent of state funding.
In performance budgeting, the possibility of additional funding due to good or improved performance depends solely on the judgment and discretion of state, coordinating, or system officials. The advantages and disadvantages of each are the reverse of the other's. Performance budgeting is flexible but uncertain. Performance funding is certain but inflexible. Despite these definitions, confusion often arises in distinguishing the two programs. Moreover, at times, the connection between state budgets and campus performance in performance budgeting almost disappears. The allocations determined by either program are usually quite small, running from less than one to seldom more than five percent. Current costs, student enrollments, and inflationary increases still set the lion's share of state funding for public colleges and universities.

Performance reporting represents a third approach to accountability for higher education. These periodic reports recount statewide, and often institutional, results of public higher education on priority indicators similar to those found in performance funding and budgeting. On the other hand, since they have no formal link to allocations, performance reports can have a much longer list of indicators than performance budgeting and especially performance funding. The reports are usually sent to governors, legislators, and campus leaders, and increasingly appear on the websites of coordinating or system boards and of individual institutions. At times, they also include information directed to prospective undergraduate students and their parents. Performance reporting relies on information and publicity rather than funding or budgeting to encourage colleges and universities to improve their performance (Burke & Minassians 2002b). It is less
controversial than performance funding or budgeting on campuses, which helps to explain its increasing popularity in state capitols.

§ Methods of Initiation

Three methods exist for initiating performance funding, budgeting, and reporting.

· Mandated/Prescribed: legislation mandates the program and prescribes the indicators.

· Mandated/Not Prescribed: legislation mandates the program but allows state coordinating agencies, in cooperation with campus leaders, to propose the indicators.

· Not Mandated: coordinating or system boards, in collaboration with campus officials, adopt the plan without legislation.

Legislation mandated many of the early programs in performance funding and, in many cases, also prescribed the indicators. Now over 53 percent of the funding programs are not mandated. Of the mandated programs, only 27 percent prescribe the indicators. Performance budgeting shows 57 percent mandated, with just 10 percent prescribing the indicators. Nearly two-thirds of the performance reports are mandated, and 24 percent of them specify the indicators. Mandates, and especially prescriptions, clearly undermine program support in the academic community. They are imposed by state officials and ignore the importance of consultation with coordinating, system, and campus leaders. On the other hand, not mandated programs can leave state policymakers without a sense of ownership of the initiatives. No consultation means no consent, especially on college campuses and in state capitols. New management theories suggest that government officials should decide state policy directions for public higher education and evaluate performance, but leave the method of achieving designated goals to coordinating, college and university system, and campus officers (see Appendix B for methods of initiation and dates).
It is interesting to note that -- this year -- the three abandoned performance funding programs and all but one of the dropped performance budgeting efforts came in states without legislative mandates. Apparently, coordinating and governing boards chose to cut their own creations when confronted with serious budget problems. These reactions suggest that performance funding and budgeting are regarded as discretionary programs, which usually lose in budget battles to funding base operations. They may also suggest that non-mandated programs are less stable than mandated initiatives, which would conflict with previous studies on program stability that reached the opposite conclusion (Burke & Modarresi 2000). Changing a board policy is always easier than altering a state statute.

§ Performance Funding

This year's Survey shows that the number of performance funding programs dropped from 18 to 15, with losses in Illinois, Missouri, and New Jersey (Table 1). SHEFOs from all three states say budget problems caused the program closings. The loss of Missouri's program is a shocker, even though its SHEFO last year could not predict its continuance. Missouri's coordinating board and state government leaders have long championed their "Funding for Results" as a model for the nation. National observers have lauded that program -- begun over a decade ago -- as one of the best and most stable in the country (Burke & Associates 2002; Burke & Modarresi 2000; Stein 2000; Stein & Fajen 1995). Only Tennessee's plan had a longer life and a comparable reputation.
Table 1. States with Performance Funding
First 1997 -- 10 states (20%): Colorado, Connecticut, Florida, Kentucky, Minnesota, Missouri, Ohio, South Carolina, Tennessee, Washington

Second 1998 -- 13 states (26%): Colorado, Connecticut, Florida, Illinois*, Indiana, Louisiana, Missouri, Ohio, Oklahoma, South Carolina, South Dakota, Tennessee, Washington

Third 1999 -- 16 states (32%): California*, Connecticut, Florida, Illinois*, Kansas, Louisiana, Missouri, New Jersey, New York**, Ohio, Oklahoma, South Carolina, South Dakota, Tennessee, Texas, Virginia

Fourth 2000 -- 17 states (34%): California*, Colorado, Connecticut, Florida, Illinois*, Kansas, Louisiana, Missouri, New Jersey, New York**, Ohio, Oklahoma, Pennsylvania, South Carolina, South Dakota, Tennessee, Texas

Fifth 2001 -- 19 states (38%): Arkansas, California*, Colorado, Connecticut, Florida, Idaho, Illinois*, Kansas, Louisiana, Missouri, New Jersey, New York**, Ohio, Oregon, Pennsylvania, South Carolina, South Dakota, Tennessee, Texas

Sixth 2002 -- 18 states (36%): Colorado, Connecticut, Florida, Idaho, Illinois*, Kansas, Louisiana, Missouri, New Jersey, New York**, Ohio, Oklahoma, Oregon, Pennsylvania, South Carolina, South Dakota, Tennessee, Texas

Seventh 2003 -- 15 states (30%): Colorado, Connecticut, Florida, Idaho, Kansas, Louisiana, New York**, Ohio, Oklahoma, Oregon, Pennsylvania, South Carolina, South Dakota, Tennessee, Texas
* 2-year colleges only ** State University System Only
The NCSL Budget Update suggests that not all states with budget gaps and higher education cuts abandoned their performance funding programs (Appendix C, Table 1). For example, Tennessee, despite a $102 million cut for higher education, not only retained its program but also maintained its funding level at 5.45 percent of state operating support (NCSL 2003, p. 24). Other states with budget problems kept their programs, while reducing or suspending funding. Idaho, Oklahoma, Ohio, and South Carolina reduced, while Kansas, Oregon, South Dakota, and Texas suspended funding. Despite the budget pressures and the preference for performance reporting, the statistics on continuation of performance funding show the highest percentage of highly likely ratings in the last few years (Table 2). These ratings are surprising, given the consensus that state funding for higher education will remain depressed for some time even after economic recovery. The highly likely number is the only good news on program continuation. Placement of one fifth of the current programs in the cannot predict category suggests a troubling future for performance funding. The presence of South Carolina in that group is nothing less than astonishing. Since 1996, the coordinating commissioners and legislative leaders have vigorously championed their
controversial program in the face of almost universal criticism from national experts and campus officials in South Carolina (Burke & Associates 2002, Chapter Nine). Now budget problems have pushed that program's future into the cannot predict column.
Table 2. Likelihood of Continuing Performance Funding

2001
Highly Likely -- 37% (7): Colorado, Florida, Idaho, Illinois, Pennsylvania, Tennessee, Texas
Likely -- 58% (11): Arkansas, Connecticut, Kansas, Louisiana, Missouri, New Jersey, New York, Ohio, Oregon, South Carolina, South Dakota
Cannot Predict -- 5% (1): California

2002
Highly Likely -- 55.6% (10): Colorado, Connecticut, Florida, Idaho, Louisiana, Oklahoma, Pennsylvania, South Dakota, Tennessee, Texas
Likely -- 27.8% (5): Illinois, Kansas, New York, Oregon, South Carolina
Unlikely -- 5.6% (1): Missouri
Cannot Predict -- 11.1% (2): New Jersey, Ohio

2003
Highly Likely -- 60% (9): Colorado, Connecticut, Florida, Idaho, Kansas, Oklahoma, Pennsylvania, South Dakota, Tennessee
Likely -- 20% (3): Louisiana, New York, Texas
Cannot Predict -- 20% (3): Ohio, Oregon, South Carolina
The best news on future adoption is that Missouri is likely to readopt performance funding (Table 3). The next best news may well be that SHEFOs could not predict whether ten states would or would not adopt performance funding in the next five years. The unpredictables become good news only when four times as many states are unlikely, rather than likely, to initiate the program.
Table 3. Likelihood of Adopting Performance Funding*

2001
Highly Likely -- 9.5% (3): Kentucky, Oklahoma, West Virginia
Likely -- 13% (4): Alaska, Utah, Virginia, Wisconsin
Unlikely -- 26% (8): Arizona, Indiana, Maryland, Nebraska, Nevada, New Mexico, Washington, Wyoming
Highly Unlikely -- 16% (5): Delaware, Iowa, Montana, New Hampshire, North Dakota
Cannot Predict -- 35.5% (11): Alabama, Georgia, Hawaii, Maine, Massachusetts, Michigan, Minnesota, Mississippi, North Carolina, Rhode Island, Vermont

2002
Likely -- 6.3% (2): Alaska, West Virginia
Unlikely -- 28.1% (9): Georgia, Maryland, Mississippi, Montana, North Carolina, Utah, Vermont, Washington, Wyoming
Highly Unlikely -- 37% (12): Alabama, Arizona, California, Delaware, Iowa, Kentucky, Nebraska, Nevada, New Hampshire, North Dakota, Rhode Island, Wisconsin
Cannot Predict -- 28.1% (9): Arkansas, Hawaii, Indiana, Maine, Massachusetts, Michigan, Minnesota, New Mexico, Virginia

2003
Highly Likely -- 3% (1): New Mexico
Likely -- 11% (4): Alaska, Missouri, Utah, West Virginia
Unlikely -- 40% (14): Arizona, Georgia, Hawaii, Iowa, Maryland, Minnesota, Montana, Nebraska, North Carolina, North Dakota, Rhode Island, Washington, Wisconsin, Wyoming
Highly Unlikely -- 17% (6): California, Delaware, Indiana, Kentucky, Nevada, New Hampshire
Cannot Predict -- 29% (10): Alabama, Arkansas, Illinois, Maine, Massachusetts, Michigan, Mississippi, New Jersey, Vermont, Virginia

* Percent based on the number of states without a performance funding program.
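The percentage bases in Tables 1 and 3 differ: Table 1 computes shares of all fifty states, while Table 3, per its footnote, computes shares of only the states without a performance funding program. A minimal Python sketch of that arithmetic, using the 2003 counts from the two tables:

```python
# Worked check of the survey's percentage bases (a sketch; the counts
# are taken from the 2003 rows of Tables 1 and 3).
TOTAL_STATES = 50

# Table 1: 15 states had performance funding in 2003.
with_funding = 15
funding_share = 100 * with_funding / TOTAL_STATES      # 30.0 -> "15 States (30%)"

# Table 3 uses the 35 states WITHOUT performance funding as its base.
without_funding = TOTAL_STATES - with_funding          # 35
unlikely_count = 14                                    # states rated "unlikely" in 2003
unlikely_share = 100 * unlikely_count / without_funding  # 40.0 -> "40% (14)"

print(funding_share, unlikely_share)
```

The same convention explains why a single state can account for 3 percent in Table 3 (1 of 35) but would be 2 percent of all fifty states.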
§ Performance Budgeting

Five states -- Arkansas, Illinois, Missouri, North Carolina, and Virginia -- abandoned performance budgeting since the last Survey, and Vermont never implemented its program from last year (Table 4). Minnesota started a new program this year, giving performance budgeting a net loss of five. SHEFOs from Arkansas, Illinois, Missouri, and Virginia claim that budget problems caused the demise of performance budgeting in their states. The respondent from North Carolina says its program simply expired over time from lack of use.
Table 4. States with Performance Budgeting

First 1997 -- 16 states (32%): Colorado, Florida, Georgia, Hawaii, Idaho, Illinois, Indiana, Iowa, Kansas, Mississippi, Nebraska, North Carolina, Oklahoma, Rhode Island, Texas, West Virginia

Second 1998 -- 21 states (42%): Colorado, Florida, Georgia, Hawaii, Idaho, Illinois, Indiana, Iowa, Kansas, Louisiana, Maine, Mississippi, Nebraska, North Carolina, Oklahoma, Oregon, Rhode Island, South Dakota, Texas, Washington, West Virginia

Third 1999 -- 23 states (46%): Connecticut, Florida, Georgia, Hawaii, Idaho, Illinois, Indiana, Iowa, Kansas, Louisiana, Maine, Massachusetts, Michigan, Nebraska, New Jersey, New Mexico, North Carolina, Oklahoma, Oregon, Texas, Virginia, Washington, West Virginia

Fourth 2000 -- 28 states (56%): Alabama, California, Connecticut, Florida, Georgia, Hawaii, Idaho, Illinois, Iowa, Kansas, Louisiana, Maine, Maryland, Massachusetts, Michigan, Mississippi, Missouri, Nebraska, Nevada, New Jersey, New Mexico, North Carolina, Oklahoma, Oregon, Texas, Utah, Virginia, Wisconsin

Fifth 2001 -- 27 states (54%): Alabama, California, Connecticut, Florida, Georgia, Hawaii, Idaho, Illinois, Iowa, Kansas, Louisiana, Maine, Maryland, Michigan, Mississippi, Missouri, Nebraska, Nevada, New Mexico, North Carolina, Oklahoma, Oregon, Texas, Utah, Virginia, Washington, Wisconsin

Sixth 2002 -- 26 states (52%): Arkansas, California, Connecticut, Florida, Georgia, Hawaii, Idaho, Illinois, Iowa, Kansas, Louisiana, Maine, Maryland, Michigan, Mississippi, Missouri, Nebraska, Nevada, New Mexico, North Carolina, Oklahoma, Texas, Utah, Vermont, Virginia, Wisconsin

Seventh 2003 -- 21 states (42%): California, Connecticut, Florida, Georgia, Hawaii, Idaho, Iowa, Kansas, Louisiana, Maine, Maryland, Michigan, Minnesota, Mississippi, Nebraska, Nevada, New Mexico, Oklahoma, Texas, Utah, Wisconsin
The budget gaps for FY 2003 and 2004 from the Budget Update allow a comparison of the fiscal problems of the states that kept and those that ended performance budgeting (see Appendix C, Table 2). If the four states of California, Nebraska, Oklahoma, and Texas are excluded, Illinois, Missouri, and North Carolina clearly faced more difficult budget problems than the other 21 states with performance budgeting. Although the Budget Update indicates that Arkansas had no reported budget gap, its SHEFO said that budget problems led to the shift from performance budgeting to performance reporting. Moreover, the finance officer from Virginia cited budget difficulties as a reason for ending the program.
Budget difficulties not only cut the number of states with performance budgeting, they also reduced the likelihood of continuing existing programs or adopting new ones. Table 5 shows a slide in the certainty of continuing performance budgeting since 2001. SHEFOs called 63 percent of the programs highly likely to continue in 2001. This year, highly likely fell to 52.5 percent. SHEFOs now say they cannot predict the future of performance budgeting in Maryland and Nebraska. Last year that designation proved deadly, since all three of the states cited as cannot predict abandoned their programs.
Table 5. Likelihood of Continuing Performance Budgeting

2001
Highly Likely -- 63% (17): Connecticut, Idaho, Illinois, Iowa, Kansas, Louisiana, Maine, Michigan, Mississippi, Nebraska, Nevada, New Mexico, North Carolina, Oklahoma, Texas, Utah, Virginia
Likely -- 26% (7): Alabama, California, Hawaii, Maryland, Missouri, Oregon, Wisconsin
Cannot Predict -- 11% (3): Florida, Georgia, Washington

2002
Highly Likely -- 50% (13): Connecticut, Georgia, Idaho, Illinois, Iowa, Kansas, Louisiana, Michigan, Mississippi, Nevada, North Carolina, Oklahoma, Utah
Likely -- 38.5% (10): California, Florida, Hawaii, Maine, Maryland, Nebraska, New Mexico, Texas, Vermont, Wisconsin
Cannot Predict -- 11.5% (3): Arkansas, Missouri, Virginia

2003
Highly Likely -- 52.5% (11): Georgia, Hawaii, Idaho, Iowa, Kansas, Michigan, Minnesota, Mississippi, Nevada, New Mexico, Oklahoma
Likely -- 38% (8): California, Connecticut, Florida, Louisiana, Maine, Texas, Utah, Wisconsin
Cannot Predict -- 9.5% (2): Maryland, Nebraska
The prospects for new adoptions of performance budgeting in the next five years appear even less promising. For the last three years, SHEFOs forecasted no states as highly likely to adopt the program. This year, the number of highly unlikely and unlikely runs three times higher than those believed likely to institute performance budgeting. As with performance funding, performance budgeting's best hope for future expansions lies with the cannot predict category. Past trends show such hope is usually wishful thinking.
Table 6. Likelihood of Adopting Performance Budgeting*

2001
Likely -- 9% (2): Alaska, West Virginia
Unlikely -- 17% (4): Delaware, Montana, New York, South Carolina
Highly Unlikely -- 17% (4): Arizona, New Hampshire, North Dakota, Rhode Island
Cannot Predict -- 57% (13): Arkansas, Colorado, Indiana, Kentucky, Massachusetts, Minnesota, New Jersey, Ohio, Pennsylvania, South Dakota, Tennessee, Vermont, Wyoming

2002
Likely -- 16.7% (4): Alaska, Montana, Tennessee, West Virginia
Unlikely -- 33.3% (8): Alabama, Arizona, Delaware, Kentucky, North Dakota, Rhode Island, South Carolina, Washington
Highly Unlikely -- 12.5% (3): Colorado, New York, South Dakota
Cannot Predict -- 37.5% (9): Indiana, Massachusetts, Minnesota, New Hampshire, New Jersey, Ohio, Oregon, Pennsylvania, Wyoming

2003
Likely -- 14% (4): Alaska, Missouri, Washington, West Virginia
Unlikely -- 31% (9): Arizona, Arkansas, Indiana, New York, North Carolina, North Dakota, Pennsylvania, Rhode Island, Wyoming
Highly Unlikely -- 14% (4): Delaware, Kentucky, New Hampshire, South Dakota
Cannot Predict -- 41% (12): Alabama, Colorado, Illinois, Massachusetts, Montana, New Jersey, Ohio, Oregon, South Carolina, Tennessee, Vermont, Virginia

* Percent based on the number of states without a performance budgeting program.
Not surprisingly in bad budget times, the effect of performance budgeting on state funding for public higher education declines. The question asks the extent of the effect of performance budgeting on state funding for public higher education. Between 2001 and 2003, every one of the favorable categories of great, considerable, and moderate extent declined, and the considerable extent category disappears entirely in 2003 (Table 7). In the last three years, the moderate category of effect on funding drops by nearly half. This assessment of a relatively light effect on funding raises the question of why several states said they dropped the program because of budget cuts. One possible reason is that two of those states, Illinois and Missouri, claimed that performance budgeting had a considerable effect on funding allocations in 2001, and Illinois again in 2002, although the SHEFO from Missouri dropped that designation to minimal in 2002. What the table suggests is that states that retained performance budgeting simply reduced its effect on funding in years of fiscal difficulty.
Table 7. Effect of Performance Budgeting on Funding

2001
Considerable Extent -- 11% (3): Hawaii, Illinois, Missouri
Moderate Extent -- 37% (10): Connecticut, Florida, Idaho, Louisiana, Maine, Maryland, Michigan, Nevada, Oregon, Utah
Minimal Extent -- 26% (7): California, Iowa, Mississippi, Nebraska, North Carolina, Virginia, Washington
No Extent -- 11% (3): Alabama, New Mexico, Wisconsin
Cannot Judge -- 15% (4): Georgia, Kansas, Oklahoma, Texas

2002
Considerable Extent -- 3.8% (1): Illinois
Moderate Extent -- 34.6% (9): California, Hawaii, Idaho, Louisiana, Maryland, Michigan, Oklahoma, Utah, Vermont
Minimal Extent -- 34.6% (9): Connecticut, Florida, Georgia, Kansas, Missouri, Nebraska, Nevada, North Carolina, Virginia
No Extent -- 15.4% (4): Iowa, Mississippi, New Mexico, Wisconsin
Cannot Judge -- 11.5% (3): Arkansas, Maine, Texas

2003
Moderate Extent -- 19% (4): California, Hawaii, Idaho, Utah
Minimal Extent -- 57% (12): Connecticut, Florida, Kansas, Louisiana, Maryland, Michigan, Minnesota, Nebraska, Nevada, New Mexico, Oklahoma, Texas
No Extent -- 19% (4): Georgia, Iowa, Mississippi, Wisconsin
Cannot Judge -- 5% (1): Maine
§ Performance Reporting

Publication of the state-by-state Report Card, Measuring Up 2000, in November of that year by the National Center for Public Policy and Higher Education undoubtedly spurred the adoption of state performance reports. Our SHEFO Survey in 2000 -- before the appearance of the first Report Card -- showed only 30 states with performance reports. In the two years following publication of Measuring Up 2000, that number jumped to 44, nearly a 50 percent increase (Table 8). In 2002, the National Center published the second Report Card, Measuring Up 2002, after our Survey that year. This year, SHEFOs say that three more states -- Arkansas, Montana, and Nebraska -- adopted performance reporting. However, a new SHEFO from Rhode Island says that state does not have the program listed as started last year. That change gives a net increase of two programs, to 46.
Table 8. States with Performance Reporting

Fourth Survey, 2000: 30 states (60%)
Alabama, Arizona, California, Colorado, Connecticut, Florida, Georgia, Hawaii, Idaho, Illinois, Kentucky, Louisiana, Maryland, Massachusetts, Mississippi, Missouri, New Jersey, New Mexico, North Dakota, Oregon, Rhode Island, South Carolina, South Dakota, Tennessee, Texas, Utah, Washington, West Virginia, Wisconsin, Wyoming

Fifth Survey, 2001: 39 states (78%)
Alabama, Alaska, Arizona, California, Colorado, Connecticut, Florida, Georgia, Hawaii, Idaho, Illinois, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, New Jersey, New Mexico, North Carolina, North Dakota, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Tennessee, Texas, Utah, Virginia, Washington, West Virginia, Wisconsin, Wyoming

Sixth Survey, 2002: 44 states (88%)
Alabama, Alaska, Arizona, California, Colorado, Connecticut, Florida, Georgia, Hawaii, Idaho, Illinois, Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, New Hampshire, New Jersey, New Mexico, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, Washington, West Virginia, Wisconsin, Wyoming

Seventh Survey, 2003: 46 states (92%)
Alabama, Alaska, Arizona, Arkansas, California, Colorado, Connecticut, Florida, Georgia, Hawaii, Idaho, Illinois, Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, New Hampshire, New Jersey, New Mexico, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, Pennsylvania, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, Washington, West Virginia, Wisconsin, Wyoming
Continuance of the current reporting programs seems beyond doubt (Table 9). SHEFOs rate 80 percent of states with the program as highly likely, and 20 percent as likely, to retain performance reporting. (The astronomical height of this rating becomes apparent when compared with the highly likely continuance predictions of 60 percent for performance funding and 53 percent for performance budgeting.) The current coverage of performance reporting makes future adoptions difficult. Replies from Delaware, Nevada, and Rhode Island see starting the program as unlikely, while the one from New York cannot predict that state's decision.
Table 9. Likelihood of Continuing Performance Reporting

2001
Highly Likely  85% (33)  Alaska, Arizona, California, Colorado, Connecticut, Florida, Georgia, Idaho, Illinois, Kansas, Kentucky, Louisiana, Maine, Maryland, Michigan, Minnesota, Mississippi, Missouri, New Mexico, North Carolina, North Dakota, Ohio, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Tennessee, Texas, Utah, Virginia, West Virginia, Wisconsin
Likely  10% (4)  Alabama, Hawaii, Massachusetts, New Jersey
Unlikely  2.5% (1)  Wyoming
Cannot Predict  2.5% (1)  Washington

2002
Highly Likely  70.5% (31)  Alabama, Alaska, Arizona, Connecticut, Florida, Georgia, Idaho, Illinois, Indiana, Iowa, Kansas, Kentucky, Louisiana, Michigan, Minnesota, Mississippi, Missouri, New Jersey, New Mexico, North Dakota, Ohio, Oklahoma, Pennsylvania, Rhode Island, South Carolina, South Dakota, Tennessee, Utah, Virginia, West Virginia, Wisconsin
Likely  25% (11)  California, Colorado, Maine, Maryland, Massachusetts, New Hampshire, New York, North Carolina, Oregon, Texas, Vermont, Washington
Cannot Predict  4% (2)  Hawaii, Wyoming

2003
Highly Likely  80% (37)  Alabama, Arizona, Arkansas, California, Colorado, Connecticut, Florida, Georgia, Hawaii, Idaho, Illinois, Indiana, Iowa, Kansas, Kentucky, Louisiana, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, New Jersey, New Mexico, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, Pennsylvania, Tennessee, Utah, Vermont, Virginia, Washington, West Virginia, Wisconsin, Wyoming
Likely  20% (9)  Alaska, Maine, Maryland, Montana, Nebraska, New Hampshire, South Carolina, South Dakota, Texas
§ The Use of Performance Reports

Performance reports rely on information to improve higher education. The power of information to lever improvement depends on its use. Our survey questions SHEFOs on the use of performance reports for planning and policymaking by their own coordinating or system boards and by state officials. It also inquires about the coordinating boards' level of use of performance reports in allocating resources to public colleges and universities. Planning, policymaking, and budgeting are the major ways that state governments or coordinating agencies influence performance in higher education. The replies from SHEFOs show that the use of performance reporting for planning and policymaking by coordinating or system boards is far less than desirable (Table 10). Nearly half of the SHEFOs say their own agencies use these reports only moderately in planning and policymaking. About another quarter cite minimal extent, no extent, or cannot judge. Only four percent claim great extent and 20 percent considerable extent.
Table 10. Extent that Coordinating/System Governing Boards Consider Performance Reports in Their Planning & Policymaking, 2003

Great Extent  4% (2)  Arizona, Iowa
Considerable Extent  20% (9)  Alaska, Indiana, Kentucky, Missouri, North Carolina, Ohio, Vermont, West Virginia, Wisconsin
Moderate Extent  48% (22)  Alabama, Arkansas, Connecticut, Florida, Georgia, Hawaii, Idaho, Illinois, Louisiana, Maine, Maryland, Massachusetts, Montana, New Hampshire, New Mexico, North Dakota, Oklahoma, Oregon, South Carolina, South Dakota, Tennessee, Texas
Minimal Extent  15% (7)  Kansas, Minnesota, Mississippi, New Jersey, Utah, Washington, Wyoming
No Extent  6.5% (3)  California, Michigan, Virginia
Cannot Judge  6.5% (3)  Colorado, Nebraska, Pennsylvania
If the use of reporting by coordinating or system agencies for planning and policymaking is disappointing, that by state government is dismal (Table 11). No state ranks government use of the performance reports for planning and policymaking as great extent, and only West Virginia rates it as considerable. Just 30 percent of the states even use their reports to a moderate extent for planning and policymaking. Fully 39 percent of the replies assess their use as minimal or no extent; and perhaps worst of all, 28 percent cannot even judge the extent of their use. Our recent book on performance reporting -- looking at the modest use of performance reports in planning and policymaking -- puts that program in a category that political scientists call "symbolic policies" (Burke & Minassians 2002b). Symbolic policies appear to address problems while having little substantive effect.
Table 11. Extent That Government Leaders Consider Performance Reports in Their Planning & Policymaking, 2003

Considerable Extent  2% (1)  West Virginia
Moderate Extent  30% (14)  Alaska, Colorado, Connecticut, Florida, Georgia, Hawaii, Louisiana, Maryland, Michigan, Missouri, New Mexico, Oklahoma, South Dakota, Tennessee
Minimal Extent  26% (12)  Alabama, California, Illinois, Iowa, Kentucky, Montana, Nebraska, New Jersey, North Carolina, North Dakota, Texas, Washington
No Extent  13% (6)  Indiana, New Hampshire, Oregon, Pennsylvania, Virginia, Wisconsin
Cannot Judge  28% (13)  Arizona, Arkansas, Idaho, Kansas, Maine, Massachusetts, Minnesota, Mississippi, Ohio, South Carolina, Utah, Vermont, Wyoming
If the effect of performance reporting on planning and policymaking is far less than desired, SHEFOs' estimates of its consideration in allocating state resources are more than expected (Table 12). After all, several of them had suggested that some legislators now regarded performance reporting as a "no cost" alternative to performance funding or budgeting. Although 65 percent of the responses claim minimal or no extent and four percent cannot judge, nine percent say performance reports are considered to a considerable extent, and 22 percent to a moderate extent, in budget allocations.
Table 12. Extent that Coordinating/System Governing Boards Consider Performance Reports in the Allocation of Resources to Campuses, 2003

Considerable Extent  9% (4)  Colorado, North Carolina, South Dakota, West Virginia
Moderate Extent  22% (10)  Alaska, Florida, Hawaii, Illinois, Kentucky, Maryland, Missouri, Pennsylvania, South Carolina, Tennessee
Minimal Extent  30% (14)  Alabama, California, Connecticut, Georgia, Idaho, Louisiana, Maine, Nebraska, New Mexico, Oklahoma, Texas, Utah, Wisconsin, Wyoming
No Extent  35% (16)  Arizona, Arkansas, Indiana, Iowa, Kansas, Massachusetts, Michigan, Minnesota, Mississippi, Montana, New Hampshire, New Jersey, North Dakota, Oregon, Virginia, Washington
Cannot Judge  4% (2)  Ohio, Vermont
This estimate may well exaggerate the use of performance reporting in considering state allocations. It seems highly unlikely that coordinating boards would give even this limited extent of consideration to performance reports in budgeting at a time of deep budget difficulties. Of the four states listed as considering reports to a considerable extent in allocations, Colorado and North Carolina faced large projected budget gaps for 2004. Moreover, four of the ten states
cited as moderate extent had large budget gaps either in 2003 or 2004 (NCSL 2003, pp. 5-6, 18-19). In addition, coordinating boards in Illinois and Missouri abandoned both performance funding and budgeting because of budget problems, but -- in this same year -- they are also named as considering performance reports to a moderate extent in budget allocations. A survey in our book on performance reporting shows that the use of reports for budgeting by state, coordinating, and campus leaders trailed the moderate use for planning and policymaking (Burke & Minassians 2002b, pp. 66-67). One explanation for the unexpected number of states that considered the reports for allocations is possible confusion of the effect of performance reporting with that of performance funding, since they often used the same indicators. Colorado and South Dakota, which also have performance funding, constitute half of the four states where performance reports are cited as having a considerable effect on budgeting. Moreover, four of the ten in the moderate column also have performance funding: Florida, Pennsylvania, South Carolina, and Tennessee. In addition, Illinois and Missouri had performance funding until this year. On the other hand, SHEFOs in five other states with performance funding cited minimal extent, no extent, or cannot judge on the issue of considering performance reports in state allocations. Sorting out the confusion created by these responses requires more study. The best bet is that most policy makers probably view performance reporting as a "no cost" alternative to performance funding and budgeting. Still, some may see it as a milder version of performance budgeting, which allows them to consider performance in budgeting without committing to a formal program.
§ Measuring Up and Performance Reporting

Obviously, the publication of Measuring Up 2000 enhanced the popularity of state performance reports (National Center for Public Policy and Higher Education 2000). Beginning in 2001, we tried to track whether that state-by-state Report Card would lead to revisions in the contents of state performance reports. In 2002, we began asking the SHEFOs whether their state had revised its performance report based first on Measuring Up 2000 and then on Measuring Up 2002. If they had made changes, we asked the extent of those revisions. After two years of questions, two Report Cards, and extensive field work by the National Center, SHEFOs say just seven states have revised their performance reports. Indiana, Oklahoma, Tennessee, and West Virginia changed them to a considerable extent, Maryland to a moderate extent, and Hawaii to a minimal extent. The response from Texas did not note the extent of the revision. This small number of revisions underscores the lack of connection between the Measuring Up Report Cards, which include only statewide performance, and the state performance reports, which also include system and institutional results. Our book on performance reporting proposes a limited list of common indicators to connect the state reports and Measuring Up (Burke & Minassians 2002b).

§ Impact on Campus Performance

Of course, the bottom line in assessing all performance programs is the extent to which each improves the performance of higher education. None of the three programs shows the desired impact on improvement. In fairness, bad budget years are hardly fair times to test the relative impact of reporting, funding, or budgeting on improvement. But even the percentages on extent of improvement before the recent recession hardly reach the level expected from programs with performance in their title. Still, the budget problems that emerged in 2001 have clearly diminished the effect of performance funding on improving higher education. In our 2000 Survey, conducted before the beginning of the recent recession, SHEFOs said that 35 percent of those programs improved performance to a great or considerable extent (Burke, Rosen, Minassians, & Lessard 2000). That year, finance officers from Tennessee cited great extent, and those from Connecticut, Missouri, Ohio, and Oklahoma claimed considerable extent. By this year, Missouri had dropped the program and Ohio, Oklahoma, and South Carolina had reduced its funding. The three-year table below shows a steady slippage in SHEFO perceptions of the impact of performance funding on the improvement of higher education (Table 13). The categories of great and considerable extent declined, while those of moderate and minimal extent increased. In 2003, considerable extent gets 6.5 percent, moderate 40, minimal or no extent 33.5, and cannot judge 20 percent.
Table 13. Extent of Performance Funding Impact on Improved Performance of Public Colleges and/or Universities

2001
Great Extent  5% (1)  Missouri
Considerable Extent  16% (3)  Ohio, South Dakota, Tennessee
Moderate Extent  16% (3)  Connecticut, Idaho, South Carolina
Minimal Extent  16% (3)  Florida, Louisiana, Oregon
No Extent  5% (1)  New Jersey
Cannot Judge  42% (8)  Arkansas, California, Colorado, Illinois, Kansas, New York, Pennsylvania, Texas

2002
Great Extent  5.6% (1)  Connecticut
Considerable Extent  16.7% (3)  Ohio, South Dakota, Tennessee
Moderate Extent  27.8% (5)  Colorado, Idaho, Louisiana, Missouri, South Carolina
Minimal Extent  16.7% (3)  Florida, Oregon, Pennsylvania
No Extent  5.9% (1)  Kansas
Cannot Judge  27.8% (5)  Illinois, New Jersey, New York, Oklahoma, Texas

2003
Considerable Extent  6.5% (1)  Tennessee
Moderate Extent  40% (6)  Colorado, Idaho, Louisiana, Ohio, South Carolina, South Dakota
Minimal Extent  27% (4)  Florida, Oklahoma, Pennsylvania, Texas
No Extent  6.5% (1)  Connecticut
Cannot Judge  20% (3)  Kansas, New York, Oregon
Performance budgeting also shows a declining impact on improvement (Table 14). Over 10 percent of the responses in 2001 indicate great or considerable extent and a third moderate. This year, no SHEFOs claimed effects on improvement of great or considerable extent, while the percentage citing no extent nearly doubled from last year. The highest category, moderate extent, stands at 38 percent; minimal and no extent combine for 38 percent; and cannot judge reaches 24 percent.
Table 14. Extent of Performance Budgeting Impact on Improved Performance of Public Colleges and Universities

2001
Great Extent  3.7% (1)  Missouri
Considerable Extent  7.5% (2)  Louisiana, Maine
Moderate Extent  33.3% (9)  Connecticut, Hawaii, Idaho, Illinois, Iowa, Maryland, Michigan, Oklahoma, Oregon
Minimal Extent  18.5% (5)  Florida, Mississippi, Nebraska, New Mexico, Virginia
No Extent  15% (4)  Georgia, Nevada, Washington, Wisconsin
Cannot Judge  22% (6)  Alabama, California, Kansas, North Carolina, Texas, Utah

2002
Considerable Extent  7.7% (2)  Louisiana, North Carolina
Moderate Extent  38.5% (10)  California, Hawaii, Idaho, Maryland, Michigan, Missouri, Nevada, New Mexico, Utah, Vermont
Minimal Extent  15.4% (4)  Connecticut, Illinois, Nebraska, Virginia
No Extent  7.7% (2)  Georgia, Mississippi
Cannot Judge  30.8% (8)  Arkansas, Florida, Iowa, Kansas, Maine, Oklahoma, Texas, Wisconsin

2003
Moderate Extent  38% (8)  Hawaii, Idaho, Louisiana, Maryland, Michigan, Nevada, Oklahoma, Utah
Minimal Extent  24% (5)  California, Connecticut, Nebraska, New Mexico, Texas
No Extent  14% (3)  Georgia, Iowa, Wisconsin
Cannot Judge  24% (5)  Florida, Kansas, Maine, Minnesota, Mississippi
With so many new programs started in the last three years, it is difficult to assess trends in the perceived impact of performance reporting on higher education improvement. What is clear is that the Seventh SHEFO Survey shows positive effects of little over 10 percent for great and considerable extent combined and a moderate extent of just 24 percent, while the negative ratings of minimal and no extent reached nearly 40 percent and the cannot judge category slipped slightly to 26 percent (Table 15). An impact on improvement is hardly acceptable where minimal and no extent exceed the great, considerable, and moderate extent combined.
Table 15. Extent of Performance Reporting Impact on Improved Performance of Public Colleges and/or Universities

2001
Great Extent  0%
Considerable Extent  13% (5)  Kentucky, Michigan, Missouri, South Carolina, West Virginia
Moderate Extent  36% (14)  Hawaii, Idaho, Illinois, Louisiana, Maryland, New Jersey, New Mexico, North Carolina, Pennsylvania, South Dakota, Tennessee, Utah, Virginia, Wyoming
Minimal Extent  15% (6)  Arizona, California, Florida, Massachusetts, Mississippi, Wisconsin
No Extent  8% (3)  Alabama, Rhode Island, Washington
Cannot Judge  28% (11)  Alaska, Colorado, Connecticut, Georgia, Kansas, Maine, Minnesota, North Dakota, Ohio, Oregon, Texas

2002
Great Extent  0%
Considerable Extent  13.3% (6)  Iowa, Michigan, North Carolina, South Carolina, Tennessee, West Virginia
Moderate Extent  33.3% (15)  Alaska, Colorado, Florida, (Hawaii), Illinois, Kentucky, Louisiana, Maryland, Missouri, New Mexico, South Dakota, Utah, Vermont, Washington, Wisconsin
Minimal Extent  22.2% (10)  California, Connecticut, Idaho, Massachusetts, New Hampshire, New Jersey, Oklahoma, Oregon, Pennsylvania, Wyoming
No Extent  4.4% (2)  Arizona, Mississippi
Cannot Judge  26.7% (12)  Alabama, Georgia, Indiana, Kansas, Maine, Minnesota, New York, North Dakota, Ohio, Rhode Island, Texas, Virginia

2003
Great Extent  2% (1)  Kentucky
Considerable Extent  8.5% (4)  Colorado, Michigan, West Virginia, Wisconsin
Moderate Extent  24% (11)  Iowa, Kansas, Louisiana, Maryland, Minnesota, Missouri, Ohio, South Dakota, Tennessee, Vermont, Washington
Minimal Extent  33% (15)  Alabama, Alaska, California, Connecticut, Florida, Hawaii, Idaho, Mississippi, New Hampshire, New Mexico, Oklahoma, Oregon, Pennsylvania, South Carolina, Utah
No Extent  6.5% (3)  Arizona, Georgia, Texas
Cannot Judge  26% (12)  Arkansas, Illinois, Indiana, Maine, Massachusetts, Montana, Nebraska, New Jersey, North Carolina, North Dakota, Virginia, Wyoming
§ A Common and Fatal Flaw

Our recent SHEFO Surveys suggest that the impact of these performance programs on improved results in public higher education may have slipped because of budget problems. But surveys of state and campus leaders from our other studies suggest another flaw. They show that both performance reporting and funding become increasingly invisible on campuses below the level of vice presidents, because of the failure to apply these programs to the internal academic units on campus (Burke & Associates 2002; Burke & Minassians 2002b). These studies conclude that performance funding and reporting are unlikely to improve substantially the performance of colleges and universities unless they extend funding and reporting programs to academic departments. The anomaly of all three accountability programs -- funding, budgeting, and reporting -- is that they hold states, systems, and colleges and universities responsible for performance, but campus leaders do not apply that same responsibility to the internal divisions that are largely responsible for producing institutional results.

§ Findings

SHEFOs' replies to the Seventh Survey suggest the following findings:
· Performance reporting -- which now covers all but four states -- is by far the preferred approach to accountability for higher education;
· Bad budgets for states and higher education continue to erode support for performance funding and budgeting;
· More policy makers in state government and higher education agencies seem to see performance reporting as a "no cost" alternative to performance funding and budgeting;
· Still, the responses from some SHEFOs suggest that some policy makers may view performance reporting as an informal form of performance budgeting;
· Measuring Up 2002 continues to spur interest in statewide performance reporting, but only a limited number of states are revising their reports to link them with those Report Cards;
· State governments are making only modest, and coordinating and system boards only moderate, use of performance reports in planning and policymaking;
· None of the three programs demonstrates the desirable impact on improving performance, but performance funding shows more than budgeting or reporting.

§ Conclusion

Performance reporting has become by far the preferred approach to accountability. State policy makers see it as a less controversial and less costly alternative to performance funding and budgeting. It relies on information rather than funding or budgeting as a lever to encourage desired performance in public higher education and its colleges and universities. But information is powerful only if used. The findings from the Seventh SHEFO Survey and our recent book suggest that performance reports are not widely used by state and campus policy makers. To date, reporting resembles a symbolic more than a substantive reform. Only time will tell whether performance reporting represents "real" accountability that sets goals and seeks results or accountability "lite" that looks good but is less fulfilling.
§ References

Burke, J.C. & Minassians, H.P. Performance Reporting: The Preferred "No Cost" Accountability Program: The Sixth Annual Report. Albany, NY: The Rockefeller Institute, 2002a.
Burke, J.C. & Minassians, H.P. Reporting Higher Education Results: Missing Links in Performance. New Directions for Institutional Research, no. 116, December 2002b.
Burke, J.C. & Associates. Funding Public Colleges and Universities for Performance: Popularity, Problems, and Prospects. Albany, NY: The Rockefeller Institute, 2002.
Burke, J.C. & Minassians, H.P. Linking Resources to Campus Results: From Fad to Trend: The Fifth Annual Survey: 2001. Albany, NY: The Rockefeller Institute, 2001.
Burke, J.C. & Modarresi, S. "To Keep or Not to Keep Performance Funding: Signals from Stakeholders." The Journal of Higher Education 71(4): 432-454, 2000.
Burke, J.C., Rosen, J., Minassians, H. & Lessard, T. Performance Funding and Budgeting: An Emerging Merger? The Fourth Annual Survey: 2000. Albany, NY: The Rockefeller Institute, 2000.
Burke, J.C. & Serban, A.M. Performance Funding and Budgeting for Public Higher Education: Current Status and Future Prospects. Albany, NY: Rockefeller Institute of Government, 1997.
National Conference of State Legislatures. State Budget Update. Denver, CO, April 2003. http://www.ncsl.org.
The National Center for Public Policy and Higher Education. Measuring Up 2000: The State-by-State Report Card for Higher Education. San Jose, CA: The National Center for Public Policy and Higher Education, 2000.
The National Center for Public Policy and Higher Education. Measuring Up 2002: The State-by-State Report Card for Higher Education. San Jose, CA: The National Center for Public Policy and Higher Education, 2002.
Stein, R.B. "Missouri Coordinating Board for Higher Education 'Funding for Results.'" Lessons Learned from FIPSE Projects IV. Fund for the Improvement of Postsecondary Education, 2000, 2-9-220.
Stein, R.B. and Fajen, A.L. "Missouri's Funding for Results Initiative." In G.H. Gaither (ed.), Assessing Performance in an Age of Accountability: Case Studies. New Directions for Higher Education, no. 91. San Francisco: Jossey-Bass, 1995.
Appendix A

SURVEY OF STATE HIGHER EDUCATION FINANCE OFFICERS
PERFORMANCE REPORTING, FUNDING, AND BUDGETING
MAY 2003
NAME: ___________________________
STATE: ___________________________
PHONE #: ____________________
Definitions:

PERFORMANCE FUNDING: Ties specified state funding directly and tightly to the performance of public campuses on performance indicators.

PERFORMANCE BUDGETING: Allows governors, legislators, and coordinating or system boards to consider campus achievement on performance indicators as one factor in determining public campus allocations.
Section One: Performance Funding
1) Does your state currently have performance funding for public colleges and/or universities?
Yes o
No o
If Yes,
2) What is the percent of funding allocated to performance funding for public colleges and/or universities in your state? _____ %
3) Was it mandated by legislation?
Yes o
No o
4) Were the indicators prescribed by legislation?
Yes o
No o
5) Of the following, what individual or group(s) initiated performance funding?
Governor
o
Legislature
o
Coordinating board or agency o
University system(s)
o
Other (please specify)
o
6) How has the state allocation for Performance Funding been affected by decline in state revenues?
Funding Maintained o
Reduced o
Suspended o
Don't Know o
7) In your opinion, to what extent has performance funding improved the performance of public colleges and/or universities in your state?
Great Extent o
Considerable Extent o
Moderate Extent o
Minimal Extent o No Extent o
Cannot Judge o
8) How likely is it that your state will continue performance funding for public higher education over the next five years?
Highly Likely o Likely o Unlikely o Cannot Predict o
Highly Unlikely o
If no,
9) How likely is it that your state will adopt performance funding for public higher education in the next five years?
Highly Likely o Likely o Unlikely o Cannot Predict o
Highly Unlikely o
Section Two: Performance Budgeting
10) Does your state currently have performance budgeting for public colleges and/or universities?
Yes o
No o
If Yes,
11) Was it mandated by legislation?
Yes o
No o
12) Were the indicators prescribed by legislation?
Yes o
No o
13) Of the following, what individual or group(s) initiated performance budgeting?
Governor
o
Legislature
o
Coordinating board or agency o
University system(s)
o
Other (please specify)
o
14) In your opinion, to what extent has performance budgeting improved the performance of public colleges and/or universities in your state?
Great Extent o
Considerable Extent o
Moderate Extent o
Minimal Extent o No Extent o
Cannot Judge o
15) How likely is it that your state will continue performance budgeting for public higher education over the next five years?
Highly Likely o Likely o Unlikely o Cannot Predict o
Highly Unlikely o
16) How would you describe the actual effect of performance budgeting in your state on the funding of public colleges and universities?
Great Effect o
Considerable Effect o
Moderate Effect o
Minimal Effect o No Effect o
Cannot Judge o
If no,
17) How likely is it that your state will adopt performance budgeting for public higher education in the next five years?
Highly Likely o Likely o Unlikely o Cannot Predict o
Highly Unlikely o
Section Three: Performance Reporting
18) Does your state currently have performance reporting for public higher education?
Yes o
No o
If Yes,
19) Was it mandated by legislation?
Yes o
No o
20) Were the indicators prescribed by legislation?
Yes o
No o
21) Of the following, what individual or group(s) initiated performance reporting?
Governor
o
Legislature
o
Coordinating board or agency o
University system(s)
o
Other (please specify)
o
22) In your opinion, to what extent has performance reporting improved the performance of public colleges and universities in your state?
Great Extent o
Considerable Extent o
Moderate Extent o
Minimal Extent o No Extent o
Cannot Judge o
23) How likely is it that your state will continue performance reporting for public higher education over the next five years?
Highly Likely o Likely o Unlikely o Cannot Predict o
Highly Unlikely o
24) In your opinion, to what extent do the coordinating and/or system governing boards consider performance reports in the allocation of resources to colleges and universities?
Great Extent o
Considerable Extent o
Moderate Extent o
Minimal Extent o No Extent o
Cannot Judge o
25) To what extent do the coordinating and/or system governing boards consider performance reports in their planning and policymaking?
Great Extent o
Considerable Extent o
Moderate Extent o
Minimal Extent o No Extent o
Cannot Judge o
26) To what extent, do state government leaders consider the performance reports in their planning and policymaking?
Great Extent o
Considerable Extent o
Moderate Extent o
Minimal Extent o No Extent o
Cannot Judge o
27) Has your State revised its performance report based on its scores on the state-by-state report card Measuring Up 2000 & 2002, published by the National Center for Public Policy and Higher Education?
Yes o   No o
If Yes, to what extent?
Great Extent o   Considerable Extent o   Moderate Extent o   Minimal Extent o   No Extent o   Cannot Judge o
Performance Reporting: "Real" Accountability or Accountability "Lite" -- Seventh Annual Survey
28) How likely is it that your state will revise its performance report in the future based on Measuring Up 2000 & 2002?
Highly Likely o   Likely o   Unlikely o   Highly Unlikely o   Cannot Predict o
If no performance reporting,
29) How likely is it that your state will adopt performance reporting for public higher education in the next five years?
Highly Likely o   Likely o   Unlikely o   Highly Unlikely o   Cannot Predict o
Comments: ______________________________________________________________________________
Notes:
Appendix B
Characteristics of Performance Funding

State                         Adoption Year   Mandate   Indicators   Initiation
Colorado                      2000            Yes       No           Legislature
Connecticut                   1985            Yes       No           Coordinating Board
Florida                       1994            Yes       Yes          Governor, Legislature
Idaho                         2000            No        No           Coordinating Board
Kansas                        2000            Yes       No           Governor, Legislature
Louisiana                     1997            No        No           Coordinating Board
New York                      1999            No        No           University System
Ohio                          1995            Yes       Yes          Coordinating Board
Oklahoma                      1995            No        No           Coordinating Board
Oregon                        1999            No        No           Coordinating Board
Pennsylvania (State System)   2000            No        No           University System
South Carolina                1996            Yes       Yes          Legislature
South Dakota                  1997            No        No           Governor, Legislature, Coordinating Board
Tennessee                     1979            No        No           Coordinating Board
Texas                         1999            Yes       Yes          Legislature
Characteristics of Performance Budgeting

State          Adoption Year   Mandated   Indicators   Initiation
California     2000            No         No           Governor, System Boards
Connecticut    1999            Yes        No           Governor, University System
Florida        1994            Yes        No           Governor, Legislature
Georgia        1993            Yes        No           Governor
Hawaii         1975            Yes        No           Governor, Legislature
Idaho          1996            Yes        No           Legislature
Iowa           1996            Yes        No           Governor
Kansas         1995            No         No           Coordinating Board
Louisiana      1997            Yes        No           Legislature
Maine          1998            Yes        No           Governor
Maryland       2000            No         No           Governor
Michigan       1999            No         No           Governor
Minnesota      2003            Yes        Yes          Governor, Legislature
Mississippi    1992            Yes        No           Legislature
Nebraska       1991            No         No           Coordinating Board
Nevada         2000            No         Yes          Governor
New Mexico     1999            Yes        No           Legislature
Oklahoma       1991            No         No           Coordinating Board
Texas          1991            Yes        Yes          Legislature
Utah           2000            No         No           Legislature, Coordinating Board
Wisconsin      2000            No         No           Coordinating Board
Performance Reporting

State            Adoption Date   Mandated by Legislation?   Indicators Prescribed by Legislation?
Alabama          1982            No                         No
Alaska           2000            Yes                        Yes
Arizona          1995            Yes                        No
Arkansas         2003            Yes                        No
California       1991            Yes                        No
Colorado         1996            Yes                        Yes
Connecticut      2000            Yes                        No
Florida          1991            Yes                        Yes
Georgia          2000            Yes                        No
Hawaii           1996            Yes                        No
Idaho            1991            Yes                        No
Illinois         1997            No                         No
Indiana          2002            No                         No
Iowa             2002            Yes                        No
Kansas           2001            Yes                        No
Kentucky         1997            Yes                        No
Louisiana        1997            Yes                        No
Maine            2000            No                         Yes
Maryland         1991            Yes                        No
Massachusetts    1997            Yes                        No
Michigan         2000            Yes                        No
Minnesota        2000            Yes                        Yes
Mississippi      1992            Yes                        No
Missouri         1992            No                         No
Montana          2003            No                         No
Nebraska         2003            No                         No
New Hampshire    2002            No                         No
New Jersey       1994            Yes                        Yes
New Mexico       1998            No                         No
North Carolina   1991            Yes                        No
Performance Reporting (Continued)

State            Adoption Date   Mandated by Legislation?   Indicators Prescribed by Legislation?
North Dakota     1999            Yes                        No
Ohio             1999            No                         No
Oklahoma         2002            No                         No
Oregon           1997            No                         No
Pennsylvania     1997            No                         No
South Carolina   1992            Yes                        Yes
South Dakota     1995            No                         No
Tennessee        1989            No                         No
Texas            1997            Yes                        Yes
Utah             1995            Yes                        No
Vermont          2002            Yes                        No
Virginia         1995            Yes                        No
Washington       1997            Yes                        Yes
West Virginia    1991            Yes                        Yes
Wisconsin        1993            No                         No
Wyoming          1995            Yes                        Yes
Appendix C
Table 1. State Budget Gaps*

State            FY 2003 Estimated Gap (% of General Fund Budget)   FY 2004 Estimated Gap (% of General Fund Budget)
Colorado         0                                                  0
Connecticut      0                                                  0
Florida          0                                                  0
Idaho            7.9                                                8.8
Kansas           2.4                                                5.1
Louisiana        0                                                  8.5
New York         6.3                                                24
Ohio             0                                                  7.1
Oklahoma         7.8                                                5.3
Oregon           18.5                                               17
Pennsylvania     3.4                                                0
South Carolina   8.6                                                7.5
South Dakota     0                                                  0
Tennessee        5.2                                                N/A
Texas            5.8                                                12
Average          4.3                                                6.3

States Dropped
Illinois         6.5                                                13.6
Missouri         4.5                                                10.5
New Jersey       4.7                                                0
Average          5.3                                                12.05

* States with Performance Funding & States That Dropped Performance Funding
Table 2. State Budget Gaps*

State            FY 2003 Estimated Gap (% of General Fund Budget)   FY 2004 Estimated Gap (% of General Fund Budget)
California       10.9                                               20.6
Connecticut      0                                                  6.9
Florida          0                                                  0
Georgia          0                                                  5.0
Hawaii           0                                                  2.9
Idaho            7.9                                                2.9
Iowa             0                                                  0
Kansas           2.4                                                5.1
Louisiana        0                                                  8.5
Maine            0.6                                                0
Maryland         0                                                  0
Michigan         0                                                  0
Minnesota        0.04                                               15.5
Mississippi      0                                                  0
Nebraska         8.5                                                13.6
Nevada           9.8                                                N/A
New Mexico       0                                                  0
Oklahoma         7.8                                                5.3
Texas            5.8                                                12.0
Utah             0                                                  0
Virginia         0                                                  0
Wisconsin        2.5                                                N/A
Average          2.55                                               4.5

States Dropped
Arkansas         0                                                  0
Illinois         6.5                                                13.6
Missouri         4.5                                                10.5
North Carolina   0.8                                                14.0
Vermont          0                                                  0
Average          2.3                                                7.6

* States with Performance Budgeting & States That Dropped Performance Budgeting
The Nelson A. Rockefeller Institute of Government The State University of New York 411 State Street Albany, New York 12203-1003
