ACADEMICS TASK FORCE
August 28, 1996
CHE Conference Room
Task Force Members Present: Mr. Stephen Avery, Dr. John Britton, Ms. Juanita Bulloch, Mr.
Frank Gilbert, Dr. Martha Herbert, Mr. Douglas McKay, Dr. John Stockwell, Mr. Larry Wilson
Task Force Resource Persons Present: Dr. Wanda Hayes (USC-Aiken); Dr. Joe Prus (Winthrop
University); Dr. Gail Morrison, CHE Associate Commissioner for Academic Affairs; Dr. Mike
Smith, CHE Associate Commissioner for Special Projects; and Dr. Lynn Kelley (CHE-Academic Affairs), recorder.
Also present: Mr. Roger Whaley, CHE Vice Chairman; Mr. Fred Sheheen, CHE Commissioner;
faculty and staff representatives of the institutions of higher education; and representatives of state government.
The meeting was opened at 1:40 p.m. by Mr. Whaley, Vice Chairman of the Commission, who
reviewed the background and history of the legislation which produced the Task Force. He stated
that there are nine critical success factors in the legislation, and that within these are nested 37
"performance indicators." The role of the three task forces which the Commission has created is to
provide measurable definitions for all these indicators. Thirteen of these indicators are within the
scope of responsibility of the Academic Task Force. The others have been divided to be considered
by the Administrative Management or Planning/Institutional Effectiveness task forces.
Mr. Whaley stated that some of the members of the Academic Task Force may be invited to serve
on a second tier of task forces, known as "Sector-Based Benchmarking Task Forces." The second
tier of task forces will decide to what extent each of the four sectors of institutions recognized in the
new law (i.e., research, four-year comprehensive, two-year USC branches, and technical colleges) is
to be held accountable for funding purposes regarding each of the performance indicators. He gave
as an example the idea that some students do not want to achieve a degree and that the variation in
the percentages of these kinds of students probably differs by sectors. Thus, the "weight" assigned
to the performance indicator on "graduation rates" would be adjusted, presumably, by sector.
Mr. Whaley said that this process is required and will be carried out since it is the law. He said
that in the deliberations which preceded the law's passage from October 1995 until February 1996
there was a blue ribbon committee of business persons and legislators which met continuously.
They unanimously agreed to the entire report which they fashioned. He said that no other state has
accepted the challenge of making funding for public higher education 100 percent dependent upon
performance indicators and that there are, consequently, persons in these other states who either are
convinced that it cannot be done or who are waiting to see how South Carolina does it, because they
believe it can be done. He then turned the meeting over to Mr. Wilson, after telling the group that he
felt certain that Mr. Wilson's talents would make the process productive.
Mr. Wilson requested that each member of the Task Force and the resource personnel introduce
themselves and tell how they happened to come to this Task Force. Mr. Wilson then requested that
Dr. Smith review the notebooks and process. Dr. Smith did so, emphasizing the short time frame for
completing the work of the Task Force. Mr. Wilson then requested that the Task Force decide upon
the dates the Task Force will continue to meet to complete its work. The dates agreed to are as
follows: September 3, 10, and 18. All meetings will begin at 1:30 p.m. and proceed through approximately 5:30 p.m. in the Commission's main conference room. At the same time, Ms. Bulloch
agreed to serve as Vice Chairman of the Task Force and preside whenever Mr. Wilson might be
unable to attend.
Dr. Smith stated that Dr. Joseph C. Burke, a Washington-based researcher who is known as the country's expert on performance indicator funding, has put together a brief question-and-answer interview for the CHE and the Task Forces. It is included in the notebook under Section 3, and Task Force members will wish to read it at their leisure. He also thanked staff of the technical
colleges and the State Board for Technical and Comprehensive Education as well as representatives
of the Committee on Statewide Planning for putting their ideas together and providing them to the
CHE staff for distribution to the Task Force members on the performance indicators. At the request
of Dr. Carol Garrison (USC/Vice Provost), Dr. Smith stated that two copies of the notebook handed
out to Task Force members would be distributed to each campus.
Mr. Wilson stressed to the entire group that it was critical that the performance indicators be done well; otherwise they could prove dangerous, motivating institutions to develop counterproductive behaviors that serve the interests of neither the institutions nor their students.
He stated that the emphasis of the deliberations in his mind should be on the results, not the
processes, of institutional activity; that it should be with an eye on preparing persons better during
their student days to take jobs that the economy is able to absorb; and that it should not be afraid to
tackle questions such as why high-achieving graduates of our high schools leave for college and never return to South Carolina. He concluded by stating that if the legislation the Task Force is working with does not produce the desired ends, the Commission should be advised by this Task Force and should retain the flexibility to modify the process as needed to achieve the original intention of the legislation.
(1) Mission Focus: Curricula Offered to Achieve Mission
Mr. Wilson elected to discuss the indicators in the order that they appear in the legislation itself,
the first being "(1) Mission Focus: (B) Curricula Offered to Achieve Mission." Mr. Wilson
requested that Dr. Morrison provide an overview of the process used by the Commission to approve
a program proposal in the institutions of higher education. Mr. Wilson asked her what happened if
CHE did not approve a program that an institution wanted, and she responded that the institution
could appeal to the General Assembly. Dr. Britton stated that such an appeal has only been
successful once with respect to a Computer Engineering program.
Dr. Morrison also stated that the Commission has productivity standards related to degrees
awarded and enrollment for programs for both two-year and four-year institutions. The ones for the
two-year institutions include placement data as well.
Mr. Wilson asked for a discussion of mission statements. Dr. Britton asked how the Task Force
was to write appropriate indicators if mission statements had not been approved yet by CHE. Mr.
Wilson asked if research institutions had multiple missions. After considerable discussion, the
group decided that while new missions for the institutions would have to be written, existing mission
statements would be the basis for evaluating whether existing programs are aligned with the
institutional mission. Ultimately, all existing programs will have to be aligned with the new mission
Dr. Stockwell stated that programs of an institution need to be reviewed in terms of the missions of
all the institutions. Dr. Morrison and Dr. Smith concurred, stating that since the legislation
stipulates sectors of higher education, programmatic "fit" with the mission of the institution needs to
be looked at in terms of the sector's mission and the appropriateness of such programs to this sector.
Mr. Sheheen noted that duplication of programs has been noted by many persons in public life in
South Carolina. He said that critics point to many examples of this duplication, including two
medical schools and three engineering schools. Dr. Britton said that he had been asked by a faculty
member at Georgia Tech if Clemson were a private institution, since he could not believe that a
small state like South Carolina had three public institutions with schools of engineering. Mr. Wilson
stated that he wondered if the General Assembly had intended in passing this legislation the
possibility of shutting down programs which were duplicative. Mr. Sheheen stated that they did and
that Mr. Whaley, were he still in the room, would agree with that interpretation. Dr. Morrison stated
that the implementation of the legislation will mean that "mission" can no longer be interpreted so
broadly and loosely as before, because the CHE will be required to approve the mission of each
institution. Mr. Sheheen agreed and stated that if the CHE delimits missions of institutions in
keeping with the legislative intent, the institutions will necessarily begin to "shed" programs.
Dr. Stockwell talked about "general missions" of institutions within a sector and counterpoised that
idea to "specific missions" of individual institutions within the same sector. It was agreed that these are different, and it was suggested that one need only consider how different the missions of two of the
State's four-year comprehensive institutions are: The Citadel, a residential, military college; and
USC-Spartanburg, a commuter-oriented institution dealing with many nontraditionally-aged
students. Mr. Gilbert stated that although the CHE will have to approve missions of institutions, he
hopes this can be done by "backing into it" to see how well the institutions are carrying out their missions.
Mr. McKay stated that the Task Force finds itself between the mission of the institution and the
marketplace demand for graduates of these institutions. Dr. Britton and Mr. Wilson responded that
the mission needs to encompass the need for marketable graduates so that business can grow in South Carolina.
Dr. Reid Johnson (Francis Marion University) stated that in developing a measurable definition of
this performance indicator, the Task Force needs to take into account the fact that "programs" at the
institutions are defined as more than degree programs. He cited examples such as the general
education program, the remedial program, minor programs, and concentrations. Dr.
Prus (Winthrop University) stated that use of "percentage" makes a big difference in measurement
from use of "numbers" when discussing programs which do or do not support the institutional
mission. Dr. Herbert stated that the use of numbers would reward larger
institutions over smaller ones. Mr. Wilson suggested that it might work to hurt smaller ones more, too. Dr.
Herbert questioned if a negative measure might hurt smaller institutions less. Dr. Stockwell stated that there could be many small programs at an institution whose loss might not hurt nearly as much as the cutting of a single large one. All this discussion served to show how the element of human judgment needs to be built into
the performance indicators.
Mr. Avery stated that the mission against which programmatic offerings are to be judged had to be, in his
opinion, the most recently approved one. Mr. Wilson stated that another matter to be considered was the case of an institution offering all its programs within its scope of mission but not offering all the programs it should within that scope. After much further discussion, the Task Force
decided to ask the CHE Academic Affairs staff to craft language for a draft of this performance indicator that will
take into account the following elements:
the number of programs
the resources of the institution
Mr. Wilson then requested that Dr. Morrison do this and have it sent to the Task Force prior to the next
meeting. He then asked at 3:45 p.m. for a short recess.
(2) Quality of Faculty: Academic and Other Credentials of Instructors
The Task Force reconvened at 4:00 p.m. Dr. Smith requested that all persons present join in the
conversation after it had been initiated by the members of the Task Force. Mr. Wilson then directed the attention
of those present to the second performance indicator: (2) Quality of Faculty: (A) Academic and Other Credentials
of Instructors. Mr. Avery requested a recapitulation of some of the many acronyms in the drafted responses to
this performance indicator by the Statewide Planning Committee and the committee of the State Board for
Technical and Comprehensive Education.
A lengthy discussion ensued about the relevance of Southern Association of Colleges and Schools (SACS)
criteria for faculty to the definition of performance indicators as measurable bases for determining "quality."
Reference to the SACS Criteria (i.e., Manual of Accreditation) showed that the requirement of a terminal degree
in field for accreditation depends upon the level of institution. It varies from zero percent in the two-year
institutions; to 25 percent in baccalaureate institutions; to 100 percent for graduate programs, except that in all
levels provision is made for exceptional cases of persons with particular talents and experience which can serve
in lieu of the formal terminal degree credential. (Note: a copy of the relevant SACS criteria for this indicator is attached.)
Mr. McKay asked if the SACS Criteria were a "high bar" or "low bar." Concerns were expressed that they
were a low bar, since all the institutions in the State that are public are accredited by SACS. It was pointed out
that while this is true, not all accredited institutions are given equal commendation for faculty or other resources
in their evaluations by accrediting teams. Dr. Merrill (YORK TECH, President) stated that if the Task Force were
not careful, it would equate quality of teaching faculty with Ph.D. holders. In the technical colleges, he said, this
is not the case; many of the programs at these institutions do better hiring a person with a bachelor's or master's
degree and significant experience in field.
Mr. Wilson stated that SACS criteria could be used as a baseline and then institutions which exceed that
would be further rewarded. He also pointed out that the benchmarking groups would have an opportunity to
evaluate to what extent each sector was to be affected by this performance indicator definition. Dr. Merrill stated
that the SACS criteria would be a baseline for a large number of dollars to flow into the institutions. Dr. Earline
Sims (SCSU, Associate Vice President for Academic Affairs) said that objective quality indicators already exist
nationally. These are found by discipline group in such areas as membership on boards, journal articles in
refereed journals, presentations of scholarly programs, and research projects. Mr. Sheheen stated that although
Dr. Sims is correct, none of these benchmarks is accepted by all institutions and all disciplines. Therefore, he
said, there are no universally accepted benchmark measures.
Ms. Bulloch stated that there are many other measures the Task Force can look at besides SACS. She
said that in her opinion the conversation had gone off track. What the Task Force is to do, she stated, was to
focus not on the quality of the faculty members' credentials but on the quality of the faculty members' performance.
Dr. Prus said that SACS Criteria are becoming more of a "high bar" over time. Mr. Wilson asked about an
institution like MUSC which already, he said, had 100 percent of its faculty with Ph.D. degrees. He wanted to
know how an institution like that could be rewarded for having Ph.D. faculty who were more valuable because
they both possess the degree and are "master teachers." Mr. Sheheen said that this notion of funding excellence
needs somehow to be captured, if possible, by an objective, measurable definition. Dr. Britton said we need to
find ways of getting funds to the institutions to create master teachers (if they don't have them); he said the
concern cannot be with just rewarding quality. There must be a similar concern with creating quality through funding.
Dr. Morrison noted that professional licensure in such fields as engineering, nursing, or education; awards
or honors from organizations such as the National Endowment for the Humanities or National Endowment for the
Arts; Nobel and other laureates; membership in prestigious groups such as the American Academy of Sciences;
and so forth might be used as part of a basis for "credentials" in this credential-focused performance indicator, if
measurable "quality" were being looked for. Dr. Stockwell (USC-Spartanburg, Chancellor) said that some of this
might be picked up by other performance indicators such as post-tenure review, performance reviews of faculty,
and so forth. Dr. Morrison said that while she agreed with Dr. Stockwell, there was no other performance
indicator that is solidly based upon credentials.
Dr. Prus said he wanted to underscore Dr. Merrill's concern that credentials include experience. Dr.
Merrill stated again that the SACS criteria would form a good baseline for funding in this performance indicator.
Mr. Gilbert said the SACS criteria could be established as a baseline and then the institutions could be evaluated
each year to the extent that they exceeded these baselines.
Mr. Wilson asked whether credentialed faculty would be measured by the percentage of classroom hours they teach or by faculty headcount. Dr. Harry Matthews (USC-Columbia, Director of Institutional Research) said that parents look for the percentage of persons with terminal
degrees teaching their young people as a sign of quality. Dr. Stockwell said that performance indicator 3.B. also looks at student
hours taught by credentialed faculty.
The discussion moved to the issue of whether "faculty" was to be defined as the "teaching faculty" (i.e.,
those in the classroom instructional mode) or all persons holding faculty rank (i.e., research faculty, librarians,
administrators with faculty rank, teaching assistants, etc.). After a vigorous discussion, it was agreed that the latter
group was to be used as the definitional base. One issue to be addressed is whether public service faculty should
also be included in this group. While they are paid by a separate line item appropriation, many of these people
have at least some student contact.
Mr. Wilson thanked the group for their time. He said we have to assume that someone who is more
credentialed is "better" on that level alone, given the legislative concern that this be included. He asked that Dr.
Morrison prepare a draft of this performance indicator also and send it to the members of the Task Force. Given
the discussion, he concluded that headcount rather than student course or credit hours must be the basis for the
measure that is prepared.
Dr. Smith asked the Task Force for any suggestions that they might have to make the process smoother
and/or more informative. The following suggestions were offered: relevant materials such as SACS measures,
etc. should be shared in advance of the next meeting (Mr. Avery); the legislative intent of each performance
indicator should be shared with the group in session and prior to the discussion of each indicator (Mr. Wilson,
Ms. Bulloch); there should be a mechanism available to revisit the Task Force's work by the Task Force, based
upon feedback of what the other task forces have done that suggests a need for revision (Dr. Stockwell); a
timekeeping mechanism needs to be put into effect to prevent endless discussion on only one of the four indicators to be covered per session (Ms. Bulloch). Mr. Sheheen reminded the group that all their work would go to
the full Commission before it went to the General Assembly and that any task force's work can be
reviewed/revised at any time prior to the report going to the General Assembly.
The meeting was adjourned by the Chairman at 5:32 p.m.