Minutes of the Research Universities Sector Committee
October 25, 1996  9:00 a.m. - 1:30 p.m.

Members Present                         Members Absent

Mr. Larry Wilson, Chair                 Dr. Janice Bellack
Mr. William Dauksch                     Mr. Winfred Greene
Ms. Patricia McAbee                     Dr. Layton McCurdy
Dr. Walton Owens, Jr.                   Mr. Thomas Marschel
Dr. Ronald Thurston
Dr. Marcia Welsh

Staff Assisting Committee

Mr. David Fleming, Clemson University
Mr. Harry Matthews, USC-Columbia
Dr. Gail Morrison, CHE
Dr. Mike Smith, CHE
Dr. Nancy Healy-Williams, CHE
Dr. Lynn Kelley, CHE

     Mr. Larry Wilson, Chair, called the meeting to order and
indicated that the committee should view the performance
indicators as an opportunity for the higher education community to
make changes for the better.   There will be five meetings of the
Committee during which two objectives will be accomplished:

     1. determine the relative weight of the 37 performance
        indicators and assign a total value of 100 percent.
     
     2. determine the benchmarks for achieving each performance
        indicator, to include:
              satisfactory performance for an indicator
              award of progress toward satisfactory
               performance for an indicator

The Commission will weight each indicator for each institution by a
factor which will consider the size, complexity and cost of programs
at an institution.  The sector committee does not, therefore, need to
attempt to allow for enrollment or other variables in its weightings.

     The Committee's first task was to develop relative
weightings for the nine categories of performance indicators. 
There are nine categories of indicators with two to six indicators
within each category.  The first step in the process was to develop
a relative ranking of each of the categories, regardless of the
indicators within that category.  The ultimate objective in this
process was to obtain a value for each indicator such that the total
equals 100 percent.  The chair asked each member to rank each of
the nine categories on a five-point scale: highest in importance,
very important, important, less important, not important.  The
ranking of each category by the Committee, in descending order of
relative importance, was determined and is shown as follows:


     Category            Title     

         II              Quality of Faculty
         III             Instructional Quality
         V               Administrative Efficiency
         VI & IX         Entrance Requirements/Research Funding
         I & VII         Mission/Graduates' Achievements
         IV & VIII       Institutional Cooperation and Collaboration/
                         User-Friendliness of the Institution

The group's collective ratings of the nine categories show the following
values in Table 1.

                           Table 1


Highest in Importance: II Quality of Faculty, III Instructional Quality,
                       V Administrative Efficiency

Very Important:        VI Entrance Requirements, IX Research Funding

Important:             I Mission, VII Graduates' Achievements

Less Important:        IV Institutional Cooperation/Collaboration,
                       VIII User-Friendliness of Institutions

        
     Committee members discussed the importance of the various
categories and emphasized the importance of developing an initial
ranking of these categories.  Mr. Wilson indicated that the ranking will
assist the committee in focusing and thinking about the relative
importance of each category.  He further stressed that the committee's
actions represent an iterative process and that the categories should
not be viewed as separate entities but as a continuum.

     The Committee's second task was to weight individually the 37
performance indicators to develop an initial ranking for each.  For this
task, each performance indicator was weighted regardless of its
categorical grouping, i.e., each indicator was considered separately. 
The purpose of this task was to develop an initial ranking of the
indicators from which the Committee could discuss the importance of
each one, as well as its relationship to other indicators. Once again, it
was stressed to the members that this task assumes that all indicators
can be measured and that these rankings will be modified by future
reflection and discussion.  The Committee members ranked, on a five
point scale (most important to not important) each indicator individually
and a mean value for each was calculated.  The relative importance of
each indicator is shown in Table 2:

Table 2


Order

                         Performance Indicator
                            Category/Rating
                                    
                                    
1	9b.  	Amount of public and private sector grants
		Research Funding V.I.


2	6a.  	SAT and ACT scores of student body
		Entrance Requirements V.I.


3	5a.  	Percentage of administrative costs as compared 
       		to academic costs
		Administrative Efficiency H.I.


4	2d.  	Compensation of faculty
		Quality of Faculty H.I.


5	5b.  	Use of best management practices
		Administrative Efficiency H.I.


6	2a.  	Academic and other credentials of professors      
       		and instructors
		Quality of Faculty H.I.


7	7a.  	Graduation rate
		Graduates' Achievements I.


8	1b.  	Curricula offered to achieve mission
		Mission I.


9	3c.  	Ratio of full-time faculty as compared to other
       		full-time employees
		Instructional Quality H.I.


10 	6d.  	Priority on enrolling in-state students
		Entrance Requirements V.I.


11	7b.  	Employment rate for graduates
		Graduates' Achievements I.


12	2b.  	Performance review system for faculty to             
       		include student and peer evaluations
		Quality of Faculty H.I.


13	3d.  	Accreditation of degree-granting programs
		Instructional Quality H.I.


14	1e.  	Attainment of goals of the strategic plan
		Mission I.


15	7d.  	Scores of graduates on post-graduate professional,  
        	graduate or employment-related examinations and   
        	certification tests
		Graduates' Achievements I.


16	2c.  	Post-tenure review for tenured faculty
		Quality of Faculty H.I.


17	3a.  	Class sizes and student/teacher ratios
		Instructional Quality H.I.


18	5c.  	Elimination of unjustified duplication of and
       		waste in administrative and academic programs
		Administrative Efficiency H.I.


19	6b.  	High school standing, grade point averages, and       
       		activities of student body
		Entrance Requirements V.I.


20	8c.  	Accessibility to the institution of all citizens of the
             	State
		User-friendliness L.I.


21	3b.  	Number of credit hours taught by faculty
		Instructional Quality H.I.


22	3e.  	Institutional emphasis on quality teacher              
       		education and reform
		Instructional Quality H.I.


23	1a.  	Expenditure of funds to achieve institutional mission
		Mission I.


24	5d.  	Amount of general overhead costs
		Administrative Efficiency H.I.


25	2e.  	Availability of faculty to students outside the       
       		classroom
		Quality of Faculty H.I.


26	7c.  	Employer feedback on graduates who were               
       		employed or not employed
		Graduates' Achievements I.


27	9a.  	Financial support for reform in teacher education
		Research Funding V.I.


28	1c.  	Approval of a mission statement
		Mission I.


29	1d.  	Adoption of a strategic plan to support the mission    
       		statement
		Mission I.


30	4b.  	Cooperation and collaboration with private industry
		Cooperation/Collaboration L.I.


31	8a.  	Transferability of credits to and from the institution
		User-friendliness L.I.


32	8b.  	Continuing education programs for graduates and     
       		others
		User-friendliness L.I.


33	4a.  	Sharing and use of technology, programs,
       		equipment, supplies, and subject matter experts
       		within the institution, with other institutions, and the
       		business community
		Cooperation/Collaboration L.I.


34	7e.  	Number of graduates who continue with their             
       		education
		Graduates' Achievements I.


35	7f.   	Credit hours earned of graduates
		Graduates' Achievements I.


36	6c.  	Post-secondary non-academic achievement of          
       		student body
		Entrance Requirements V.I.


37	2f.  	Community or public service activities of faculty  
      		for which no extra compensation is paid
		Quality of Faculty H.I.


H.I. - high in importance; V.I. - very important; I. - important;
L.I. - less important
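The tallying behind Table 2 can be sketched in a few lines. The member scores below are invented for illustration, not the Committee's actual data, but the procedure follows the description above: a five-point rating from each member, averaged per indicator, then sorted.

```python
# Hypothetical illustration of the Table 2 tally: each member rates each
# indicator on a five-point scale (5 = most important, 1 = not important);
# the mean rating per indicator determines the ranking.
ratings = {
    "9b. Amount of public and private sector grants": [5, 5, 4, 5, 4, 5],
    "7a. Graduation rate":                            [4, 4, 3, 4, 3, 4],
    "2f. Uncompensated community/public service":     [2, 1, 1, 2, 1, 1],
}

# Mean rating per indicator.
means = {ind: sum(scores) / len(scores) for ind, scores in ratings.items()}

# Indicators listed in descending order of mean rating, as in Table 2.
ranked = sorted(means, key=means.get, reverse=True)
```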

A comparison of Tables 1 and 2 clearly demonstrates that particular
performance indicators within the categories are more important than other
indicators within that same category.  For example, in Category 9 (Research
Funding), indicator 9b. (Amount of public and private sector grants) ranks
number one as an indicator while 9a. (Financial support for reform
in teacher education) ranks number 27.  

     This task allowed the committee members to focus their thinking on the
process of developing percentages for the individual performance indicators
which reflect the importance of each indicator in relation to the category
in which it is contained.  Mr. Wilson stated that the members must take
time to examine the initial weights, provide input into the weighting
procedure, and determine if the critical data are available. 

     The Committee was informed by Mr. Wilson that the average weight for
each indicator is 2.7 percent, calculated by dividing 100 percent by the
37 indicators.  The Committee indicated that using a five-point scale does
not provide enough range in the weighting of the performance indicators.
Using as input the discussion resulting from the generation of Table 2, the 
Committee then proceeded to provide relative ranking for the indicators on
a scale of 1-10. The results of this ranking are shown in Table 3 and a
percent rating for each indicator is shown in the left column.  The total
percentage rating for each category is shown on the right.  Items are
grouped by category.
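The arithmetic implied here can be sketched as follows: each indicator's raw score on the 1-10 scale is divided by the grand total of raw scores so that the percentages sum to 100, and summing an indicator's percentages by category yields the right-hand column of Table 3. The raw scores below are hypothetical stand-ins, not the Committee's data.

```python
# Hypothetical sketch of the Table 3 arithmetic: raw 1-10 scores are
# normalized so all indicator percentages total 100, then summed by
# category.  Keys are indicator codes; the leading digit is the category.
raw = {"1a": 5.0, "1b": 7.0, "9a": 6.0, "9b": 10.0}

total = sum(raw.values())                      # grand total of raw scores
percent = {ind: 100 * score / total for ind, score in raw.items()}

# Category total = sum of its indicators' percentages (Table 3, right column).
category = {}
for ind, pct in percent.items():
    category[ind[0]] = category.get(ind[0], 0.0) + pct

assert abs(sum(percent.values()) - 100.0) < 1e-9
```

With 37 indicators sharing 100 percent, the average share is 100/37, or about 2.7 percent, the figure Mr. Wilson cited.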


                              Table 3

Percent  Performance Indicator,                        Category Percentage


2.7%	1a.  Expenditure of funds to achieve mission		12.8 %

3.5%	1b.  Curricula offered to achieve mission

1.6%	1c.  Approval of a mission statement

2.1%	1d.  Adoption of a strategic plan to support the 
	     mission statement

2.9%	1e.  Attainment of goals of the strategic plan

4.2%	2a.  Academic and other credentials of professors 	18.0 %
	     and instructors  

3.1%	2b.  Performance review system for faculty to include 
	     student and peer evaluations

3.5%	2c.  Post-tenure review for tenured faculty

4.4%	2d.  Compensation of faculty

2.3%	2e.  Availability of faculty to students outside 
	     the classroom

0.5%	2f.   Community or public service activities of faculty 
	      for which no extra compensation is paid

2.1%	3a.  Class sizes and student/teacher ratios		13.8  %

2.1%	3b.  Number of credit hours taught by faculty

3.9%	3c.  Ratio of full-time faculty as compared to other 
	     full-time employees

3.3%	3d.  Accreditation of degree-granting programs

2.4%	3e.  Institutional emphasis on quality teacher 
	     education and reform

1.2%	4a.  Sharing and use of technology, programs, equipment,	2.8 %
	     supplies, and subject matter experts within the
	     institution, with other institutions, and the
	     business community

1.6%	4b.  Cooperation and collaboration with private industry


4.5%	5a.  Percentage of administrative costs as compared to		13.6 %
	     academic costs

3.8%	5b.  Use of best management practices

2.9%	5c.  Elimination of unjustified duplication of and waste 
	     in administrative and academic programs

2.4%	5d.  Amount of general overhead costs

4.8%	6a.  SAT and ACT scores of student body			11.5 %

2.7%	6b.  High school standing, grade point averages, and 
	     activities of student body

0.6%	6c.  Post-secondary non-academic achievement of student body

3.4%	6d.  Priority on enrolling in-state students

2.8%	7b.  Employment rate for graduates

2.6%	7c.  Employer feedback on graduates who were employed 
	     or not employed

3.6%	7d.  Scores of graduates on post-graduate professional, 
	     graduate or employment-related examinations and 
	     certification tests

0.8%	7e.  Number of graduates who continue their education

0.7%	7f.   Credit hours earned of graduates

1.4%	8a.  Transferability of credits to and from the			5.6 %
	     institution

1.6%	8b.  Continuing education programs for graduates
	     and others

2.6%	8c.  Accessibility to the institution of all citizens 
	     of the State

2.9%	9a.  Financial support for reform in teacher education	7.7 %

4.8%	9b.  Amount of public and private sector grants



The purpose of compiling such rankings is to distill the Committee's
thinking on the performance indicators.  As can be seen from Table 3, the
relative weight for a category is influenced by the number of indicators
within each category.  Table 4, below, is a comparative analysis of
weighting the categories by two different methods.  Both methods provide
relative weights for the categories, but as can be seen from the table
below, different weights are obtained.  One method relied on totaling the
weights for each indicator on a ten-point scale, computing the percentage
for each indicator, and totaling the percentages for each category.  The
other method repeated this process using the five-point scale.
                      
Table 4


Category                       Relative Weight   Rank    Relative Weight   Rank
                               (10-pt Scale)*    Order   (5-pt Scale)*     Order

1  Mission                          10.8           5          13.2           5
2  Quality of Faculty                9.6           7          14.0           3
3  Instructional Quality            12.4           3          15.0           2
4  Cooperation/Collaboration         7.6           9           3.7           9
5  Administrative Efficiency        13.8           2          13.4           4
6  Entrance Requirements            11.9           4          11.5           6
7  Graduates' Achievements          10.4           6          15.2           1
8  User-Friendliness                 9.6           7           7.0           7
9  Research Funding                 14.0           1           6.8           8

* Weighting of each performance indicator on a 10-point or 5-point scale
and computing the total weight for each category.

                                            
Table 4B


Order   Rank Order of Categories        Order   Rank Order of Categories
        on a 10-Point Scale                     on a 5-Point Scale

1       Research Funding                1       Graduates' Achievements
2       Administrative Efficiency       2       Instructional Quality
3       Instructional Quality           3       Quality of Faculty
4       Entrance Requirements           4       Administrative Efficiency
5       Mission                         5       Mission
6       Graduates' Achievements         6       Entrance Requirements
7       User-Friendliness               7       User-Friendliness
7       Quality of Faculty              8       Research Funding
9       Cooperation/Collaboration       9       Cooperation/Collaboration


     The next task for the Committee to perform is to refine the weightings
in terms of the importance of each performance indicator within each
category.  Each Committee member was asked to prepare a weighting of each of
the 37 indicators (by percent) in terms of importance.  These percentages
must total 100 percent.  These percentages are to be faxed to Dr. Morrison
(803-737-2297) no later than Wednesday, October 30.  The Commission staff
will then enter these data into a spreadsheet from which averages for each
indicator will be developed.  At the next meeting the Committee will examine
these weightings and offer comment and criticism.
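The averaging the staff will perform can be sketched as below. The indicator codes and values are hypothetical stand-ins for the faxed weightings, not actual submissions; the procedure is simply a per-indicator mean across members.

```python
# Hypothetical sketch of the staff spreadsheet step: each member faxes a
# set of percentages totaling 100, and staff average the submissions per
# indicator.  The two (partial) submissions below are invented examples.
submissions = [
    {"9b": 6.0, "2d": 5.0, "2f": 0.5},   # member 1's weightings (partial)
    {"9b": 5.0, "2d": 4.5, "2f": 1.0},   # member 2's weightings (partial)
]

# Average weighting for each indicator across all member submissions.
averages = {
    ind: sum(s[ind] for s in submissions) / len(submissions)
    for ind in submissions[0]
}
# e.g. averages["9b"] == 5.5
```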

     The Commission has indicated that it is appropriate to compare
institutions to those located out-of-state for indicators on which national
data might be available.  The Commission realizes that the three research
institutions have many inherent differences, such as MUSC being a medical
training facility and Clemson being a land-grant institution.  However,
in-state comparisons can also be used for benchmarking.

     The Committee developed a provisional list of nine aspirational peer
institutions for benchmarking comparisons.  These aspirational peers are
institutions whose standards each research institution should strive to
meet.  A brief discussion of aspirational peers and how they might be used
followed.  All three institutions were invited to review the list below and
provide comments at the next sector meeting.

Aspirational Peer Institutions

Clemson University
Auburn
University of California-Davis*
University of Florida*
Georgia Tech
Iowa State University*
Michigan State University*
North Carolina State University
Purdue University*
Virginia Polytechnic Institute

MUSC (to be refined)

University of Alabama-Birmingham
University of Colorado Health Science Center
University of Illinois at Chicago
University of Iowa
University of Kansas Medical Center
University of Maryland-Baltimore
University of Michigan
University of Nebraska Medical Center
University of North Carolina-Chapel Hill
University of Pittsburgh Medical Center
University of Texas Health Science Center at Houston
Virginia Commonwealth University

USC-Columbia

University of California-Davis*
University of Florida*
University of Georgia
University of North Carolina-Chapel Hill*
University of Missouri*
University of Pittsburgh*
University of Tennessee-Knoxville
University of Texas-Austin*
University of Virginia*

*Members of the Association of American Universities (AAU);  USC has
indicated membership is one of its goals.