Analytic Quality Glossary

 

A B C D E F G H I J K L M N O P Q R S T U V W X Y Z Home

 

Citation reference: Harvey, L., 2004-24, Analytic Quality Glossary, Quality Research International, http://www.qualityresearchinternational.com/glossary/

This is a dynamic glossary and the author would welcome any e-mail suggestions for additions or amendments. Page updated 8 January 2024, © Lee Harvey 2004–2024.

 


_________________________________________________________________

Value added


core definition

Value added is the enhancement that students achieve (to knowledge, skills, abilities and other attributes) as a result of their higher education experience.


explanatory context

Value added is about what value, to the student, has been accumulated as a result of a period of time in higher education. Institutions may be evaluated or assessed on the basis of the cumulative value that they add to their students. Some proponents argue that the status of an institution should be judged by its value-added contribution. However, most league tables or rankings do not do this as it is difficult to calculate value added.


analytical review

Bennett (2001) defined value added as follows:

By value added we mean what is improved about students’ capabilities or knowledge as a consequence of their education at a particular college or university. Measuring value requires having assessments of students’ development or attainments as they begin college, and assessments of those same students after they have had the full benefit of their education at the college. Value added is the difference between their attainments when they have completed their education and what they had already attained by the time they began. Value added is the difference a college makes in their education.

 

Cunha and Miller (2009) wrote:

We define value-added as the increase in students’ skills and knowledge over their tenure in school. As such, it is student-specific and inherently difficult to measure. For nursing students, the value-added by a college encompasses what is learned in the core liberal arts curriculum as well as practical knowledge, like how to inject a vaccine and accurately measure a patient’s blood pressure, about the nursing profession. For a math major, value-added is quite different; while it still encompasses the same core liberal arts curriculum, we do not care if a math major knows how to clean a bedpan. However, we do hope that he completes college with a thorough understanding of proof-based logic and has a grasp of at least one branch of modern mathematics.

 

Harvey and Green (1993) defined value added as follows:

Value added is a ‘measure’ of quality in terms of the extent to which the educational experience enhances the knowledge, abilities and skills of students (HM Government, 1991, para 80; HMI, 1990, p. 7). A high quality institution would be one that greatly enhances its students (Astin, 1990). Oxbridge may produce some ‘brilliant’ first class graduates but having brilliant school leavers in the first place they may not have added very much. An inner-city polytechnic may produce a good proportion of 2:1s from an intake of non-traditional entrants, unqualified returners, and so on, and therefore may be adding a tremendous amount. Exactly how much is added, however, depends on the methodology (Barnett, 1988; CNAA, 1990) and what is defined as being of value in the first place.


associated issues

Harvey (2002, p. 14) noted:

There have been attempts to assess the value-added to students of their education (CNAA, 1990). Value-added refers to the enhancement of the knowledge, skills and abilities of students and the empowering of them as critical, reflective, life-long learners.

Value-added is experiencing a revival of interest, as the result of considered discussions of published league tables at all levels of education, not least the burgeoning interest in measuring ‘employability’. However, it is difficult to assess value added and most attempts have relied on measurement of entry and exit grades or abilities using somewhat crude indicators. Quantitative analysis of value-added is difficult for a variety of reasons including the establishment of base benchmarks, measurement problems and the attribution of added value to the programme rather than some other factor. Arguably, though, the assessment of value-added is at the core of any improvement-oriented, value-for-money and transformative approach to quality assessment at the programme level.

 

Measuring value added

Although value added is widely regarded as a laudable measure of the contribution of higher education, there have been only spasmodic attempts to measure it and use it as a criterion for evaluating institutions’ provision and performance. Bennett (2001) states:

Easy as it is to state, assessment of value added is difficult to carry through. Let me briefly mention just a few of the more important difficulties.

·      Value has many dimensions. No college or university is trying to develop only a single capability in students; all are trying to develop an array of capabilities. Measurements of value added must therefore attend to a number of different dimensions of value. We probably should develop several different measures of value added and invite institutions to select the measures that reflect their intentions.

·      Institutions are different. Colleges and universities do not all seek to add the same kind of value to students’ development.

·      Even liberal arts colleges do not all have the same mission. We need to assess value added against a college’s chosen aspirations – its mission. Any effort to rank colleges or universities along a single dimension is fundamentally misguided.

·      Effects unfold. Some consequences of a college education may take years to express themselves. We may need to assess some aspects of value added with alumni rather than with graduating seniors.

·      Complexity and cost. Measurement of value added is likely to be complex and expensive. Yet it can be more expensive for society to have no serious assessments of whether we are succeeding in having students learn.

A value-added approach is the best way to assess student learning, but higher education has not yet committed itself to developing reliable measures of the most important dimensions of a college education. There are, on the other hand, a few other possible strategies for assessing student learning that are worth considering.

 

Harvey and Green (1993) add:

The measurement of value added, for example, in terms of input and output qualifications provides a quantifiable indicator of ‘added value’ but conceals the nature of the qualitative transformation.

Approaches that attempt to identify a number of dimensions of value added provide clearer ideas about what has been transformed but these still rely heavily on output assessment (DTI/CIHE, 1990; Otter, 1992).

Arguing against a fitness-for-purpose approach, Müller and Funnell (1992, p. 2) suggest that quality should be explored in terms of a wide range of factors leading to a notion of ‘value addedness’. The role of educational providers from this perspective is to ensure that:

learners fully participate in, and contribute to, the learning process in such a way that they become responsible for creating, delivering and evaluating the product (Müller and Funnell, 1992, p. 175)

In short, learners should be both at the centre of the process by which learning is evaluated and at the centre of the learning process. Feedback from learners is a crucial aspect of evaluation (Müller and Funnell, incidentally, prefer qualitative rather than quantitative methods as they better explore learners’ perceptions of quality). Placing the learner at the centre shifts the emphasis from value-added measures of enhancement to empowerment.

 

It is arguable that attempts to determine and systematically apply value-added approaches have not been developed or encouraged because, in many countries, this may raise doubts about the reputational hierarchy of institutions, as most high-reputation institutions take only the top-performing school leavers and may add relatively little.


related areas

See also

benchmark

enhancement

quality


Sources

Astin, A. W., 1990, ‘Assessment as a tool for institutional renewal and reform’, in American Association for Higher Education Assessment Forum, 1990, Assessment 1990: Accreditation and Renewal, AAHE, Washington, D.C., pp. 19-33.

Barnett, R., 1988, ‘Entry and exit performance indicators for higher education: some policy and research issues', Assessment and Evaluation in Higher Education, 13, 1, Spring, pp. 16–30.

Bennett, D.C., 2001, ‘Assessing quality in higher education – perspectives’, Liberal Education, Spring.

Council for National Academic Awards (CNAA), 1990, The Measurement of Value Added in Higher Education. London, CNAA.

Cunha, J. and Miller, D., 2009, Value-Added in Higher Education, 30 April 2009, available at http://www.stanford.edu/~millerdw/value_added.pdf, accessed 29 January 2012, not available 28 August 2012.

Department of Trade & Industry and Council for Industry and Higher Education (DTI/CIHE), 1990, Getting Good Graduates. London, HMSO.

H.M. Government, 1991, Further and Higher Education Bill. HL Bill 4 50/5. London, HMSO.

Harvey, L. and Green, D., 1993, ‘Defining quality’, Assessment and Evaluation in Higher Education, 18(1). pp. 9–34.

Harvey, L., 2002, ‘Quality assurance in higher education: some international trends’, Higher Education Conference, Oslo, 22–23 January 2002.

HMI, 1990, Performance Indicators in Higher Education. A Report by HMI, Reference 14/91/NS. Jan-April, 1990. London, DES.

Müller, D. and Funnell, P., 1992, ‘Exploring learners’ perceptions of quality’, paper presented at the AETT conference on ‘Quality in Education’, University of York, 6–8th April, 1992.

Otter, S., 1992, ‘Learning outcomes: a quality strategy for higher education’, paper to the ‘Quality by Degrees’ Conference at Aston University, 8th June, 1992.




