Higher Education and Research Bill Debate
Lord Lucas (Conservative - Excepted Hereditary), debate with the Department for Education
(7 years, 10 months ago)
Lords Chamber

My Lords, to pick up on the recently finished speech of the noble Baroness, Lady Wolf, I thoroughly agree with the three main points she made. First, producing a mixed indicator, as the Government propose, would not be useful to students or others looking at the quality of a university or a course. It would be like composing a meal out of mincemeat, cornflakes and cleaning fluid. Each of those things is useful in its own right, but mix them together and they have no function. Keep them separate, as the noble Baroness advocated, and you get some very useful data on which students can judge in their own terms the quality of a university.
Secondly, let these things be criterion-referenced. We have a real problem at the moment in GCSE—we are saying that every child should get English and Maths, but we are making that impossible, because we make these exams harder as students do better. About 30% are required to fail in order to meet the requirements of Ofqual. We have to be careful about this when we are looking at a bronze, silver or gold indicator. If we do not make these indicators criterion-referenced, we are saying that, whatever happens—however well our universities do—we will always call 20% of them bronze. In other words, we will put them into an international students’ “avoid at all costs” category. That seems a really harmful thing to do. If these criteria mean anything—if there is a meaning to any of the elements going into the TEF—we should be able to say, “We want you to hit 60%.” Why not? Why do the criteria have to be relative? They do not mean anything as relative criteria. They must have absolute meanings and they must be absolute targets.
Thirdly, this really adds up. The noble Lord, Lord Liddle, made it clear that gold, silver and bronze indicators—this big step change between the three grades—are not suited to a collection of imprecise measures. You do not know whether an institution that you have placed towards the bottom of silver is actually bronze or, worse, whether something in bronze is actually in the middle of silver. It is not that exact. You have to do what the Government do elsewhere in education statistics—for example, in value added on schools—which is, yes, to publish a value, but publish a margin of error too. That way, people get to learn that you might be saying: “This is actually 957 on your scale of 1,000, but the error margin is somewhere between 900 and 1,010.” You get used to the imprecision, to understand that this is not precise, so you can put a proper value on the information you are being given.
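[Editorial note: the statistical point above can be illustrated with a short sketch. The 0–1,000 scale echoes the speaker's own example, but the grade cut-offs and the ±55 margin are hypothetical figures invented for illustration, not anything proposed in the Bill or the debate.]

```python
# Illustrative sketch only: a point score with a margin of error cannot
# support a hard gold/silver/bronze cut-off, because the error interval
# can straddle a grade boundary. All cut-offs and figures are hypothetical.

def grade(score: float) -> str:
    """Assign a medal from a point score on a 0-1000 scale (hypothetical cut-offs)."""
    if score >= 950:
        return "gold"
    if score >= 850:
        return "silver"
    return "bronze"

def possible_grades(score: float, margin: float) -> set[str]:
    """Every medal consistent with the score's error interval."""
    low, high = score - margin, score + margin
    return {grade(s) for s in (low, score, high)}

# A point estimate of 957 looks "gold"...
print(grade(957))                         # gold
# ...but with a +/-55 margin the interval [902, 1012] also covers silver,
# so the single-medal label overstates what the data can actually tell us.
print(sorted(possible_grades(957, 55)))   # ['gold', 'silver']
```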
My Lords, I am speaking to the proposal, in the name of my noble friend Lord Stevenson, that Clause 25 should not stand part of the Bill.
That clause refers to the Office for Students taking over HEFCE’s current administrative responsibilities to deliver the TEF on behalf of the Secretary of State. I say in passing how disappointed I am that so many in your Lordships’ House, who I thought would come to hear this debate on TEF metrics, have now departed. Perhaps that was not the reason they were here after all. Those of us who are ploughing through the Bill until all hours of the night realise that this is an important topic. The fact that we have had so many speakers on it is a clear reflection of that.
As the Minister will be aware, there is widespread concern across the sector at the use of proxy metrics, including statistics on graduate earnings, in an exercise that was supposed to be about teaching quality. On the face of it, there is some logic to the metrics. It is difficult to imagine an excellent course whose teaching, support and assessment the students think are rubbish, which a large proportion of the students do not complete, or whose graduates hardly ever manage to find employment or a place on a postgraduate course.
Where metrics are used, they have to be much more securely evidence-based than those suggested. Last week in Committee, our Amendments 196 and 198 would have obliged the Office for Students to assess the evidence that any proposed metric for assessing teaching quality actually correlates with teaching quality, and ensured that, prior to making that assessment, the OfS consulted those who know first-hand what is needed to measure teaching quality: academic staff and students. The Minister did not comment on that point, so it remains one on which I should like to hear his opinion. The importance of ensuring the statistics used are reliable and evidence-based cannot be overstated. They must earn and retain the confidence of the higher education sector—and that involves academics, students and administrators.
In her Amendment 201, the noble Baroness, Lady Wolf, seeks to ensure the quality of the statistics used by the OfS, and this should be a basic requirement. I support my noble friend Lord Lipsey in questioning the validity and value of the National Student Survey. The survey merely asks students about their perceptions of teaching at their institution. By definition, these perceptions are subjective and cannot involve comparing institutions. I heard what the noble Lord, Lord Willetts, said when he suggested that similar institutions could be compared in terms of their ethnic make-up and students’ economic background. That kind of benchmarking sounds improbable at best because, even if suitable comparators could be found, the question is, how would the outcome be weighted?
It sounds as though gold, silver and bronze categories would be created before the metrics had even been measured. As I said, that sounds improbable to me, and I agree with the noble Baroness, Lady Wolf, that benchmarking is surely not the answer. Linking institutions’ reputations to student satisfaction is likely to encourage academics to mark more generously and, perhaps, even avoid designing difficult, more challenging courses.
With academics increasingly held accountable for students’ learning outcomes, students’ sense of responsibility for their own learning—something I thought was a core aspect of higher education—will surely diminish. We are now entering an era where students dissatisfied with their grades can sue their universities. Improbable as that sounds, only last week the High Court ruled that Oxford University had a case to answer, in response to a former student who alleged that what he termed “boring” and “appallingly bad” teaching cost him a first-class degree and the opportunity of higher earnings.
This may be the shape of things to come. Last year, nearly 2,000 complaints were made by students to the higher education Ombudsman, often concerning contested degree results. Nearly a quarter were upheld, which led to universities being ordered to pay almost £500,000 in compensation. Does anyone seriously believe that the introduction of the TEF metrics will lead to a reduction in such complaints?
Metrics used to form university rankings are likely to reveal more about the history and prestige of those institutions than the quality of teaching that students experience there. The Office for National Statistics report, on the basis of which the TEF is being taken forward, made it clear that the ONS was told which metrics to evaluate, leading to the conclusion that these metrics were selected simply because the data were available to produce them. It is widely acknowledged that students’ experience in their first year is key in shaping what they gain from their time at university, yet the focus of the proposed metrics, of course, is mainly on students’ experiences in their final year and after graduation.
The ONS report was clear that the differences between institutions’ scores on the metrics tend to be narrow and not significant. So the majority of the judgment about who is designated gold, silver or bronze will actually be based on the additional evidence provided by institutions. In other words, an exercise that is supposedly metrics-driven will in fact be decided largely by the TEF panels, which is, by any other description, peer review.
Although the Minister spoke last week about how the TEF would develop to measure performance at departmental level, the ONS report suggested that the data underpinning the metrics would not be robust enough to support a future subject-level TEF. Perhaps the Minister can clarify why he believes that this will not be the case—the quality of courses in a single university tends to be as variable as the quality of courses between institutions. As I said in Committee last week, this would also mean that students’ fees were not directly related to the quality of the course they were studying. A student at a university rated gold or silver would be asked to pay an enhanced tuition fee, even if their course at that university was actually below standard—a fact that was disguised in the institution’s overall rating.
Learning gain—or value added—has been suggested as an alternative, perhaps better, measure of teaching quality and is being explored in other countries. At a basic level, this measure looks at the relationship between the qualifications and skills level a student has when starting their degree programme, compared to when they finish—in other words, a proper, reliable means of assessing what someone has gained from their course of study.
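[Editorial note: the learning-gain measure described above amounts to a simple before-and-after comparison, which can be sketched as follows. The cohort, scores and scale below are entirely invented for illustration; no such data appear in the debate.]

```python
# Illustrative sketch only: "learning gain" as described in the speech -
# comparing a student's measured skill level at entry with the level at
# graduation. Students and scores below are hypothetical.

def learning_gain(entry_score: float, exit_score: float) -> float:
    """Value added by the course: exit level minus entry level."""
    return exit_score - entry_score

cohort = [
    {"entry": 55.0, "exit": 72.0},   # modest entrant, large gain
    {"entry": 80.0, "exit": 84.0},   # strong entrant, small gain
]

gains = [learning_gain(s["entry"], s["exit"]) for s in cohort]
print(gains)                          # [17.0, 4.0]
print(sum(gains) / len(gains))        # average gain for the cohort: 10.5
```

Unlike a raw outcome measure, this credits the course for the distance travelled: the weaker entrant's larger gain counts in the institution's favour rather than against it.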
The BIS Select Committee report on the TEF metrics published last year recommended that priority should be given to the establishment of potentially viable metrics relating to learning gain. I hope the Minister will have something positive to say on that today, or, failing that, on Report. We do not believe that the metrics as currently proposed are fit for purpose; more importantly, nor do many of those within the sector who will be directly involved with the TEF. That should be a matter of some concern for the Minister, for his colleague the Minister for Universities and Science, and indeed for the Government as a whole.
My Lords, in moving Amendment 207 I shall speak to the other amendments in the group. The amendment covers a point I have made before—that it is essential that the whole sector should be represented in these organisations, not just the bits that the old universities like.
Amendment 392 would extend the Secretary of State’s access to information to cover anything they may be required to know, or have an interest in knowing, under any enactment, rather than just under the Bill.
Amendment 395 would appoint HESA—I suspect it is HESA being talked about at this point—to take an interest in people who intend to become students, not just people who become students, because a lot of the data they produce will be used to inform people as to whether to pursue a course, which is not really of much interest to those who have already taken that decision. It is important that HESA should focus on the students-to-be as much as on people who are already students.
Amendment 400 is an alternative to Amendment 207. I do not blame the current HESA regarding the provisions of Amendment 401. It is a trap that UCAS has fallen into: putting money and the interests of its constituent institutions ahead of the interests of students. This is a difficult thing with all such bodies, such as Ordnance Survey and others: the money tends to become the focus of what they are doing. It needs government to pull them back to focus on the interests of the country as a whole and, in this case, of students in particular. As long as the Office for Students has the power to keep a body on the straight and narrow in this regard, I shall be quite satisfied that the Bill does not need this additional wording.
The anti-competitive conditions in Amendment 403 again look at the way UCAS has become a constraint on the way individual universities reflect students. Anti-competitive behaviour should always be subject to the very closest scrutiny by government to justify it. I would like to know that the OfS can keep its eye on that.
Amendment 407 goes with Amendment 395. I beg to move.
My Lords, I thank my noble friend for drawing attention to a range of concerns relating to how the designated bodies will operate. I offer my assurance that we share the intention that legislation must support these bodies to be responsive to the needs of current and prospective students, and representative of the whole sector. I am happy to discuss these amendments further when we meet—although, given my state of health, I quite understand if he wishes to postpone that pleasure.
The role of the designated data body is to provide reliable and robust data on the sector for students, prospective students, the OfS and the sector itself. It will gather and make available source data, but it will not be the sole source of information. The designated body’s functions most closely resemble those currently carried out by the Higher Education Statistics Agency—a sector-owned body that collects and publishes official data on higher education. I should clarify that the role currently under discussion is not related to the current role of UCAS. The designated body’s functions do not extend to running an admissions service. I reassure my noble friend that it is absolutely the Government’s intention that the interests of prospective students will be taken into account in the new system. The Bill already allows for this.
Amendments 398, 401 and 403 would create additional conditions for the designated data body to put the interests of students above those of higher education institutions and commercial interests, and to ensure that data collection is not anti-competitive. The Government support the broad thrust and intent of the amendments, but believe that the current drafting is sufficient. The new data body will have a duty to consider what would be helpful to students and prospective students. However, it would not be in the spirit of co-regulation to dictate the order in which the body must rank those interests.
I assure my noble friend that there is no intention to give the designated body a monopoly over data publication. We have a wide range of organisations involved in providing information for students, including specialist careers advice services aimed at mature students and career changers. We would not want any reduction in this choice for prospective students. While the Bill gives the designated body the right to receive information from providers, it does not give the body any right to prevent providers sharing those data with other organisations.
On Amendments 207 and 400, the Bill already requires that the persons who determine the strategic priorities of the designated data and quality bodies represent a broad range of registered higher education providers. The quality and data bodies are designed to be independent of government, so it would not be right to prescribe the make-up of a board in the way these amendments do. Rather, the bodies should have the ability to take a view on the mix of skills they require for the challenges they face.
The Government have confidence that they have the right balance here. In these circumstances, I therefore ask my noble friend to withdraw Amendment 207.
My Lords, I am very grateful for the answer my noble friend has given me and for her offer of further conversations if there is anything, on reflection, I think she has not covered completely. I beg leave to withdraw my amendment.