Part III: The 2019 Results: An Interim Analysis

First Published: Monday 4th February 2019 (updated Tuesday 5th February 2019)
This is the third post in our series about the EPSRC CDT scheme.

Towards the end of last year, we published our investigation into EPSRC's flagship CDT scheme. Back then, we found that the scheme was underperforming, rather than being the "gold standard" asserted by EPSRC. Today, EPSRC have published the results of the competition, revealing how approaching £1bn will be spent on doctoral training within this model. What follows is our interim (and thus preliminary) analysis of the results. We expect to provide a more detailed examination over the coming days.

Less is More?

One of the most striking results of the CDT competition is that only 75 CDTs were funded. This is down nearly 35% from the 115 CDTs presently supported by EPSRC. In the call for full proposals, EPSRC said that they expected to fund at least 90 and up to 120 "subject to quality". The call specified that "up to £492m" would be available to do this, although significantly less was ultimately awarded, suggesting that EPSRC had difficulty in finding a sufficient number of proposed centres that met the requisite funding standard. Taking the figure of £446m in the announcement and adjusting for inflation of 12.57% (using the Bank of England's Inflation Calculator), there is around a 20% real-terms cut in funding from EPSRC and the Government for this call.
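
The arithmetic can be checked directly. A minimal sketch (in Python), assuming the £446m award is deflated by the 12.57% inflation figure and compared against the £492m advertised budget:

```python
# Real-terms check on the 2019 CDT award, using the figures quoted above:
# £446m awarded, "up to £492m" advertised, 12.57% cumulative inflation
# (Bank of England Inflation Calculator).
awarded_nominal = 446.0   # £m, from the 2019 announcement
advertised = 492.0        # £m, from the call for full proposals
inflation = 0.1257

awarded_real = awarded_nominal / (1 + inflation)  # deflate to earlier prices
cut = 1 - awarded_real / advertised

print(f"Real-terms value of award: £{awarded_real:.0f}m")  # about £396m
print(f"Real-terms cut: {cut:.1%}")                        # about 19.5%
```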

EPSRC also seem to have lost a significant amount of "leverage", at least according to the written text. There is £386m proclaimed this time around in the announcement, down from the "more than £450 million" claimed by EPSRC in respect of the previous CDT competition. Presuming the leverage was calculated in the same way as before, this would be well over a 20% reduction, and possibly approaching 30%. A subsequent press video by EPSRC claims a different figure of £508m (including contributions offered by the host institutions themselves), which, if comparable to the previous approach, would be roughly equivalent after adjustment for inflation.
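
On the same basis, the leverage reduction can be sketched as follows (treating the "more than £450 million" figure as a lower bound and applying the same 12.57% inflation adjustment):

```python
# Leverage comparison, using the figures quoted above.
prev_nominal = 450.0   # £m, previous competition (a lower bound)
now = 386.0            # £m, claimed in the 2019 announcement
video_figure = 508.0   # £m, EPSRC press video (includes host contributions)
inflation = 0.1257

prev_real = prev_nominal * (1 + inflation)  # previous leverage in 2019 prices
reduction = 1 - now / prev_real

print(f"Previous leverage in 2019 prices: £{prev_real:.0f}m")  # about £507m
print(f"Real-terms reduction: {reduction:.1%}")  # ~23.8%; more if over £450m
print(f"Video figure vs inflated previous: £{video_figure:.0f}m vs £{prev_real:.0f}m")
```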

Runners and Riders

As with any competition, there are winners and losers. Some universities have lost a significant number of centres. Perhaps the most notable examples are Imperial College London (which lost half of its 12 centres), the University of Oxford (which lost 4 out of 9) and the University of Cambridge (which is down nearly 40%). UCL also lost 30% of its centres. The most successful institution was the University of Bristol, which was awarded 9 CDTs. Possibly the most Pyrrhic victor was Newcastle University, which was awarded 3 CDTs but lost the Cloud Computing CDT attached to EPSRC's £15m National Innovation Centre for Data.

The CDT program also welcomes five newcomer institutions: Northumbria University, Royal Holloway, the University of Hull, the University of Lincoln and the University of Salford. Two of the newcomers are post-1992 institutions, which, together with the reductions at the country's most prestigious universities, suggests a potential move towards egalitarianism by EPSRC. There is also a significant number of departures, namely the University of Exeter, Loughborough University, Queen Mary, the University of Leicester, the University of St Andrews, the University of Southampton and the University of Surrey, none of which was selected to lead a centre. The most notable departure is the University of Southampton (incidentally the institution of EPSRC's recently departed CEO, Professor Phil Nelson CBE), which had an abysmal GPA in the mid-term review and lost all three of its centres.

A Brave New World?

Most of the 2019 intake of CDTs are newcomers. Even taking a generous approach to inclusion, we estimate that only 37 out of the previous 115 centres were renewed, which is less than one third of the 2014 cohort. This is perhaps the most staggering result in our initial analysis, especially given that CDTs are supposed to be "centres for excellence" and the "gold standard" for doctoral training.

Another notable result is that EPSRC have decided to substantially increase the funding (presuming the accounting is the same) given to those successful in the 2018 competition, rather than attempting to fund more of them. Indeed, each centre funded (an average of £5.82 million in today's money) will be around £1 million more expensive on average in real terms, or roughly 20% more expensive than in the 2013 competition. Whilst it is possible that EPSRC is also funding more studentships at each centre, it would seem that this additional expenditure arises from the low number of renewals, thereby incurring substantial start-up costs for new centres.

The press video subsequently released by EPSRC claims that £954 million in total will be invested, across "4600+" students (presumably the extra students over 4600 are those who will be attached to a centre through additional sources of funding at a later point, as provided for in the FAQ to the call). Accordingly, it appears to us that the cost per student would be over £200k each, around 3 times the cost of a traditional 3-year studentship, and even when adjusted for inflation, far more expensive than predecessor CDTs (the old program trained over 7000 students, so even accounting for inflation, this would be an increase of around 35% per CDT student). This would appear to be a rather extravagant amount to spend per student, putting it mildly. Indeed, it is difficult to see how, on present (or even enhanced) stipends, this amount of money could possibly be spent on training a PhD student. A more likely explanation would seem to be that the claimed industrial and stakeholder engagement is heavily exaggerated, which would mean that the real cost is substantially less than EPSRC claim. Given the lack of explanation from EPSRC, it suffices for present purposes to say that EPSRC's numbers do not stack up.
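
The per-student arithmetic can be reproduced from EPSRC's own headline figures; the £70k figure for a traditional 3-year studentship below is our own rough assumption for scale, not an EPSRC number:

```python
# Headline cost per student, from the press video figures quoted above.
total_investment = 954e6   # £, total claimed investment
students = 4600            # minimum student count ("4600+")

per_student = total_investment / students
print(f"Cost per student: £{per_student:,.0f}")  # just over £207,000

# Rough scale comparison only: assume ~£70k for a traditional 3-year
# studentship (stipend plus fees) -- our assumption, not an EPSRC figure.
traditional = 70e3
print(f"Multiple of a traditional studentship: {per_student / traditional:.1f}x")
```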

Centres of Excellence?

Some of the individual results are rather surprising. Many readers will recall our previous analysis, and some of the press coverage thereof, which identified that a CDT that scored either "Interview" or "Satisfactory" was performing poorly (one of the panellists at the outline stage even said that "Anything that scored [satisfactory or below] I would not [have] let pass unless there was honest disclosure of results in the proposal and some reasonable explanation why the score is so low"). Nevertheless, 13 centres from that group made the final cut (see Appendix A). This includes three centres that were awarded the lowest possible grade of Interview:

This is a very strange result. It is even stranger when one considers that in addition to the CDT involving Cambridge highlighted above, there were centres graded Satisfactory that were also identified in Part 1 as making potentially misleading comments in their outline submissions. These examples are:

This seems like a highly concerning set of results. In respect of the potentially misleading centres, it is notable that all of the universities involved have refused FOI requests for the full outline proposals (unlike many other institutions, who gladly provided them without redaction). Moreover, our calculations identify that of those centres scored "Good" (i.e. the best-performing centres) in the mid-term review, only 16 out of 44 (or 36%) were retained by EPSRC. It may be that, contrary to the promises made by EPSRC (whose FAQ stated "we don't have particular targets for each priority area and we will not be making funding decisions based on success rates by area"), certain centres were awarded not on quality but on fit to a priority area. Even so, it is difficult to see how EPSRC conducted a rational or appropriate process for selecting those centres it did fund. Indeed, it is difficult to see which, if any, of the Nolan principles (regarded as "the basis of the ethical standards expected of public office holders" in the UK) were complied with by EPSRC and HEIs in this competition.

Conclusion

There has been significant change in the CDT scheme, most notably that less than one third of these "centres for excellence" were funded again, and there was a 20% real-terms cut in funding. However, EPSRC have not provided an explanation for the changes that have been made, or indeed why the wider scheme is being retained in something resembling its present form. At present, it is difficult to see how the scheme can be value for money (or "investing wisely", as EPSRC put it to the Information Commissioner). We look forward to EPSRC explaining how it took its decisions in due course. In the meantime, we congratulate the newcomers to the CDT club, as well as the improvers.


Appendix A: Table of Renewals

This table is a summary of the renewals we have been able to identify.
The data for each column of the table have been assembled as follows.
  • The following columns come directly from EPSRC's own website: the CDT names and PIs (both 2014 and 2019) and the institution.
  • The Score is computed by mapping EPSRC's descriptors as follows (4 is best, 1 is worst): "Good" = 4, "Good/Satisfactory" = 3, "Satisfactory" = 2, "Interview" = 1. This is the same mapping used in our Part 1 blog.
  • The Cost of each centre is the raw award figure as published on EPSRC's 'Grants on the Web'; this was scraped automatically using a MATLAB script.
  • The Renewal in 2009 column was determined by examining the list of CDTs funded in the previous competition: a centre with the same PI or effectively the same title was marked "yes", and a university that had not previously been awarded a centre in this domain was marked "no". The remaining centres were classified by examining the CDT websites and abstracts on Grants on the Web to determine whether they had previous cohorts.
  • To determine whether a centre was a renewal in 2019, we included all centres from the same institution where there was one or more of: (i) the same PI, (ii) the same or a similar CDT name, or (iii) identification as a renewal in the outline proposal abstract. If there are any centres we may have overlooked, please do email us.
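
The renewal rule described above can be sketched as a small function; the field names and the crude name-similarity heuristic below are our own illustration, not the exact procedure we ran:

```python
def similar_names(a, b, threshold=0.6):
    """Crude similarity: overlap of long words, ignoring case (illustrative)."""
    wa = {w for w in a.lower().split() if len(w) > 3}
    wb = {w for w in b.lower().split() if len(w) > 3}
    if not wa or not wb:
        return a.lower() == b.lower()
    return len(wa & wb) / min(len(wa), len(wb)) >= threshold

def is_renewal_2019(centre, previous_centres):
    """Classify a 2019 CDT as a renewal if any previous centre at the same
    institution shares its PI or has a similar name, or if the outline
    proposal abstract itself identified the bid as a renewal."""
    if centre.get("renewal_in_abstract"):
        return True
    for prev in previous_centres:
        if prev["institution"] != centre["institution"]:
            continue
        if prev["pi"] == centre["pi"] or similar_names(prev["name"], centre["name"]):
            return True
    return False
```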