The Danger of Nostalgia: Historicising the Early-Career Debate by William Whyte

In a rather heated moment during the recent early career ‘twitterstorm’, History Lab Plus offered to host any responses to Matthew Lyons's original piece that were constructive and sought to move the debate on. We were pleased, as historians, to see that William Whyte, a historian of universities, answered calls for a historical perspective on the debate. His response is below. We remain open to constructive responses from all. E-mail us at to contribute.

Historians need to be careful when they extrapolate from their own experience. History may be a broad and catholic discipline, but at its heart is surely a basic attempt to historicize: to relate past and present; to situate individual experience within broader, historical categories and trends.

In his article for History Today, Matthew Lyons raises an important issue, articulating some complaints of the early career researchers he has encountered, and (though he does not acknowledge it) also echoing many other voices within universities who are similarly concerned about the conditions in which temporary staff of all sorts currently have to work.
But in order to make his argument, he falls back on a fundamentally presentist, ahistorical – indeed anti-historical – peddling of myths. Like so many people writing about universities, he wholly ignores history and instead invents an idealized past – a past which, of course, never existed.

He’s far from alone in this. ‘Nostalgia is a big thing in academia’, observed Diana Warwick, head of the Committee of Vice Chancellors and Principals, in 1996; ‘everything is worse today’. And it’s true that even those who ought to know better are prey to the myth of the golden age – indeed, when the London Review of Books turned its attention to life within Britain’s universities in December 2011, it found a variety of laments and a range of dates identified as better than now, from Keith Thomas's celebration of the mid-1950s as a ‘golden age of academic freedom’, to Rachel Malik’s description of the mid-1990s as a ‘wonderful’ time to work in academia. In the last ten years or so, a variety of different writers have favourably compared every post-war decade with the experience of life in the twenty-first century university.

If the past is always golden, the present is always dark. Hence the language of decline, of betrayal, and crisis. In his lecture to the Australian Historical Association on 7 July, Peter Mandler challenged the notion of a broader ‘crisis’ in the humanities, pointing out that ‘It is hard to take too seriously talk of a crisis in Britain when even by the narrowest definition of the humanities the absolute number of humanities students has increased fivefold since 1967, and by the broader definition almost 10-fold’. One might do the same for Matthew Lyons’s notion of a ‘great betrayal’ of early career researchers now.

Of course, for all those individuals trapped in nine-month contracts (or worse), and for those who have a PhD but no job and seemingly little likelihood of getting one, this will seem like a crisis. As Harold Wilson is said to have observed, for the individual unemployment is 100 per cent. If anyone was ever lured into postgraduate work by the promise that they would find it easy to become a tenured member of a university, then they are quite right to feel betrayed. Nor must we make the equivalent mistake of arguing that everything is fine now and that there is nothing to complain about. That isn't true, either. The work of History Lab Plus illustrates just how much more universities ought to be doing for their temporary staff.

Nonetheless, it is worth challenging the notion that there was a time in which postgraduate funding was easy to obtain and permanent jobs were just there for the taking. The Robbins Report of 1963 noted that each year only 300 state-funded scholarships were available for the arts, humanities, and social sciences in the whole of the UK. In the 1980s, there were only 200 funded places for PhDs in sociology and just a few more for history.
Nor were jobs easy to obtain. Far from being, in Matthew Lyons's words, ‘a generation which benefited from free education to degree level and generous support into postgraduate study’, the most senior staff at most universities today were amongst the very few people who managed to gain employment in the far darker days of the 1980s. Public expenditure cuts after 1981 led to the loss of about 5,600 academic jobs and the creation of very few new posts. Even the Thatcher government realized that this was creating a demographic crisis in which a generation of postgraduate students would be lost forever, and in 1983 the ‘New blood’ scheme was introduced to hire some of the very best. It is an index of just how little this benefitted the humanities that in its three years of operation, only 66 posts were created for the arts and 78 for the social sciences. An over-privileged generation, this was not.

In that sense, our current problems are not the product of cuts, or decline, much less betrayal. They are rather the consequence of success. The number of universities, of university teachers and researchers, and of postgraduate students has simply grown and grown. Moreover, in recent years, a rise in university income, driven by tuition fees and by increasing research funding, has enabled institutions to employ a growing – not a declining – proportion of permanent staff. By 2000, one survey estimated that half of all academics were on temporary contracts; in 2010, HEFCE produced figures suggesting that nearly 90 per cent were now in permanent employment. Moreover, about half of these people would previously have held at least one temporary job.

It is this remarkable growth that causes us so many dilemmas. Last year no fewer than 539,000 people were enrolled as postgraduate students, compared to 135,000 in 1994 and around 10,000 thirty years before that. It is quite clear that, even were universities to continue to expand, there is no way in which all of these people can expect to be employed as full-time, tenured academics. It would be quite wrong for anyone to promise any one of these people that their path to a university job would be smooth or easy.

But therein lies our difficulty. Twenty or thirty years ago, most of these people would not have been able to undertake postgraduate research. Many of the institutions they attend had no history department and no tradition of taking research students at all. If the problem is the overproduction of PhDs, then which are the institutions who should be stopped from training doctoral students? Who are the individuals who should be prevented from starting research? Or do we just have to accept that there is a structural imbalance and that most doctoral students will not go on to academic work?

It is a valid debate, but not an easy one. And it’s not one that will be helped by bad history or accusations of bad faith.


5 thoughts on “The Danger of Nostalgia: Historicising the Early-Career Debate by William Whyte”

  1. Thank you for an important post. It seems to me correct that accusations of bad faith, or bad history, won’t help. I found the questions with which the post concludes interesting. I have set them out as numbered points:

    If the problem is the overproduction of PhDs […],

    1) which are the institutions who should be stopped from training doctoral students?

    2) Who are the individuals who should be prevented from starting research?

    3) Or do we just have to accept that there is a structural imbalance and that most doctoral students will not go on to academic work?

    I note 1) “stopped from”, 2) “prevented from” and 3) “will not go on”.

    Is it possible to think of questions with positive connotations too?

    For example, if the problem is the overproduction of PhDs, might we instead ask the following alternative questions?

    1) Should/could institutions be training doctoral students better/differently for the current work landscape in the UK and abroad?

    2) Who are the individuals who should be encouraged to start research, and in which situations would it be ethical to prevent anyone from starting research in ways that current methods of selection do not already contemplate?

    3) How can any existing structural imbalances be tackled, so that most doctoral students will go on to employment (academic or not) that fulfills their personal and professional expectations?

    Moreover, should/could we not also ask whether the academy could operate without partially paid or unpaid labour, or without relying on individuals’ ability to support themselves, fully or partially, while appropriate positions are eventually (hopefully) made available?

    Should we turn the problem back on grad students (“you should have known life is hard”), or should we take a hard look at ourselves and aim to come up with positive solutions to what many experience as a very real problem, regardless of whether it has always been hard anyway?

  2. Pingback: The job market for historians: some data, 1995-2014 | the many-headed monster

  3. Pingback: The Here and Now | williamgpooley

  4. Pingback: The debate in History over early careers | Fighting Against Casualisation in Education

  5. Pingback: What We’re Reading: Week of September 4th | JHIBlog
