The Danger of Nostalgia: Historicising the Early-Career Debate by William Whyte

In a rather heated moment during the recent early career ‘twitterstorm’, History Lab Plus offered to host any responses to Matthew Lyons’s original piece that were constructive and sought to move the debate on. We were pleased, as historians, to see that William Whyte, a historian of universities, answered calls for a historical perspective on the debate. His response is below. We continue to be open to any constructive responses from all. E-mail us at historylabplus@gmail.com to contribute.

Historians need to be careful when they extrapolate from their own experience. History may be a broad and catholic discipline, but at its heart is surely a basic attempt to historicize: to relate past and present; to situate individual experience within broader, historical categories and trends.

In his article for History Today, Matthew Lyons raises an important issue, articulating some complaints of the early career researchers he has encountered, and (though he does not acknowledge it) also echoing many other voices within universities who are similarly concerned about the conditions in which temporary staff of all sorts currently have to work.
But in order to make his argument, he falls back on a fundamentally presentist, ahistorical – indeed anti-historical – peddling of myths. Like so many people writing about universities, he wholly ignores history and instead invents an idealized past – a past which, of course, never existed.

He’s far from alone in this. ‘Nostalgia is a big thing in academia’, observed Diana Warwick, head of the Committee of Vice Chancellors and Principals, in 1996; ‘everything is worse today’. And it’s true that even those who ought to know better are prey to the myth of the golden age – indeed, when the London Review of Books turned its attention to life within Britain’s universities in December 2011, it found a variety of laments and a range of dates identified as better than now, from Keith Thomas’s celebration of the mid-1950s as a ‘golden age of academic freedom’ to Rachel Malik’s description of the mid-1990s as a ‘wonderful’ time to work in academia. In the last ten years or so, a variety of different writers have favourably compared every post-war decade with the experience of life in the twenty-first century university.

If the past is always golden, the present is always dark. Hence the language of decline, of betrayal, and crisis. In his lecture to the Australian Historical Association on 7 July, Peter Mandler challenged the notion of a broader ‘crisis’ in the humanities, pointing out that ‘It is hard to take too seriously talk of a crisis in Britain when even by the narrowest definition of the humanities the absolute number of humanities students has increased fivefold since 1967, and by the broader definition almost 10-fold’. One might do the same for Matthew Lyons’s notion of a ‘great betrayal’ of early career researchers now.

Of course, for all those individuals trapped in nine-month contracts (or worse), and for those who have a PhD but no job – and seemingly little likelihood of getting one – this will seem like a crisis. As Harold Wilson is said to have observed, for the individual unemployment is 100 per cent. If anyone was ever lured into postgraduate work by the promise that they would find it easy to become a tenured member of a university, then they are quite right to feel betrayed. Nor must we make the equivalent mistake of arguing that everything is fine now, and that there is nothing to complain about. That isn’t true, either. The work of History Lab Plus illustrates just how much more universities ought to be doing for their temporary staff.

Nonetheless, it is worth challenging the notion that there was a time in which postgraduate funding was easy to obtain and permanent jobs were just there for the taking. The Robbins Report of 1963 noted that each year only 300 state-funded scholarships were available for the arts, humanities, and social sciences in the whole of the UK. In the 1980s, there were only 200 funded places for PhDs in sociology and just a few more for history.
Nor were jobs easy to obtain. Far from being, in Matthew Lyons’s words, ‘a generation which benefited from free education to degree level and generous support into postgraduate study’, the most senior staff at most universities today were amongst the very few people who managed to gain employment in the far darker days of the 1980s. Public expenditure cuts after 1981 led to the loss of about 5,600 academic jobs and the creation of very few new posts. Even the Thatcher government realized that this was creating a demographic crisis in which a generation of postgraduate students would be lost forever, and in 1983 the ‘New Blood’ scheme was created to hire some of the very best. It is an index of just how little this benefitted the humanities that in its three years of operation, only 66 posts were created for the arts and 78 for the social sciences. An over-privileged generation, this was not.

In that sense, our current problems are not the product of cuts, or decline, much less betrayal. They are rather the consequence of success. The number of universities, of university teachers and researchers, and of postgraduate students has simply grown and grown. Moreover, in recent years, a rise in university income, driven by tuition fees and by increasing research funding, has enabled institutions to employ a growing – not a declining – proportion of permanent staff. By 2000, one survey estimated that half of all academics were on temporary contracts; in 2010, HEFCE produced figures suggesting that nearly 90 per cent were now in permanent employment. Indeed, about half of these people would previously have held at least one temporary job.

It is this remarkable growth that causes us so many dilemmas. Last year no fewer than 539,000 people were enrolled as postgraduate students, compared to 135,000 in 1994 and around 10,000 thirty years before that. It is quite clear that, even were universities to continue to expand, there is no way in which all of these people can expect to be employed as full-time, tenured academics. It would be quite wrong for anyone to promise any one of these people that their path to a university job would be smooth or easy.

But therein lies our difficulty. Twenty or thirty years ago, most of these people would not have been able to undertake postgraduate research. Many of the institutions they attend had no history department and no tradition of taking research students at all. If the problem is the overproduction of PhDs, then which are the institutions that should be stopped from training doctoral students? Who are the individuals who should be prevented from starting research? Or do we just have to accept that there is a structural imbalance, and that most doctoral students will not go on to academic work?

It is a valid debate, but not an easy one. And it’s not one that will be helped by bad history or accusations of bad faith.


Don’t Call It Lucky: A Personal Perspective on the ‘ECR Debate’

N.B. This guest post was sent to us by an ECR who wanted to contribute to the debate but was unsure about doing it in their own name. We are happy to publish responses, whether anonymous or otherwise.

This post is about the ECR issues that have been convulsing Twitter. It is a story of personal experience, and as such it does not answer the very sensible calls for systematic, evidence-based debate. But I do think that stories of personal experience have a place in this discussion, not as an alternative to the wider trends we need to identify, but in terms of addressing how individuals are dealing with these problems, and what range of solutions might be available.

I have some ideas, all of which stem from my identity and experience, but perhaps that identity and experience is worth thinking about. I hope my colleagues can understand that I have chosen to write this anonymously, even though it is a deeply personal post.

Let me begin, then, by saying that I do not deal well with uncertainty. No need to paint a picture, but I book travel a long time in advance, arrive everywhere early, and like routine. I am a lousy gambler.

Now put this character sketch into a situation: the situation of the early-career researcher. When I was finishing my doctorate, I started applying somewhat haphazardly for junior fellowship positions at Oxford and Cambridge. In the end, I got a six-month research fellowship, and spent most of it applying for the next round of things: more postdocs, the same junior fellowships, but also teaching positions both temporary and permanent, and the British Academy. I put out feelers about the Leverhulme, the Wellcome Trust, the AHRC.

I managed to pick up some teaching to carry me through another year. And I started applying again.

This was the worst time for me. The same schemes where I had had limited success before (interviews, personalized rejections) were now rejecting me outright.
Did I mention that I do not deal well with uncertainty?

Around February, things started to look up. A Cambridge college wanted to interview me for a three-year fellowship, and – to my surprise – another university wanted to interview me for a permanent lectureship.

Long story short: I got the permanent job, a year after I finished my doctorate.

You could call this lucky, but it isn’t.

I know what you’re thinking, but I can get this out of the way pretty quickly: the obvious alternative to ‘luck’ in my story is not ‘hard work’.

Hard work is not optional on the lower rungs of the academic ladder.

I have known many different people from a variety of institutions and backgrounds, and ALL of them, without fail – whatever they sometimes seem to fear – work very hard indeed.

I suspect that very few people who don’t work hard make it to graduate study and then to the end of the doctorate. (I do not mean to insult anyone who does not finish: there are many, many reasons not to complete a doctorate, and I doubt a lack of effort is a particularly common one – quite the opposite!)

All I mean to say is that the problem – as I see it – is not that ‘luck’ is more important than ‘hard work’ in the current system.

So, you could call my story lucky, but it isn’t.

It is privileged.

I am privileged to have held positions and had income in the months after my doctorate; I am privileged to have worked with excellent people at prestigious institutions; and I am not under any illusion that my journey has been tough compared to that of many others.

I freely admit it: I have had a privileged education (Oxbridge most of the way) and the support (including financial) of my family throughout. I have not had extensive responsibilities caring for family members or raising children. I have never felt discriminated against because of the colour of my skin or because of my sexual preference, gender identity, or a physical or mental disability.

My story is the story of one good-enough person among a great many, whose success – I like to think! – did require talent and determination, but also actively relied on these privileges in many complex and important ways.

I am not saying this to boast. Far from it – hence the anonymity.

There is nothing to be proud of in the fact that the system I was born into, and the place I have gradually taken in it, rewards individuals not simply on merit, as we sometimes like to believe, but through a complex cultural-symbolic maelstrom of identity issues.

If I’m not saying it to brag, then why bother?

Because a lot of relatively junior people I know who are in relatively secure jobs in academia seem to feel (like me) that the recent spats triggered by Matthew Lyons’s piece for History Today are not helping the discussion, but instead alienating people who should be involved.

So I did not, by any means, have the toughest transition from doctorate to relatively secure employment, but there was still a transition, there was uncertainty, and I did personally deal with some of the problems that are at the heart of these arguments.
And as someone who does now have relative job security I am still deeply concerned by these problems.

I think there are structural problems with the current system, which cannot be reduced to ‘too many PhDs, not enough jobs’. The oversupply of excellent ECRs is a catalyst that allows institutions and the system as a whole to exploit junior academics and perpetuate unfair disadvantages, but I do not believe that oversupply justifies exploitation and injustice.

And let’s not kid ourselves. Some people really do believe that. It’s a tenet of the neoliberal attitude, which holds that the ECR situation will ‘correct’ itself once students realize that academia is not an economically attractive career.
Bullshit.

The problems of precariousness and the role of privilege in this system should be corrected by sustained efforts by academics to press institutions for specific, concrete changes that make academic careers fairer: by which I mean accessible to people from a range of backgrounds, to people with different responsibilities as carers and parents, and to people with different needs and requirements.

How? These are my suggestions. The value of speaking from personal experience is that I – like others – can explore the things that have helped me as I went through the harder times.

My guiding idea is that some of the privileges I have had from financial support and from prestigious institutions should be expectations.

ECRs should reasonably be able to expect support of various kinds: this should not be a privilege limited to people who come from the right place and fit a certain image.

1 Get ‘em early

We could begin by going further back, very briefly. Everything I know from the excellent work of the Sutton Trust suggests that a deep cause of some of these inequalities lies well before university. We need a less unequal education system now more than ever, and yet, now more than ever, we have a government that is fundamentally opposed to the measures that would encourage a wider range of young people to arrive at the age of 22 or so thinking that academic research is a possibility, and a desirable one. Hell, we lack a government that is committed to encouraging mature students to feel that way, too.

These are debates (or despair?) for another day, because there are things we can be doing within academia to address these problems, too, even if the underlying issues of education and inequality have deeper roots…

2 The Code of Good Practice

I see it as part of my responsibility to ensure that the institution I work for meets the suggestions in the Royal Historical Society Code of Good Practice for temporary contracts, drawn up with History Lab Plus, available here: http://bit.ly/1Ei6CB0
This, from the first page, a thousand, thousand times: “for those on short-term teaching contracts a few simple things, such as being included on email lists and invited to seminars, could make a big difference to their experience of a department.” The code goes on to suggest a short and straightforward list of things departments can do to make working in a temporary job a better situation than it sometimes is now. Having an office and being included in meetings and research discussions should not be ‘perks’.

They should be expectations.

3 Making job applications less onerous

I would only add that – in my experience – some institutions could be more sensitive to the position applicants are in when they apply for jobs. Is it really reasonable, in the situation in which we find ourselves, for every single institution in the UK to have a different hiring process? How much time is wasted producing tailored research or teaching statements for jobs at universities that never even have the courtesy to formally reject your application?

Could we have a national, standardized system (not unlike UCAS or the NHS jobs system)? There are obvious risks. Departments might complain that they will be swamped with even more applicants than they get at the moment, and that this will make their task harder.

Well, from what I can see, the problem here is not how hard it is for those of us who are looking for new colleagues. The problem is the immense pressure these potential colleagues are coming under from a variety of angles. And a standardized application system might be one way to reduce some of that pressure.

And let’s not forget that the people who would have a tougher job sifting through applications under this nationwide model are the same ones who are currently besieged with demands for references. How would you like to write just one reference a year per candidate instead?

Of course there are other problems with this idea. Who will decide what the application should include? I suspect we would never reach any agreement on what it would look like.

And how would applicants tailor their pitch to the institution?

Well, not very much. Or rather, they would only get one shot. They’d have to decide, in the abstract, what kind of institution they wanted to go to, and then write a pitch general enough that those kinds of institution would be interested in them.
What else could institutions do to treat applicants better? Well, I understand that personal letters to every single applicant may simply never be possible, but I will just say that every single time I got a personalized rejection, it meant a huge amount to me. Several times I asked for feedback from unsuccessful interviews and got none. I think that is broadly unacceptable. Colleagues have suggested that people worry about complaints, even court cases, if they are too honest about why candidates are not good enough. I do not think this can be the main problem.

I wonder whether institutions and the academics within them can really be happy with a system where the criteria that exist on paper do not have to be addressed in decision-making. (I am thinking especially of Oxford and Cambridge, where the depressing spirit of academic libertarianism – both right and left wing – reigns supreme).

But let’s be clear about this: a system where departments have no accountability, even to the people who came very close, for why they chose someone else… that is a system where privilege will continue to dominate.

4 Eliminating temporary contracts where possible

Beyond these thoughts about making life easier for people working on temporary contracts and helping job applicants, I am broadly committed to the idea that no job should be done by temporary staff that could be done by permanent staff.
No one who has experienced or witnessed (for instance) the misery of nine-month contracts – where staff are barely paid to prepare teaching materials, receive no pay at all over the summer, and at best face possible renewal for another nine months – can justify the use of temporary staff where they are not absolutely necessary. Of course temporary staff may be needed to cover periods of research leave, maternity and paternity leave, or illness.

But I believe one of our goals must be to arrest the slide of the UK system towards a US-style adjunct system.
I have too many thoughts about this to contain in one post. I want to end on a positive note, although this is easier said than done.

Academia can be fantastic, and that is why so many people want to be academics. But I would hardly be the first person to point out that this does not mean that any of them should have to work in financial insecurity.

Nor would I be the first to point out that the blame game turns very hurtful very quickly, not least since experiences of being an ECR are clearly hugely variable.
Senior academics who feel they ARE doing their best to help junior colleagues might rightly feel pretty annoyed with recent finger-pointing. Mid-level academics with some security can quickly feel alienated from conversations because they are no longer the ones who have to deal with everyday insecurity. But as a group with BOTH 1. recent experiences (never underestimate how insulated other academics might be from what the job market is really like) and 2. a little more power within the system, surely these early-to-mid-level people must be key to making sure that change happens?

Getting angry about people who got desirable jobs without having to do X or Y blames those job-seekers (your former peers!) for systemic problems.

As the History Lab Plus Twitter account put it: ‘Let’s take the energy of this debate and channel it into working together as historians to make things better.’

So this is my declaration of hope, and also shame.

Because of course it would be easy to feel somewhat divorced from the struggle once the uncertainties are less present. My attention turns towards new uncertainties: my book, articles, the REF, research funding, teaching, the godawful Teaching Excellence Framework – to mention only my professional concerns.
I’m ashamed of that temptation and do honestly want to continue helping to reform the current situation.

And I feel bad for times I have made light of privilege in academia.

I think there is a very real problem whereby junior people are devalued in ways that make them cynical. Cynicism turns to sarcasm, and I might find myself joking about how so many jobs go to Oxbridge graduates, for instance.

It isn’t funny.

In fact, I’m ashamed of those tendencies towards cynicism in the darkest part of my in-between time, because, after all, I am a beneficiary of a system that is demonstrably unfair.

The only avenue for hope in the face of this shame must be: we can change this.