consciousness and the sausage factory
In the first part of this article we looked at the individual characteristics likely to be needed for making a revolutionary breakthrough in understanding of the mind-body problem.
To the four qualities already outlined in part 1 let us add a fifth, for which there seems to be no better word than chutzpah. In order to solve (say) the riddle of gravitation, or the black body problem, you need to believe you can solve it where others – ostensibly greater than you – have failed.
Does the presence of chutzpah, in combination with the other four qualities, guarantee success? No. As in other areas of achievement, a belief that one can win the competition is a necessary, not a sufficient, condition of success.
Look at it from the selfish gene’s point of view. A tendency towards exceptional achievement is a high-risk, high-reward strategy. Most of the time, the conditions will be too unfavourable to allow the potential to be fully expressed. Hence the individual will often end up doing worse than average, given that they’re probably not well adapted for a conventional life – the price paid for being unusual. Even when conditions are favourable, luck generally seems to play a role. History mostly records those to whom Lady Luck was kind. But there were many more misses than hits, now largely forgotten.
This is something which putative patrons – at least those who wish to go beyond the predictability of the sausage-factory model – need to bear in mind.
We should perhaps note that there are different types of chutzpah. Some types, involving a belief that one’s superiority is socially assigned, and that one can succeed because other people say one can, may be unhelpful, at least for this purpose. Other types may be concealed by superficial diffidence, and so go unnoticed.
I recently watched an interview with jazz rocker Donald Fagen of Steely Dan fame, in which he described his teenage self disparagingly as
somewhere between a nerd and a schmendrick.
Talented people often seem to start their adult life as social outsiders.
Once upon a time, when university intake was more restricted, social outsiders made up a relatively high proportion of the student population. To say that college provided them with a congenial environment in which they could unfold their true selves, and express the full extent of their abilities, would probably be overstating it. Nevertheless, it was perhaps in most cases a more sympathetic world than their schools had been, and many of them did flourish there.
Some of these outsiders were extreme cases in that they cared only, or primarily, about their academic life, to the exclusion of other things. They were dedicated to physics, or classics, or Shakespeare, or whatever. Their emotions – the ‘correct’ purpose of which, according to most biologists and psychiatrists, is relationships and social life – were dominated by their intellectual preoccupations. Some of them were capable of making significant intellectual progress. A few of that subgroup actually went on to do so, if they managed to survive the obstacle course.
Is there still room for people like that in the modern university? I am not sure.
With degree “massification” having shifted the position of the average student towards the middle of the bell curve, there is more of a mediocre feel to campuses these days. But it’s not just demographics. There is a sense in which universities – like other organisations in the public eye – now seem to regard being ‘normal’, ‘healthy’, ‘modern’, ‘well-adjusted’, or however they like to put it, not just as one of their selling points but as a value they are morally proud to embody. And there are plenty of figures on campus nowadays to ensure that all this normality, health, modernity, adjustment etc. is maintained. Those who fall foul of the correct attitudes may be encouraged – or pressured – to seek ‘help’, make a complaint, or attend a session to sort them out. If things don’t improve they may be asked to leave, though it doesn’t usually come to that. Once you’ve lost the support of your teachers, you soon find yourself on the slippery slope to academic oblivion, without any need for formal ejection.
The top universities are no different in this regard, and anyway do not differ much from the others these days, in terms of ethos or atmosphere. Critics of Oxbridge may like to invoke the Brideshead image (which, under egalitarianism, has become a convenient stick) but the critics in question are either being dishonest or can’t have visited those places lately. What is perhaps best described as diversified conformity – but is often called ‘individualism’, a splendid example of inversion – has become the fashionable value. Cultural institutions seem to enact the fashionable values harder than everyone else, as perhaps they have always done.
It seems to be part of a wider trend. Once, soap operas like Coronation Street and other TV series like Friends had a distinct flavour, which you could take or leave, but which was at least trying to be unique and characterful. Now most soaps look and feel the same. The same is true of US crime series – in fact more or less all US drama series. The characters in them, and the actors playing them, seem perfectly interchangeable. They all say the same kinds of things, in the same kind of way.
Even seasoned actors, clearly capable of more characterful roles, seem to be pressed by directors and scriptwriters into performing formulaically. The motive is unclear, but presumably homogeneity is less threatening. “Yes, I know you thought Jeremy Irons, Alec Baldwin, Andie MacDowell, etc. were special – but see, they’re just like everyone else!” 
Other cultural sectors – from universities to museums, opera companies to nature reserves – have gone down the same path of homogenisation.
So your choice at college these days is simple. Be ‘normal’ (this comes in a number of slightly different flavours); or become stereotyped as a member of a minority group popularly known as “dorks”.
If you’re lucky, being a “dork” won’t be held against you when they write a reference for your PhD grant application, or when they hand out post-doctoral positions. On the other hand, your fate may end up in the hands of one of those professors who dislike people being too obsessional, or insufficiently sociable. In that case, your goose will be pretty much cooked.
Let us switch attention to the kind of working environment that a putative innovator would need, in order to have a chance of making a breakthrough in the understanding of consciousness. What conditions would make it possible? I think they are conditions that, from a contemporary perspective, would be regarded as extreme.
First, you would have to have your board and lodging requirements taken care of, to a reasonable standard of comfort. Second, your time would have to be free from all the usual requirements of a job, and preferably also from the requirements of meal preparation, cleaning, repairs, general administration etc.
You would probably need at least a year to make fruitful progress, but you would not want to be under any pressure to come up with a ‘solution’ by the end of that time. Pressure to produce results is bad for revolutionary thought. Therefore, third, you would need to have your position guaranteed for at least five years, or three as an absolute minimum.
Is there any activity which would not detract from devoting as much of your brain as is needed to one of the most intractable problems in the history of science? Possibly a very light teaching load – maybe three or four hours a week, provided the topic taught is pertinent to your area of research. So fourth, no requirement to perform duties beyond a minimal level of this kind. Certainly no managerial or other administrative duties. (And definitely no office politics.)
The ‘solution’ to the problem, if/when you get it, may well involve a radically different way of thinking which bears little relation to what has gone before, and possibly none at all to prevailing approaches. Moreover, your first year is likely to involve many false starts and wrong turns. So for those reasons it’s important that, fifth, there be no expectation to publish anything, or even discuss your thoughts with others, before the end of the three- or five-year period. You may of course choose to talk about your ideas, or even publish something for purposes of feedback, but there should be no requirement to do so.
Given the peculiar nature of the topic to be researched, we must add a sixth, more qualitative condition. The person should be free to be as antisocial as they wish, including shunning all forms of social contact altogether. No requirement to be friendly, chummy, communicative or convivial.
In order to think clearly about the mind-body relation, one needs to be able to eliminate all preconceptions. Social interaction, a good deal of which involves emphasising shared assumptions and reinforcing prejudices, is likely to be highly distracting from the most abstract kinds of reasoning, and therefore unhelpful, particularly on a topic that is intimately bound up with the question of what it means to be a person.
Not only that. Because mutual reinforcement of shared assumptions is a pleasurable and possibly essential component of psychological health, and because, conversely, overt departure from shared norms causes discomfort, you may well find that others have a definite wish to nudge you off your perch, and return you to the world of common sense.
Putting these requirements together, we arrive at a potential appointment with characteristics that appear outrageously demanding by current standards. No obligation to do anything, beyond a perfunctory amount of teaching, for five years. No need to publish anything throughout that time. All one’s needs taken care of. Permission to be reclusive to an extreme degree.
Are there still slots in academia even remotely resembling this image, for people who do not already have established careers? There were only ever a few, and new ones seem unlikely to be created in the current climate, when everyone is expected to show they are providing value for money. Would our aspiring neuroscientist-philosopher be able to write a convincing “impact statement” for his grant application to the Biotechnology and Biological Sciences Research Council, for example?
A good Pathways to Impact statement should:
• be project-specific, with very clear deliverables,
• describe societal and economic deliverables and milestones, instead of focussing on just scientific deliverables,
• plan to deliver activities pertinent to the project, instead of focusing on track record or routine activities for a University post,
• consider broader beneficiaries, the likely impact on them, and appropriate mechanisms for realising these potential impacts,
• be focused on knowledge exchange and impact generation rather than narrowly focused, end-focused or purely for dissemination purposes,
• be clearly laid out in terms of timelines, indicating when each impact activity will be carried out.
A possible example of a position where you would not have to convince a committee that you adequately ticked these kinds of boxes is the Examination Fellowship offered by Oxford’s All Souls College. This is supposedly a pure research post, carrying few if any associated duties.
However, looking at the kind of essay which All Souls applicants have to write in order to impress the examiners with their intelligence, and at the list of people who have held the Fellowship (many of them going on to careers in politics or administration), one gets the impression that, although the holders were no doubt highly intelligent, they were not typically the kind of people to make major intellectual advances.
There may well be something doomed about attempts to provide positions specifically intended to facilitate advances, if the usual method of design by institution, and allocation by committee, is used. It’s more likely that the few cases where major advances were made possible by an unconditional academic post arose fortuitously, perhaps as a result of string-pulling. This is compatible with the observation that few recipients of awards such as the MacArthur Fellowship – popularly known as the ‘genius grant’ – go on to do anything very significant after receipt.
Further data is hard to come by on what has always been a fringe phenomenon. However, we can get some insights into attitudes to the no-strings post by looking at a related, weaker concept – that of tenure. In theory, tenure means an academic job for life, and is granted once a person has supposedly proven himself sufficiently capable.
Tenure is sometimes thought of as a North American rather than British system. However, something similar exists in a looser sense in the UK, in that appointments may be permanent in practice: one could expect to hold an appointment until retirement, with only serious misdemeanour or incapacity providing grounds for termination.
Fifty years ago, tenure in some form was the norm rather than the exception for academics in the US. It was assumed that you would – unless and until you actively wanted to decamp – remain with your institution for the duration of your career. As long as you fulfilled some basic teaching duties, you would have reasonable job security, and would not be fired merely for lack of research productivity, though you might not get promoted.
Tenure has come under attack in recent decades: since the 1970s it has drawn fire from critics of academia on both the Left and the Right.
Academics like to complain that the biggest threat to the university system is the philistinism of the free market; but a far more potent revisionist force has surely been egalitarianism. The idea of a select few having privileges that the rest of us don’t have sits uneasily with the new dominant ideology. Expansion of the university system – a programme predicated vaguely on ‘democratisation’ – was bound to mean that, when things had to be sacrificed to facilitate the required changes, it would be the things that least fitted with the democratising ideal.
Tenure is criticised by the Left for being available to some members of faculty but not others at a given institution, and at some institutions to a greater degree than at other institutions. It is criticised, more generally, for allowing certain people to be excluded from ‘accountability’ – an increasingly popular notion, according to which everyone, particularly those in the public arena, should be answerable to everyone else.
A higher education system that has been massively expanded using taxpayers’ money is unlikely to maintain the same commitment to a practice which does not yield obvious rewards – unless tenure is restricted to certain institutions, which again is at odds with the ideology.
As for the Right, many of them intensely dislike (with some justification) the blatant leftist bias of academia, particularly in the humanities. The result, after all, is that it’s the Left’s values which get to be inculcated into society’s future movers and shakers. The Right’s response to this (like their response in other areas where they feel they have lost the ideological battle) is to try to use democratisation against those espousing it. This is done by demanding greater accountability, more responsiveness to student preferences, more proof of money well spent, etc. Thus many on the Right are equally hostile to tenure, though it’s doubtful whether its complete abolition, and all academics having to work as wage slaves, would produce a significant shift in ideological bias.
One way or another, the ideal of tenure – in the US and elsewhere – seems on the way out, even if it ends up retained in name and in a castrated form. There is simply not enough support for it, even from academics themselves. This, for example, is what a liberal arts professor wrote in 1999, allegedly in defence of it. 
In many departments there are several thoroughly dysfunctional people hired years ago, faculty who repeatedly skip classes or otherwise fail their most basic responsibilities. Much more common still are enervated faculty who lack intellectual vitality or have long since stopped being up to date in their fields. Faculty members who haven’t read current scholarship in decades are legion. What is to be done?
A strict up-or-out system, one that fires people who have not fully proven their teaching or research capabilities during the probationary period, does occasionally overlook late bloomers, but that is finally a price we must pay for tenure’s benefits.
Reading this fifteen years after publication, and with the benefit of hindsight, it is hard not to see Peter Higgs – who came up with the Higgs mechanism ten years after his PhD – as a ‘late bloomer’. In other words, the kind of person who would now, under pressure from cost considerations, research rankings etc., be summarily booted out.
What kind of environment for academics remains, once ‘privileges’ like tenure have been stripped out, and the academy is compelled to acknowledge the ‘needs of society’ and ensure that each group of ‘stakeholders’ approves of what it is doing?
Quite possibly an inversion of the original. Instead of providing a refuge from normal demands in which novel ideas can be explored, we obtain a mechanism for penalising deviation from the consensus. This would certainly help to explain occasional reports suggesting that universities have now become one of the prime locations for workplace bullying.
In any case, the possibility that such an environment could allow a breakthrough to be made with regard to a major intellectual challenge, such as the mind-body problem, seems remote.
A strong desire to advance knowledge – stronger than the normal range of motives found (say) in the average office – is rare. Wiping out the few loopholes for exceptional individuals that remain in institutionalised academia eliminates its primary raison d’être, leaving behind a husk that generates plenty of volume but little in the way of meaningful content.
Academia has become mediocratised, resulting in the exclusion of those who don’t conform to the required models. My colleagues and I are prevented from contributing to debates on key issues in science, philosophy and social policy. Ideological bias means the debates are highly skewed towards maintaining the status quo.
The ability to comment on the web is not a substitute for mainstream publishing from the position of an academic post. We are waiting to start being productive academics. Our web writings are intended to draw attention to our existence and objectives.
We need financial support, and associates to help with our efforts. If the idea of individual academics operating without institutional endorsement seems strange, it’s a sign of how dominant sausage-factory thinking has become.
2. HBO’s True Detective was a recent exception.
3. Perhaps the most famous of All Souls’ past Examination Fellows was Isaiah Berlin. Arguably, however, his intellectual significance – beyond the seminal essay ‘Two Concepts of Liberty’ – has proven to be limited. Berlin provided an important example of the concept of ‘public intellectual’ but, like others who passed through All Souls, could not be said to have made major advances.
4. Cary Nelson, Professor of English at the University of Illinois, in Academic Keywords, Routledge, 1999, p.246 and p.244.
5. The initial exposition of the Higgs mechanism was rejected by Physics Letters as being “of no obvious relevance to physics”. It was later published in the rival journal Physical Review Letters.