Soul of Man Under Scientism: Part 2
We may ask ourselves if the human soul can survive in an increasingly ideological culture. But a better question may be: Can we learn to ‘see through’ scientistic ideology to the Truth of ourselves?
The pattern to be observed in the field of neuroscience mimics the tendency of science in general to avoid the question of meaning while implying that it is debunking it. Failing to address meaning is not the same as disproving it, and it is easy to disintegrate existing meanings by avoidance and neglect while not replacing them with anything equal or better. Equally, to ignore the requirement for meaning in pursuit of science may well itself be unscientific, since it risks unravelling the very processes of reason which have hitherto served our species in its quest for further knowledge.
Science itself is an aid in the search for meaning, but needs to keep a distance from the ideological phenomenon of scientism, which seeks a purely mechanistic explanation for everything, detaching itself from questions of meaning or other ‘moral’ considerations. It would seem axiomatic that a ‘science’ which dispassionately disintegrates useful understandings without supplanting them with more useful ones is unworthy of the name, which derives from the Latin word scientia, meaning ‘knowledge’.
What is most worrying about the drift of neuroscientific ambition is not that it represents an affront to current understandings — such as, for example, the religious understanding of human existence — but that it appears to offer a prognosis and prescription for humanity that really tends towards the antithesis of what the word ‘human’ currently evokes in us. It seems clear that in advancing the ‘progress’ project of our species in the direction currently indicated by this discipline, human beings calling themselves scientists engage in a process of auto-obliteration, which must by definition be the antithesis of true human progress. This drift raises questions of ethics, but it also harks back to a fundamental issue of philosophy: Is there a kind of knowledge that man is better off not having? Could it be right for humankind to cut itself off from forms of knowledge that might lead inevitably to the abolition of the species? In this question we are returned to Eden and to the earliest moral of the Bible, Adam’s and Eve’s folly revisited in a different age as a metaphor of human hubris.
The urgent issue for us right now may not so much be whether or what science will discover as how we might deal in our culture with the distortions of the aspirations of science through popular dissemination in the media, popular culture and at the level of mainstream public conversation.
Human certainty is not what our cultures have decided: a definitive clarity concerning facts and meanings. Rather, it is a process of building on what is known, verifying all the time, while moving forward in a confidence bolstered by the coherence encountered as we go. This misunderstanding of the meaning of certainty is at the heart of the false notion that there is some kind of split between science and religion.
There is a fallacy concerning scientific ‘understandings’: that the various disciplines of science amount to forms of ‘objective’ apprehension — rational, logical and overseen by fastidious peers who question every dot and comma of each new tablet of stone handed down by one of their number. For one thing, it has long been an accepted fact that scientific reports and studies have been afflicted by error, plagiarism, falsified data and outright fraud, and the instances of this are outlined in graphic and extensive form in the 1982 book, Betrayers of the Truth: Fraud and Deceit in the Halls of Science, by William Broad and Nicholas Wade.
Broad and Wade examine ‘the various means by which fraud is committed and how it can remain undetected for months, years, or even centuries’, and analyse how the lures of careerism, research grants and other inducements lead some scientists to betray their commitment to the discovery of truth, abandoning the lofty ideals of their profession. In their Preface, the authors state: ‘We believe that the logical structure discernible in scientific knowledge says nothing about the process by which the structure was built or the mentality of the builders. In the acquisition of new knowledge, scientists are not guided by logic and objectivity alone, but also by such nonrational factors as rhetoric, propaganda, and personal prejudice. Scientists do not depend solely on rational thought, and have no monopoly on it. Science should not be considered the guardian of rationality in society, but merely one major form of its cultural expression.’
The correct demeanour of the investigator in the face of any question or challenge is a mixture of skepticism and openness. Because science constantly rewrites itself it is not possible to ‘arrive’ anywhere, but only to move forward on the basis of hunches with an assurance that becomes the source of its own, provisional, certitude. Although the scientific approach cannot hope to describe an ultimate destination, it still leaves behind itself a line of no-return, allowing us a degree of confidence in understandings that offer a coherent if contingent hypothesis of reality. This, while remaining tentative, ought to confirm our sense of the astonishing order of reality and fill us with a greater sense of confidence concerning our own place in it, which is to say that, correctly pursued, it should add to our store of meaning. Confidence and humility, then, are ideal companions in the project of scientific discovery.
Science, far from being a repository of certitude, is continually in a state of intellectual flux, which requires the mind of the true scientist to be open to every possibility. Inflexibility and lack of openness lead to stultification, but so too does an excess of skepticism. Thus, there is a paradox whereby the scientist needs at once to be confident and skeptical. Unrelenting skepticism induces paralysis, but its absence leads to stasis.
Language is a source of traps and deceptiveness, and this is especially true of the quasi-mathematical language that science mostly relies upon, right up to the point of contact with conventional public comprehension. The process of popularising understandings of scientifically-treated phenomena often implies a concreteness at the level of words and concepts that is actually quite tenuous. Words can provide names and malleable concepts for things which are of their nature mysterious, but sometimes the words appear to ‘rescue’ phenomena from this mysteriousness, as though delivering them into the realm of the ordinary and the ownership of man, when in truth they do this by squeezing into ordinary language inflations, distortions and over-statements that have no place in such critical matters.
Often, the way scientific phenomena are named and described involves hidden acts of presumption. In the act of naming them, for example, the scientist can lay what seems to be a total claim through words to something that, though already familiar in outline form, is not fully comprehended, scientifically or otherwise. Fire, for example, can be defined as ‘rapid oxidation occurring between the oxygen in the atmosphere and some kind of material’ or ‘a chemical process arising from combustion’, but these definitions are essentially tautological and actually camouflage the fact that fire remains fundamentally mysterious. Everyone ‘knows’ what fire is, because they have seen it, have lit a fire with a match, or a cigarette with a lighter. But fire was discovered, not invented, and nobody really knows what it is in its essence — beneath ‘the science’. This irreducible quotient, therefore, retains an inscrutability that is incapable of being drilled into.
In invoking the concept of fire, a scientist trades off this irreducible mystery, and yet need not acknowledge its existence. Indeed, the impression is often given and taken that the phenomenon of fire is something itself generated by science. There are lots of cautionary myths and legends to warn us off doing such things, and one of them, a Grecian version of the Garden of Eden, occurs precisely in the realm of fire. The legend goes that Zeus, king of the gods, angry at having been tricked by Prometheus into accepting just the bones and fat of sacrificial animals — Prometheus retaining the meat for himself — withdrew and hid fire from humans. Prometheus, however, stole it and brought it back to Earth. From here comes the myth of Pandora’s box: Zeus created Pandora and sent her to Earth, with the intention that she marry Epimetheus, brother of Prometheus, which is what happened, despite Epimetheus having been warned by Prometheus. Again defying instructions from the gods, Pandora removed the lid from the box she had received as a wedding gift, and it was then that death, evil, disease, greed, envy, hardship and a thousand other miseries were loosed into the world. The troubles of the world therefore started with fire, and man’s determination to steal it from the gods.
Similarly, scientists talk about the ‘discovery of electricity’, which is an accurate description of what happened — i.e. electricity was found and understood by breathless man. But then something further happened: Man named electricity, put descriptions on its constituent processes and claimed the ‘discovery’ of the essence of the phenomenon as, in effect, his own creation. It is as if electricity in its natural existence had occurred only accidentally, or crudely, and, having been marshalled by man for his own use, could be claimed as an ‘invention’. Thus, the scientist creates a language and imagery of electricity which in effect colonises something preexisting and mysterious, recasting it as knowable and falling within man’s ambit and power.
A dictionary definition of electricity describes it as ‘a form of energy carried through wires and used to operate machines, lights etc’. Electricity, no longer a fundamental element of nature, becomes wires and switches rather than wondrous impulses arising from electrons and protons, which man has likewise found in the world. Then, next time he comes upon a similar process — for example in his own brain — man declares, ‘But this is just electricity’. In this way, he creates a circularity to elide mystery, and this, disseminated through the channels of popular conversation and education, gets rinsed down to something like, ‘But the human brain is merely electrical impulses!’
In theory at least, each new generation of scientists first verifies mechanistically, and afterwards takes for granted the work of the last. What was magical and mysterious acquires words and descriptions which appear to render banal, or at least familiar, what was once inscrutable. This is in some ways essential and probably unavoidable. The difficulty arises when hubris enters in.
After a while, some scientists, emboldened by the collective endeavours of their profession, tend to overlook the tricks by which their forebears have rolled the irreducible essences of fundamentally existing entities into words and phrases capable of insinuating authorship and ownership on behalf of the user. It is but a short hop from here to the use of terms like ‘pixie dust explanations’ to describe the tentative expositions of other disciplines — theology, philosophy — for phenomena which have not yet been claimed or conquered by science. It is as though their previous successes in comprehending embolden scientists to presume that the remaining mysteries will eventually not merely yield to their interrogations, but will ‘belong’ to them in the same way that, for example, fire and electricity have come to. Thus, science eliminates the mystery not merely of what was once totally unknown but also of what remains unexplained. Science takes out an intellectual mortgage on future discovery and claims everything now.
In truth, all explanations for everything under the sun are simply rational confabulations in earthbound language concerning phenomena that remain bedded in the mysterious. Nobody involved has ever actually ‘made’ anything, not excluding himself. At a certain point, going backwards through the history of knowledge, science runs out of road — with the origin of matter, with the question of first causes more generally, with the nature of consciousness, which appears to have no material basis and yet reposes in the physicality of the human person. Having words to clothe these deficiencies enables science to obscure its own relative ignorance. Having mechanistic explanations for the coherence of phenomena that have yielded to certain understandings by man gives them the appearance of existing at the level of rational phenomena, but this is largely a trick of positivistic thinking. Everything remains, relatively speaking, at the level of pixie dust, or, if you prefer, at the level of observable mystery.
For example, we look at a leaf and say, ‘Oh, that’s just DNA’. Rightly, this ought to prompt the question: ‘But what, really, is DNA?’ Having a name and a tentative understanding for something does not make it ‘ours’, nor render it the property of science, nor unlock its irreducible nature, nor predict all its characteristics in all circumstances. Indeed the statement is a tautology of the crudest kind.
It might be possible to encounter different worlds with different laws of physics and biology from ours. We have no way of knowing what manner of physical ‘laws’ obtain at the furthest end of the universe, or in the next one. Still, no matter how any such world might be found to work, it is inevitably going to have processes capable of being named, described and ultimately comprehended up to a point, if only functionally. It may be possible to imagine or stumble upon an entirely different set of mechanisms for how reality might function — as in science fiction — but these explanations too would ultimately have to be amenable to understanding, according to some kind of scientific principles. To suggest otherwise is to imply that there might somewhere be a version of reality characterised by incomprehensibility and nonsense. Even if such an alternative arrangement were utterly different to ours, and operated by completely different physical laws, it would still comprise a range of mechanisms and functions which cohered according to some set of pre-existing rules or laws. These laws are divined and tabulated by man, but they are not manmade. Naming them does not make us their masters, merely the observers of pre-existing phenomena.
When you delve into the state of modern thinking about subjectivity you stumble into what soon becomes recognisable as yet another attempt to debunk the religious view of reality. In this, as in other contexts, it seems as if the religious sensibility represents such an affront to the ‘rational’ mind of modern man that human thinking requires a very low level of evidential basis to become satisfied that it has transcended all questions of faith in God and a hereafter.
Of course, to achieve this, science has often discredited itself by engaging in ludicrous reductions of the religious idea, by setting up straw men and valiantly knocking them down. The soul of man, for example, is grotesquely caricatured in some scientific quarters, first reduced to a sub-rational phenomenon, and then dismissed as an affront to reason. Although science has so far failed in its efforts to debunk the idea of soul, we cannot presume this incapacity will endure, at least as far as the popular imagination is concerned. Fortified by the advance publicity concerning such an imminent deconstruction, our positivistic cultures nowadays adopt an increasingly metaphorical conception of ‘soul’, recognising only a residual linguistic tic: ‘He’s got soul’; ‘Upon my soul!’; ‘soul food’ etc.
Actually, the religious view of subjectivity is unreasonable only if you change the rules to exclude questions that, being unlikely in the normal run of events to be answered, increasingly tend to be elided. In fact, like many religious explanations for things, the religious perspective on subjectivity conforms to the naturalistic, commonsensical sensations and apprehensions that go with being human, with having a mind, with being inside a recognisable and distinguishable human ‘entity’ looking out at the world.
But, whereas science often provides a ‘counterintuitive’ understanding of the functionality of the human edifice, the religious view is essentially the one that might empirically be surmised from the totality of the visible and experiential evidence.
In his book The Religious Sense, and indeed throughout his life, the Italian priest and founder of the Catholic movement Communion and Liberation, Father Luigi Giussani, insisted to the point of irritation on reminding us that ‘we do not make ourselves’, which suggests, beyond the obvious, that nothing we have arrived at as an ‘explanation’ for either our origin or our moment-to-moment functioning is wholly persuasive in the way external phenomena may appear. We may settle on a biological explanation for the existence of an oak tree, but the same process, applied to our own being, strikes us as incomplete. There is ‘something else’, something that precedes its own self-understanding, and also exceeds in potential and capacity its own attempts to define the precise nature of that being. Giussani was also, up to a point, in agreement with some of the more radical neuroscientists in holding that, other than as a helpful metaphor, no ‘central intelligence’ can be discovered in the human — except that this led him to believe not that we do not exist but that we are more than simply ‘ourselves’. Giussani describes this as the generative force, which he called God. It is important to stress that he describes this in the present tense as a continuing process: He emphasises not that ‘I did not make myself’, but that ‘I do not make myself.’
‘If I descend to my very depths,’ he says, ‘where do I spring from? Not from myself: from something else. This is the perception of myself as a gushing stream born from a spring, from something else, more than me, and by which I am made. If a stream gushing forth from a spring could think, it would perceive, at the bottom of its fresh surging, an origin it does not know, which is other than itself.’
‘Here,’ he explicated in one of his talks to students, ‘we are speaking of the intuition which, in every period of history, the most intelligent human spirits have had. It is an intuition of this mysterious presence, which endows the instant, the “I” with substance (solidity, density, foundation). I am you-who-make-me, except that this “you” is absolutely faceless. I use this word “you” because it is the least inadequate in my experience as a human being to indicate that unknown presence which is beyond comparison, more than my experience as a human being. What other word could I, on the other hand, use? When I examine myself and notice that I am not making myself by myself, then I — with the full and conscious vibration of affection which this word “I” exudes — turn to the Thing that makes me, to the source that causes me to be in this instant, and I can only address it by using the word “you”. You-who-make-me is, therefore, what religious tradition calls God — it is that which is more than I, more “I” than I am myself. It is that by means of which I am.’
In The Religious Sense, Giussani elaborates that the human ‘I’ is that level of human nature in which nature becomes aware of not being made by itself. ‘In this way,’ he continues, ‘the entire cosmos is like the continuation of my own body. But one could also say that the human being is that level of nature in which nature experiences its own contingency. Man experiences himself as contingent, subsisting by means of something else, because he does not make himself by himself. I stand on my feet because I lean on another. I am because I am made. Like my voice, which is the echo of a vibration: If I cease the vibration, it no longer exists. Like spring water rising up — it is, in its entirety, derived from its source. And like a flower which depends completely on the support of its roots. So I do not consciously say “I am”, in a sense that conveys my entire stature as a human being, if I do not mean “I am made”. The ultimate equilibrium of life depends upon this. The human being’s natural truth . . . is his nature as creation — he exists because he is continually possessed. And, when he recognises this, then he breathes fully, feels at peace, glad.’
‘True self-consciousness is well portrayed by the baby in the arms of his father and mother — supported like this, he can enter into any situation whatsoever, profoundly tranquil, with a promise of peace and joy. No curative system can claim this, without mutilating the person. Often, in order to excise the censure of certain wounds, we end up censuring our humanity.’
The religious understanding, as captured by Fr Giussani, conveys better than anything else I have encountered the essence of what I have always understood about myself. It chimes with what I have experienced, with what I experience now, with what I know — not theologically, not ‘spiritually’, but actually, from living in my own skin in reality. It leaves no loose ends. It is credible and coherent. It is a near flawless description of the nature of the subjective self.
I do not make myself. I am You-who-make-me. If I look for the ‘I’ as the author of myself, then I rummage in the package and find nothing but wrapping. And yet there is obviously something within, an ‘I’ of some kind — one that does not seem to be the source of itself. Only if the ‘I’ that fires or beats or thinks can be seen as the projection of something beyond will anything begin to make sense. If the human being is contemplated only in terms of the deterministic and mechanistic processes in which the human mind excels, then a process of elision is necessary, to shut out the idea of the ‘ghost’ that must reside at the centre — heart — of this machine. This ‘ghost’ is at the heart of the ‘I’ — that essential, primary element of the human heart that the transplant surgeon cannot find.
Back in 1991, pursuing the neurological nature of pain, scientists discovered that pain is ‘created’ by the brain but moderated by the heart, and far from being a purely sensory experience, it comes associated with emotional, cognitive, and social components. These mechanisms operate to moderate the cognitive and emotional factors of pain, something along the lines of the heart acting ‘compassionately’ to ameliorate pain in the body. It was the appropriately named Dr J. Andrew Armour of the University of Montreal who, in conducting this research, discovered that the heart has its own ‘little brain’ or ‘intrinsic cardiac nervous system’, which causes the heart to send more signals to the brain than the brain communicates to the heart. This ‘little brain in the heart’ is composed of approximately 40,000 neurons resembling those in the brain, meaning that the heart’s ‘nervous system’ is capable of sensing, feeling, perceiving, learning, remembering and problem-solving. Dr Armour found that this circuitry of neurons could retain input information for a short while, suggesting that memory consists in a constantly oscillating series of cycles, like a sampling loop. In other words, the heart can feel, think, and decide of its own volition, independently of the brain, perhaps explaining, finally, why humans, in speaking of themselves, often place a hand over the heart, as if aware that this is where the centre of human being resides. Not only does the heart respond to the brain, but the brain continuously responds to the heart. The heart is not just a pump, but part of the mind of man. It is also the part that unites us more immediately and profoundly with other humans. The heartbeat of a human person creates electromagnetic waves that can be detected and measured six feet from the body — many times more powerful than those of the brain.
People who have received heart transplants find that their tastes — for food, music, poetry, writing, sport, exercise, travel — have radically changed, as though the heart contained the predispositions, tastes and memories of the donor. This intuition has been demonstrated in the research of Professor Gary Schwartz, working with Dr Paul Pearsall, comparing the testimonies of heart recipients with those of the donor families and finding strong correlations concerning things that could not have been known to the recipients. The incidence of concurrence was far too detailed to be coincidental. Professor Schwartz believes that the phenomenon has to do with the feedback loops in the neural pathways of the heart, which store information concerning the thoughts and memories of its human ‘user’ and pass these on after transplantation. When the transplanted heart is suffused with blood from the new ‘user’ it spontaneously begins beating again, reactivating the memory loops. Memory is not localised in any specific neuronal context, but is distributed throughout the system.
This discovery may provide ‘scientific’ elucidation for what Fr Giussani has postulated ‘religiously’. The heart of man may be the site of the ‘external presence’ by which the dualism Giussani identifies is achieved. The ‘I’, in the sense of being the core of ‘me’, is not self-contained, but seems to be the partner of something else. We might say that it is a receiver of signals from afar, except that such a construction seems dangerously to replicate the mechanistic thinking we would do well to avoid. Briefly pursuing that wonky line of reasoning, we might arrive at a clumsy comparison with, say, a TV set: the brain as merely the receiver of signals sent from elsewhere. But this, though somewhat illuminative, seduces us into the old trap of applying to what is unknowable crude understandings we have arrived at through our tentative fumblings with the majesty of reality.
Let us just say, then, that our sense of affairs-as-they-are endures: that at the heart of each of us, something remains unexplained. More than that: Something seems to exist that cannot be reduced, that cannot be understood according to the methods that seem to work for everything else, or at least for most other things that we address in our everyday lives. Perhaps we might agree that the heart, by definition, is designed in such a manner as to be unable to understand itself. The ‘religious’ description of this, though in positivistic terms implausible, seems to conform more persuasively to our intuited sense of things. We think by means of a dialogue with something or someone, but never think to ask ‘With whom?’ or ‘With what?’
And this ‘religious’ understanding of the human person is where the heart reveals itself in its true nature: the locus of that in man which cannot be reduced even by man’s craving for an explanation, a craving that itself may emerge in the heart, which seems to dramatise the paradox of man’s central dilemma, but also to offer the beginnings of a path of reasoning that is negotiable and reliable. The heart is the entity in which my humanity seems first to emerge, to start out, having rooted itself. So, if indeed a metaphor, it is one in which we discover the only bedrock that seems to be available. And yet, it too, when investigated closely with an ‘objective’ eye or ear, fails to yield up its source, its inner voice, the impulse that creates the pulse.
This, then, is the paradox: The ‘I’ that resides at the centre of each human being is not some autonomous, disconnected, individuated authority, but a kind of partnership between what is evident and something that seems not to be there. My ‘I’ is ‘me’, yes, but also something mysteriously other. ‘My’ desires, therefore, are not entirely ‘mine’ in the sense of relating to some straightforward correspondence between my needs and what I have discovered to be my immediate options. (Or at least not invariably.) Certain desires, observed in their most essential condition, seem to impel us above ourselves, even as they bring us crashing down into the banality of immediate ‘satisfaction’. My desires create in me a tussle between what I think I ‘want’ and what I discover I ‘need’. This paradox appears indeed to arise from the presence at the heart of me of some otherness, whose influence is both elevating and confusing. Perhaps this is the root of what we have come to regard as human irrationality, whereby the human person seems not to know what he wants, what is good for him, what to do with himself.
Let us hypothesise, then, that the heart, the font of the desire that follows me from the beyond whence I came, speaks to me every moment of what this ‘I’ of mine really seeks, really wants, really is.
Here, we might advisedly pause to avoid a short-circuit arising from the intervention in these ruminations of the word ‘religious’. This idea of the heart’s truest desires cannot — must not — be heard as a sentimental or moral injunction, implying a direction that is already decided: into the arms of some ‘god’, to sit ‘at the right hand of the Lord’, and so forth. For too long, religious discussion has fallen into this trap: begging the question, closing it down before it is even properly opened. Identifying the call of the heart is the starting-point of a journey, not the cue for repentance or unquestioning obedience, still less certainty as to where the journey takes me. If we leap from the idea of some great human desiring to an immediate conclusion rooted in an established cultural idea, we risk provoking a reduction as deadly as any of the deterministic or relativistic reductions that might incontrovertibly be identified as problematic. Instead, we take a deep breath and prepare for the real journey. The heart will inform us of the full scale and nature of our desire. The search must be thorough and total. It begins with the authentic question that beats out its insistent rhythm all the days of our lives.
Because positivistic logic sees religion as neither verifiable nor falsifiable, it was inevitable that religious logic would become separated from mainstream culture. Being purely ‘subjective’, religious understandings could not correspond to the requirements of reason as positivistically understood, and so have been ruled out of the investigation, consigned to the ‘irrational’ realm. But positivism has inflicted something far worse on our cultures: the idea that ‘objectivity’ implies, for the human person, a kind of out-of-body position before the world and himself. Christianity tells us that this is a falsehood, but this does not negate the possibility of a Christian ‘objectivity’.
In Christian terms, as Father Giussani outlined, human self-understanding derives from the nature of the ‘I’, which is understood as not merely a solitary subjectivity, but a self accompanied by a Presence. Subjectivity, then, is more than one: It is a dualistic relationship looking out on the world, a dialogue internal to the human person. The human mind is engaged in a constant conversation within itself, and this, applied to the stuff of experience, is the most formidable instrument of objective reasoning. Here we glimpse a point at which the subjective and objective mechanisms may coalesce and interact. True knowledge requires a dialogue, and subjective knowledge requires both experience and a relationship with another, out of which a process of objective judgement may arise.
The Christian understanding of this, as outlined by Giussani, resides in the acceptance that the mind is not ours, that the heart is not ours, that both are given, that both are projections not from within, because they cannot be — because there is no traceable source on the inside — but from somewhere else; that our mechanistic model of reality, derived from an objectified sense of rationality implicit in the manmade world, is incapable of providing total coherence if applied mechanistically to the human subject itself. Deceptively, though, this mechanistic perspective on humanity is capable of making partial sense, enabling humans to achieve a certain working ‘knowledge’ of themselves — but only to the extent that, if we acquiesce in behaving like the machines we have ‘created’, we may achieve an approximately functional theory of our own habitation of reality, and perhaps a simulacrum of coherence with regard to it — a ‘quale’ of meaning that seems to work, but only, on closer examination, tautologically.
Some scientists have sought some halfway house between the religious and scientific understandings. The philosopher/psychologist William James made a distinction between the ‘I’ — the subjective self — and the objective self, the ‘Me’, which is more or less, as regards the entity called John Waters, the being, or body, that might be observed sitting here writing this. The ‘I’, James reasoned, mediates and coordinates all the sensations, perceptions, experiences and memories, and thereby, as it were, ‘governs’ the human person. James rejected the idea of the self as illusion, but did not share the conventional religious idea that the self and the soul are coterminous. Instead he posited the idea of thought being ‘itself the thinker’, i.e. a supervisory form of Thought that edited all the minor thoughts emanating from the stream of consciousness of the mechanical intelligence, forging a definitive subjectivity out of the complex process of selection and rejection. I’m not sure I fully understand how this may be said to differ from the modern idea of the self as illusion. It too seems to depend on a tautology, which avoids the question of how Thought might develop a mechanism to initiate a process of thought. Surely something else must have generated the mechanism being called by the name ‘Thought’, for how else could it exist unless some entity first thought it into being? It is hard to see, then, how James could be credited here with anything more than a clever use of words. In truth, in constructing his elaborate ‘logical’ formula, he was missing the central ‘religious’ idea: that the human is more than either mechanical or mystical processes: it is a relationship that occurs in what appears to be a single entity or ‘being’.
What am I saying? That God — Christ — is within me? Not a new or radical idea! Why all the parsing and palaver? Because I am trying to demonstrate it — not ‘prove’ it (impossible) but create a line of reasoning that might enable it to be glimpsed as something resembling a fact — a stab at an observational model, with a little empiricism, a little rationalism and the merest sprinkling of faith.
To the open-minded layman, I believe, this ‘explanation’ will make a kind of sense — because it achieves far more coherence with the experience of being human than anything so far posited by neuroscience. But, as any adjacent scientist will probably tell you, this of itself proves nothing, because we know that human intuitions are disposed to be wrong.
That may be so. But it is also the case that there appears to be an insurmountable difficulty with achieving what many neuroscientists now say is imminent: the objectification of the human mind in order to demonstrate that the self is an illusion. To the layman, there seems to be an immediate, albeit possibly semantic difficulty: Who, conceivably, might hope to attain such an objective? Who, were it to be achieved, might be credited with it? How would scientists keep themselves out of the attempt to prove there is no self? If — just imagine — they one day claim to have succeeded, who might we credit with achieving the breakthrough, since the research itself would presumably have shown that there exists no subjective entity capable of looking into objective reality?
This might seem like a philosopher’s objection, even a vexatious one. But it is also shared by scientists. In A Skeptic’s Guide to the Mind: What Neuroscience Can and Cannot Tell Us About Ourselves, Robert A. Burton argues that the innate irrationality of humans has itself led neuroscientists astray in their attempts to understand the mind. ‘Any application of data to explain the mind will always be a personal vision, not a scientific fact’, he says. This is because, if we use an ‘irrational’ instrument to examine itself, the outcomes can never be regarded as reliable. And of course this also implies that the search must necessarily be impeded — even diverted — by the tendency of subjective enquiry to lead its protagonist to that which he already believes. The self — however constituted, but in any event already solidified into a particular way of seeing — stands as a shield against the truth as much as it offers an opportunity of grasping it.
To be fair, many scientists, when asked the question, agree that we are nowhere near resolving the issue, although they often go on to compare it to other problems which have arisen in the past and been resolved in time — like the way scientists used to be puzzled by what appeared to be subjective/objective blockages regarding questions of light and sound, or by the now discredited understanding of the élan vital (Bergson’s ‘vital impulse’, akin to Schopenhauer’s ‘will-to-live’), or by the question of how electrical activations in the brain could correspond to the process of thinking. Many scientists believe that the territory might at any time yield up a major breakthrough, but fewer are prepared to declare this an imminent possibility.
Nevertheless, the very fact that these explorations are happening, the fact that neuroscientists appear broadly to have in mind a single potential destination, the fact that many of them speak about these matters in the same blasé, supercilious way — all this may be having unquantifiable effects on our culture.
From the perspective of human understandings as currently culturally constituted, the project aiming at the deconstruction of the self is an ‘amoral’ endeavour which refuses to take responsibility for consequences, holding that knowledge alone is its own justification. This, of necessity, is a prerequisite of scientific endeavour: The advance of science itself must be kept separate from any concerns about negative consequences or misuse. But there are many potentially detrimental aspects of current trends and objectives in neuroscience which perhaps ought to give us pause for thought outside the austere crucible of scientific curiosity.
Already, for example, remote intellectual progeny of half-formed neuroscientific ideas are creeping into courtrooms, where they are used to defend otherwise indefensible actions on the basis of what in a certain light seems like a secularised, rationalised version of predestination. What is called ‘neurodeterminism’, for example, suggests that our behaviours are dictated not by an autonomous self but by an inherited neurobiology which renders us blameless for our actions. Taking the idea of the mechanistic human to its limit may in time lead to the elimination of current concepts of free will and render obsolete conventional ideas about crime and punishment. Hence, a murderer did what he did because his synapses told him to. This, of course, ignores innumerable factors that can definitively be stated: for example, that many other people may share a similar neurology with the murderer yet have never killed anyone. But the tendency and trend are both real.
The effects of highly tentative scientific ruminations are also creeping into the public square — in invisible ideologies which infect popular and political thinking, adding pep to the step of the relativist and the secular-atheist who wants an end to all the God nonsense once and for all. As with other great questions, popular understandings have tended to move ahead of the science, taking it more or less for granted that what scientists say is possible today will become proven fact by the day after tomorrow. And if it is going to be demonstrated in the future that the self does not exist, it follows, as b follows a, that the self cannot exist in the present. It is already debunked, if only in potential terms. But this potential is sufficient to start the process of changing our minds about our minds. At the informal level of everyday culture, we proceed as if the ultimate discovery had already been made.
This is an amazing and largely unremarked aspect of mass media society: that the collective intelligence of human civilisation may increasingly be formed by ideologically mutated interpretations of not merely scientific developments but actually of mooted scientific objectives, even those which seem to have no imminent prospect of confirmation. The very process of scientific searching — or, rather, the ideology of this searching process — implies that a certain conclusion is potentially possible, and this appears to give automatic and unquestionable credibility to the assumption that this conclusion will ipso facto be reached. Again, we take out a mortgage on the certainty of the breakthrough and gift it to ourselves right away.
If there is no single inner self — if it is a trick of linguistic function or some mechanical quirk of the mutual firing of the different brain hemispheres — then almost everything we take for granted about ourselves as beings in a society of beings is potentially bogus — not merely ethics, but also conscience, responsibility, conviction, judgment, even opinion. If there are only mechanical processes held together by a trick called memory or a sleight-of-words, then what is the point of continuing to treat human beings as we do? If subjectivity is illusion, what we call ‘existence’ is illusion. If there is no one there to have a continuity of experience, what is the point of talking of ‘memory’ or ‘nostalgia’? If there is no self, how can there be individual free will, and how then can punishment be just? What, for that matter, might ‘justice’ amount to if we are all merely joined-up series of dissociated experiences passing ourselves off as intelligent beings? These used to be abstract questions, but they are becoming increasingly real in the age of AI. Such forms of human society as may emerge from the mooted future understandings will perhaps continue to have laws, but how will they contrive to have conviction about upholding them? Who could believe in them? Indeed, who will, in any meaningful sense, exist to claim to believe in them? And what of sanctions? Why punish the body or mind or even soul of a man who exists only by way of being the repository of dissociated memories of brain activity, which give him the illusion of continuity in what is misnamed as his ‘life’?
But these may be but the beginning of the implications for human understandings and human society as currently experienced and elucidated. The implications are, yes, unthinkable. How, for example, will future men and women listen to a piece of music once thought ‘beautiful’ and be able to transcend the knowledge that their response to it is merely, as the case may be, the opening and closing of synapses, the whoosh of chemicals or the residual sway of inherited memes? How will a teenager, cast adrift in some future state of desiring, be able to speak of love without a similar awareness creeping into the encounter? How will we reach out to embrace one of our children in the knowledge that, before the moment of contact, we may be overcome by the knowledge of our and their machine-like condition? Who embraces? Who is embraced? For what purpose? Will it be love or simply mutual consolation in the face of hopelessness?
Were we finally, definitively, to receive confirmation that in fact our selves are mere illusions, could we continue to ‘be’ ourselves in the way we are now, believing ourselves to be actual presences at the cores of our beings? It seems clear that, in such a future dispensation, the idea of myself as a subjective presence in reality will seem to me — whatever ‘me’ means — a self-deception. But how can it be a self-deception if there is no self to be deceived? I cannot imagine such a thing right now, but this present failure to imagine it will not protect me from its consequences when it happens. Nothing will ever be the same, because nothing will ever again seem quite the same. We may imagine now that we will be able to continue in our illusory bubbles, reserving our knowledge of ‘truth’ for the science lab and the biology class, while continuing with inherited, crude, scientifically countermanded notions of the way we are constituted, because they are useful, functional metaphors of our functioning; but this seems a vain hope.
It will be as if an illness has entered the very core of whatever man continues to think of as himself. Will I be able to utter the word ‘I’ or ‘me’ without an escalating sense of irony? Will I be able to avoid becoming separated from myself — in a sense undergoing a permanent out-of-body experience by which I see myself as some dissociated robot which I inhabit without existing — more or less as a spectator on a life that is only nominally mine? Will I manage to engage in even the most everyday human activity while eliding the sense that it is not ‘I’ who does this, that there is no ‘I’, that what appears to be ‘me’ is simply a trick-of-the-chemistry, a series of sensations which are felt by something that feels like a ‘me’ but isn’t really?
It seems clear, then, that, if all of this unfolds as we are told it will, the only coherent logic for dealing with the new situation of humankind would be merely to enjoy the ‘trip’, to see life as simply a random series of experiences without real consequences, a kind of hallucinogenic trip in which there is no core humanity to be fixed or taken higher. We cannot predict where this might lead, but we can anticipate that it would be someplace dark and pointless, characterised by increasing ruthlessness, vindictiveness and desperation — life as in a straitjacket, going nowhere and having no order or coherence.
Will we, knowing all this, be able to behave as if we don’t? Will we be able to rest on our victory laurels in having unpicked the mystery of ourselves, and still continue with a version of ourselves which, though useful, familiar and reassuring, stands in contradiction to the ‘facts’ of science as we ‘know’ them? It seems inconceivable. Will we, for a time, hold the two ideas together simultaneously, in a kind of paradoxical embrace, one moment believing ourselves to be as we have been until — more or less — the present, beginning-of-the-third-millennium moment, then remembering what we are ‘really like’? Will this period of paradoxical embrace come gradually, osmotically to an end, leading to a time in which we will understand ourselves (as it were) and each other only in the new ‘realistic’ way?
By seeking to go beneath what we recognise as the irreducible self — by splitting the atom of the ‘I’ — man is really delivering himself over to a condition that might in a certain sense be compared to the compulsion of the anorexic. He seeks to reduce himself, or at least pursues such a reduction for ‘scientific’ reasons, but has no benchmark to work to. No matter how far he goes, he thinks he ought to — and can — go further. If he ‘succeeds’, the success will really be a kind of dissolution of his own self, so the searching might aptly be depicted as a kind of metaphysical anorexia. Because we cannot settle on a clear cultural understanding of ourselves, we shrink and disintegrate our humanity in search of a new ideal. But, since no ideal is possible, the human simply contracts and shrivels up before the relentless scrutiny of the positivistic tyranny. In the end, because this is what we have ordained, there may indeed be nothing at the core of man but a void.
Such altered thinking plays into the latest and escalating scientistic drift: the impetus towards the normalisation of transhumanism, baited and incentivised with promises of eternal life as half-machine and the possibility of eliminating ageing as a cause of death. ‘Human Augmentation’ is one of the names given to this new ‘inevitability’ — augmentation of the human mind and body, that is — the absorption of man into machine. This ‘promise’ includes the possibility of new generations of prosthetics, implants, memory chips, and intelligence boosters, ‘improving’ on the raw material of human being. Finally, man becomes God.
Is there any possibility of an alternative direction? Yes, but probably not at first via the path once proudly titled the ‘religious’. It seems that, as with pretty much everything else, we must proceed down the wrong fork for longer than is good for us. The hope is that there may emerge in time a model of science that will confound humanity by taking us back to the tentative, contingent understandings once formulated under the heading ‘Christianity’ and whispering the currently unthinkable proposition that these may have been far more reasonable than we allowed ourselves to believe. An essential empiricism will return the game of discovery to the first square, enabling us to start over in the knowledge that the wrong kind of ‘knowing’ will lead to self-destruction.
The subject of the thinking heart of man is not akin to that of the earth’s once alleged flatness, or the fallacy of the sun orbiting the earth, or the now allegedly discredited notion of the élan vital. To take the wrong fork here leads to Hell. This, you might say, is the final frontier — the moment, the choice, that may lead to the ultimate, and fatal, reduction of humanity. Were man to think himself into this ‘achievement’, he would in effect be committing suicide using his own mind as the weapon, a weapon that would disappear the moment the act of self-murder had been effected. Is this a contradiction? I hope so. I hope it remains a contradiction for as long as I live, and for a long time afterwards, until all those I love have lived their lives and moved on. It seems to me to represent, at any number of levels, an assault on reason as I have known it, and as it has appeared to conform to what I have seen and learned about the world I’ve inhabited for six-and-a-half decades.
Yet, there is hope that, as science moves beyond its present positivistic phase — as it has the potential to do — a change may enter in. Even for a layman, to contemplate the vistas now being opened up in the area of quantum mechanics is to be brought face-to-face with the possibility that, the further science moves away from positivism, the closer it gets in a new way to ideas once developed from the purview of religion — like the soul, the irreducible eternal self, even the concept of an afterlife, albeit ‘scientifically’ comprehended and expressed.
One neuroscientist who takes the quantum view of neuroscience is Stuart Hameroff, who describes human consciousness as dancing ‘on the edge between the quantum world and the classical world’. The more we are influenced and in touch with what he calls ‘the quantum subconscious world of enlightenment’ the happier we can be. He adds, in an interview with Sue Blackmore: ‘When the quantum coherence in the microtubules is lost, as in cardiac arrest or death, the Planck scale quantum information in our heads dissipates, leaks out to the Planck scale in the universe as a whole. The quantum information which had comprised our conscious and unconscious minds during life doesn't completely dissipate, but hangs together because of quantum entanglement. Because it stays in quantum superposition and doesn't undergo quantum state reduction or collapse, it's more like our subconscious mind, like our dreams. And because the universe at the Planck scale is non-local, it exists holographically, indefinitely. Is this the soul? Why not?’
You may read this, if not as gobbledygook, then perhaps as metaphor, or simply as the speculation in tentative sentences of a scientist who shrinks from speaking of transcendence in the old sense. It may be a scientist expressing in his own language what we apprehend as a new metaphor that resonates with old ones. Or it may be, somewhere about its sub-clauses, a paragraph of mumbo-jumbo designed, consciously or otherwise, to render ‘scientific’ what the human heart started to intuit from the beginning but called by different names.
In the final part of this series, I shall investigate whether scientific understandings can be reliable in a world where they are mediated via channels dominated by ideology, ignorance, ambition, rivalry and money, and couched in words whose nature is not as we imagine it to be.