Atheism has been Part of Many Asian Traditions for Millennia


Atheism is not a modern concept.
Zoe Margolis, CC BY-NC-ND

Signe Cohen, University of Missouri-Columbia

A group of atheists and secularists recently gathered in Southern California to talk about social and political issues. This was the first of three summits planned by the Secular Coalition for America, an advocacy group based in Washington D.C.

To many, atheism – the lack of belief in a personal god or gods – may appear an entirely modern concept. After all, religious traditions seem to have dominated the world since the beginning of recorded history.

As a scholar of Asian religions, however, I’m often struck by the prevalence of atheism and agnosticism – the view that it is impossible to know whether a god exists – in ancient Asian texts. Atheistic traditions have played a significant part in Asian cultures for millennia.

Atheism in Buddhism, Jainism

Buddhists do not believe in a creator God.
Keith Cuddeback, CC BY-NC-ND

While Buddhism is a tradition focused on spiritual liberation, it is not a theistic religion.

The Buddha himself rejected the idea of a creator god, and Buddhist philosophers have even argued that belief in an eternal god is nothing but a distraction for humans seeking enlightenment.

While Buddhism does not argue that gods don’t exist, gods are seen as completely irrelevant to those who strive for enlightenment.

Jains do not believe in a divine creator.
Gandalf’s Gallery, CC BY-NC-SA

A similar form of functional atheism can also be found in the ancient Asian religion of Jainism, a tradition that emphasizes non-violence toward all living beings, non-attachment to worldly possessions and ascetic practice. While Jains believe in an eternal soul, or jiva, that can be reborn, they do not believe in a divine creator.

According to Jainism, the universe is eternal, and while gods may exist, they too must be reborn, just like humans are. The gods play no role in spiritual liberation and enlightenment; humans must find their own path to enlightenment with the help of wise human teachers.

Other Atheistic Philosophies

Around the same time that Buddhism and Jainism arose in the sixth century B.C., there was also an explicitly atheistic school of thought in India called the Carvaka school. Although none of its original texts have survived, Buddhist and Hindu authors describe the Carvakas as firm atheists who believed that nothing existed beyond the material world.

To the Carvakas, there was no life after death, no soul apart from the body, no gods and no world other than this one.

Another school of thought, Ajivika, which flourished around the same time, similarly argued that gods didn’t exist, although its followers did believe in a soul and in rebirth.

The Ajivikas claimed that the fate of the soul was determined by destiny alone, not by a god or even by free will. They taught that everything was made up of atoms, but that these atoms moved and combined with one another in predestined ways.

Like the Carvaka school, the Ajivika school is today only known from texts composed by Hindus, Buddhists and Jains. It is therefore difficult to determine exactly what the Ajivikas themselves thought.

According to Buddhist texts, the Ajivikas argued that there was no distinction between good and evil and there was no such thing as sin. The school may have existed around the same time as early Buddhism, in the fifth century B.C.

Atheism in Hinduism

There are many gods in Hinduism, but there are also atheistic beliefs.
Religious Studies Unisa, CC BY-SA

While the Hindu tradition of India embraces the belief in many gods and goddesses – 330 million of them, according to some sources – there are also atheistic strands of thought found within Hinduism.

The Samkhya school of Hindu philosophy is one such example. It believes that humans can achieve liberation for themselves by freeing their own spirit from the realm of matter.

Another example is the Mimamsa school, which also rejects the idea of a creator God. The Mimamsa philosopher Kumarila asked how, if a god had created the world by himself in the beginning, anyone else could possibly confirm it. Kumarila further argued that if a merciful god had created the world, it could not have been as full of suffering as it is.

According to the 2011 census, there were approximately 2.9 million atheists in India. Atheism is still a significant cultural force in India, as well as in other Asian countries influenced by Indian religions.

Signe Cohen, Associate Professor and Department Chair, University of Missouri-Columbia

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Muhammad: an Anticlerical Hero of the European Enlightenment


Thomas Jefferson’s copy of George Sale’s 1734 translation of the Quran is used in the swearing in ceremony of US Representative Keith Ellison at the United States Capitol in Washington, DC, on 4 January 2007. Photo by Getty

John Tolan | Aeon Ideas

Publishing the Quran and making it available in translation was a dangerous enterprise in the 16th century, apt to confuse or seduce the faithful Christian. This, at least, was the opinion of the Protestant city councillors of Basel in 1542, when they briefly jailed a local printer for planning to publish a Latin translation of the Muslim holy book. The Protestant reformer Martin Luther intervened to salvage the project: there was no better way to combat the Turk, he wrote, than to expose the ‘lies of Muhammad’ for all to see.

The resulting publication in 1543 made the Quran available to European intellectuals, most of whom studied it in order to better understand and combat Islam. There were others, however, who used their reading of the Quran to question Christian doctrine. The Catalonian polymath and theologian Michael Servetus found numerous Quranic arguments to employ in his anti-Trinitarian tract, Christianismi Restitutio (1553), in which he called Muhammad a true reformer who preached a return to the pure monotheism that Christian theologians had corrupted by inventing the perverse and irrational doctrine of the Trinity. After publishing these heretical ideas, Servetus was condemned by the Catholic Inquisition in Vienne, and finally burned with his own books in Calvin’s Geneva.

During the European Enlightenment, a number of authors presented Muhammad in a similar vein, as an anticlerical hero; some saw Islam as a pure form of monotheism close to philosophic Deism and the Quran as a rational paean to the Creator. In 1734, George Sale published a new English translation. In his introduction, he traced the early history of Islam and idealised the Prophet as an iconoclastic, anticlerical reformer who had banished the ‘superstitious’ beliefs and practices of early Christians – the cult of the saints, holy relics – and quashed the power of a corrupt and avaricious clergy.

Sale’s translation of the Quran was widely read and appreciated in England: for many of his readers, Muhammad had become a symbol of anticlerical republicanism. It was influential outside England too. The US founding father Thomas Jefferson bought a copy from a bookseller in Williamsburg, Virginia, in 1765, which helped him conceive of a philosophical deism that surpassed confessional boundaries. (Jefferson’s copy, now in the Library of Congress, has been used for the swearing in of Muslim representatives to Congress, starting with Keith Ellison in 2007.) And in Germany, the Romantic Johann Wolfgang von Goethe read a translation of Sale’s version, which helped to colour his evolving notion of Muhammad as an inspired poet and archetypal prophet.

In France, Voltaire also cited Sale’s translation with admiration: in his world history Essai sur les mœurs et l’esprit des nations (1756), he portrayed Muhammad as an inspired reformer who abolished superstitious practices and eradicated the power of corrupt clergy. By the end of the century, the English Whig Edward Gibbon (an avid reader of both Sale and Voltaire) presented the Prophet in glowing terms in The History of the Decline and Fall of the Roman Empire (1776-89):

The creed of Mahomet is free from suspicion or ambiguity; and the Koran is a glorious testimony to the unity of God. The prophet of Mecca rejected the worship of idols and men, of stars and planets, on the rational principle that whatever rises must set, that whatever is born must die, that whatever is corruptible must decay and perish. In the author of the universe, his rational enthusiasm confessed and adored an infinite and eternal being, without form or place, without issue or similitude, present to our most secret thoughts, existing by the necessity of his own nature, and deriving from himself all moral and intellectual perfection … A philosophic theist might subscribe the popular creed of the Mahometans: a creed too sublime, perhaps, for our present faculties.

But it was Napoleon Bonaparte who took the Prophet most keenly to heart, styling himself a ‘new Muhammad’ after reading the French translation of the Quran that Claude-Étienne Savary produced in 1783. Savary wrote his translation in Egypt: there, surrounded by the music of the Arabic language, he sought to render into French the beauty of the Arabic text. Like Sale, Savary wrote a long introduction presenting Muhammad as a ‘great’ and ‘extraordinary’ man, a ‘genius’ on the battlefield, a man who knew how to inspire loyalty among his followers. Napoleon read this translation on the ship that took him to Egypt in 1798. Inspired by Savary’s portrait of the Prophet as a brilliant general and sage lawgiver, Napoleon sought to become a new Muhammad, and hoped that Cairo’s ulama (scholars) would accept him and his French soldiers as friends of Islam, come to liberate Egyptians from Ottoman tyranny. He even claimed that his own arrival in Egypt had been announced in the Quran.

Napoleon had an idealised, bookish, Enlightenment vision of Islam as pure monotheism: indeed, the failure of his Egyptian expedition owed partly to his idea of Islam being quite different from the religion of Cairo’s ulama. Yet Napoleon was not alone in seeing himself as a new Muhammad: Goethe enthusiastically proclaimed that the emperor was the ‘Mahomet der Welt’ (Muhammad of the world), and the French author Victor Hugo portrayed him as a ‘Mahomet d’occident’ (Muhammad of the West). Napoleon himself, at the end of his life, exiled on Saint Helena and ruminating on his defeat, wrote about Muhammad and defended his legacy as a ‘great man who changed the course of history’. Napoleon’s Muhammad, conqueror and lawgiver, persuasive and charismatic, resembles Napoleon himself – but a Napoleon who was more successful, and certainly never exiled to a cold windswept island in the South Atlantic.

The idea of Muhammad as one of the world’s great legislators persisted into the 20th century. Adolph A Weinman, a German-born American sculptor, depicted Muhammad in his 1935 frieze in the main chamber of the US Supreme Court, where the Prophet takes his place among 18 lawgivers. Various European Christians called on their churches to recognise Muhammad’s special role as prophet of the Muslims. For Catholic scholars of Islam such as Louis Massignon or Hans Küng, or for the Scottish Protestant scholar of Islam William Montgomery Watt, such recognition was the best way to promote peaceful, constructive dialogue between Christians and Muslims.

This kind of dialogue continues today, but it has been largely drowned out by the din of conflict, as extreme-Right politicians in Europe and elsewhere diabolise Muhammad to justify anti-Muslim policies. The Dutch politician Geert Wilders calls him a terrorist, paedophile and psychopath. The negative image of the Prophet is paradoxically promoted by fundamentalist Muslims who adulate him and reject all historical contextualisation of his life and teachings; meanwhile, violent extremists claim to defend Islam and its prophet from ‘insults’ through murder and terror. All the more reason, then, to step back and examine the diverse and often surprising Western portraits of the myriad faces of Muhammad.

Faces of Muhammad: Western Perceptions of the Prophet of Islam from the Middle Ages to Today by John Tolan is published via Princeton University Press.

John Tolan

This article was originally published at Aeon and has been republished under Creative Commons. Read the original article here.

Ibn Tufayl and the Story of the Feral Child of Philosophy


Album folio fragment with scholar in a garden. Attributed to Muhammad Ali 1610-15. Courtesy Museum of Fine Arts, Boston

Marwa Elshakry & Murad Idris | Aeon Ideas

Ibn Tufayl, a 12th-century Andalusian, fashioned the feral child in philosophy. His story Hayy ibn Yaqzan is the tale of a child raised by a doe on an unnamed Indian Ocean island. Hayy ibn Yaqzan (literally ‘Living Son of Awakeness’) reaches a state of perfect, ecstatic understanding of the world. A meditation on the possibilities (and pitfalls) of the quest for the good life, Hayy offers not one, but two ‘utopias’: a eutopia (εὖ ‘good’, τόπος ‘place’) of the mind in perfect isolation, and an ethical community under the rule of law. Each has a version of human happiness. Ibn Tufayl pits them against each other, but each unfolds ‘no where’ (οὐ ‘not’, τόπος ‘place’) in the world.

Ibn Tufayl begins with a vision of humanity isolated from society and politics. (Modern European political theorists who employed this literary device called it ‘the state of nature’.) He introduces Hayy by speculating about his origin. Whether Hayy was placed in a basket by his mother to sail through the waters of life (like Moses) or born by spontaneous generation on the island is irrelevant, Ibn Tufayl says. His divine station remains the same, as does much of his life, spent in the company only of animals. Later philosophers held that society elevates humanity from its natural animal state to an advanced, civilised one. Ibn Tufayl took a different view. He maintained that humans can be perfected only outside society, through a progress of the soul, not the species.

In contrast to Thomas Hobbes’s view that ‘man is a wolf to man’, Hayy’s island has no wolves. It proves easy enough for him to fend off other creatures by waving sticks at them or donning terrifying costumes of hides and feathers. For Hobbes, the fear of violent death is the origin of the social contract and the apologia for the state; but Hayy’s first encounter with fear of death is when his doe-mother dies. Desperate to revive her, Hayy dissects her heart only to find one of its chambers is empty. The coroner-turned-theologian concludes that what he loved in his mother no longer resides in her body. Death therefore was the first lesson of metaphysics, not politics.

Hayy then observes the island’s plants and animals. He meditates upon the idea of an elemental, ‘vital spirit’ upon discovering fire. Pondering the plurality of matter leads him to conclude that it must originate from a singular, non-corporeal source or First Cause. He notes the perfect motion of the celestial spheres and begins a series of ascetic exercises (such as spinning until dizzy) to emulate this hidden, universal order. By the age of 50, he retreats from the physical world, meditating in his cave until, finally, he attains a state of ecstatic illumination. Reason, for Ibn Tufayl, is thus no absolute guide to Truth.

The difference between Hayy’s ecstatic journeys of the mind and later rationalist political thought is the role of reason. Yet many later modern European commentaries or translations of Hayy confuse this by framing the allegory in terms of reason. In 1671, Edward Pococke entitled his Latin translation The Self-Taught Philosopher: In Which It Is Demonstrated How Human Reason Can Ascend from Contemplation of the Inferior to Knowledge of the Superior. In 1708, Simon Ockley’s English translation was The Improvement of Human Reason, and it too emphasised reason’s capacity to attain ‘knowledge of God’. For Ibn Tufayl, however, true knowledge of God and the world – as a eutopia for the ‘mind’ (or soul) – could come only through perfect contemplative intuition, not absolute rational thought.

This is Ibn Tufayl’s first utopia: an uninhabited island where a feral philosopher retreats to a cave to reach ecstasy through contemplation and withdrawal from the world. Friedrich Nietzsche’s Zarathustra would be impressed: ‘Flee, my friend, into your solitude!’

The rest of the allegory introduces the problem of communal life and a second utopia. After Hayy achieves his perfect condition, an ascetic is shipwrecked on his island. Hayy is surprised to discover another being who so resembles him. Curiosity leads him to befriend the wanderer, Absal. Absal teaches Hayy language, and describes the mores of his own island’s law-abiding people. The two men determine that the islanders’ religion is a lesser version of the Truth that Hayy discovered, shrouded in symbols and parables. Hayy is driven by compassion to teach them the Truth. They travel to Absal’s home.

The encounter is disastrous. Absal’s islanders feel compelled by their ethical principles of hospitality towards foreigners, friendship with Absal, and association with all people to welcome Hayy. But soon Hayy’s constant attempts to preach irritate them. Hayy realises that they are incapable of understanding. They are driven by satisfactions of the body, not the mind. There can be no perfect society because not everyone can achieve a state of perfection in their soul. Illumination is possible only for the select, in accordance with a sacred order, or a hieros archein. (This hierarchy of being and knowing is a fundamental message of neo-Platonism.) Hayy concludes that persuading people away from their ‘natural’ stations would only corrupt them further. The laws that the ‘masses’ venerate, be they revealed or reasoned, he decides, are their only chance to achieve a good life.

The islanders’ ideals – lawfulness, hospitality, friendship, association – might seem reasonable, but these too exist ‘no where’ in the world. Hence their dilemma: either they adhere to these and endure Hayy’s criticisms, or violate them by shunning him. This is a radical critique of the law and its ethical principles: they are normatively necessary for social life yet inherently contradictory and impossible. It’s a sly reproach of political life, one whose bite endures. Like the islanders, we follow principles that can undermine themselves. To be hospitable, we must be open to the stranger who violates hospitality. To be democratic, we must include those who are antidemocratic. To be worldly, our encounters with other people must be opportunities to learn from them, not just about them.

In the end, Hayy returns to his island with Absal, where they enjoy a life of ecstatic contemplation unto death. They abandon the search for a perfect society of laws. Their eutopia is the quest of the mind left unto itself, beyond the imperfections of language, law and ethics – perhaps beyond even life itself.

The islanders offer a less obvious lesson: our ideals and principles undermine themselves, but this is itself necessary for political life. For an island of pure ethics and law is an impossible utopia. Perhaps, like Ibn Tufayl, all we can say on the search for happiness is (quoting Al-Ghazali):

It was – what it was is harder to say.
Think the best, but don’t make me describe it away.

After all, we don’t know what happened to Hayy and Absal after their deaths – or to the islanders after they left.

Marwa Elshakry & Murad Idris

This article was originally published at Aeon and has been republished under Creative Commons. Read the original article here.

Why the Demoniac Stayed in his Comfortable Corner of Hell


Detail from The Drunkard (1912) by Marc Chagall. Courtesy Wikipedia

John Kaag | Aeon Ideas

I am not what one might call a religious man. I went to church, and then to confirmation class, under duress. My mother, whom I secretly regarded as more powerful than God, insisted that I go. So I went. Her insistence, however, had the unintended consequence of introducing me to a pastor whom I came to despise. So I eventually quit.

There were many problems with this pastor but the one that bothered me the most was his refusal to explain a story from the New Testament that I found especially hard to believe: the story of the demoniac.

This story from Mark 5:1-20 relates how Jesus and the disciples go to the town of Gerasenes and there encounter a man who is possessed by evil spirits. This demoniac – a self-imposed outcast from society – lived at the outskirts of town and ‘night and day among the tombs and in the hills he would cry out and cut himself with stones’. The grossest part of the story, however, isn’t the self-mutilation. It’s the demoniac’s insane refusal to accept help. When Jesus approached him, the demoniac threw himself to the ground and wailed: ‘What do you want with me? … In God’s name, don’t torture me!’ When you’re possessed by evil spirits, the worst thing in the world is to be healed. In short, the demoniac tells Jesus to bugger off, to leave him and his sharp little stones in his comfortable corner of hell.

When I first read about the demoniac, I was admittedly scared, but I eventually convinced myself that the parable was a manipulative attempt to persuade unbelievers such as me to find religion. And I wasn’t buying it. But when I entered university, went into philosophy, and began to cultivate an agnosticism that one might call atheism, I discovered that many a philosopher had been drawn to this scary story. So I took a second look.

The Danish philosopher Søren Kierkegaard, who spent years analysing the psychological and ethical dimensions of the demoniac, tells us that being demonic is more common than we might like to admit. He points out that when Jesus heals the possessed man, the spirits are exorcised en masse, flying out together as ‘the Legion’ – a vast army of evil forces. There are more than enough little demons to go around, and this explains why they come to roost in some rather mundane places. In Kierkegaard’s words: ‘One may hear the drunkard say: “Let me be the filth that I am.”’ Or, leave me alone with my bottle and let me ruin my life, thank you very much. I heard this first from my father, and then from an increasing number of close friends, and most recently from a voice that occasionally keeps me up at night when everyone else is asleep.

Those who are the most pointedly afflicted are often precisely those who are least able to recognise their affliction, or to save themselves. And those with the resources to rescue themselves are usually already saved. As Kierkegaard suggests, the virtue of sobriety makes perfect sense to one who is already sober. Eating well is second nature to the one who is already healthy; saving money is a no-brainer for one who is already rich; truth-telling is the good habit of one who is already honest. But for those in the grips of crisis or sin, getting out usually doesn’t make much sense.

Sharp stones can take a variety of forms.

In The Concept of Anxiety (1844), Kierkegaard tells us that the ‘essential nature of [the demoniac] is anxiety about the good’. I’ve been ‘anxious’ about many things – about exams, about spiders, about going to sleep – but Kierkegaard explains that the feeling I have about these nasty things isn’t anxiety at all. It’s fear. Anxiety, on the other hand, has no particular object. It is the sense of uneasiness that one has at the edge of a cliff, or climbing a ladder, or thinking about the prospects of a completely open future – it isn’t fear per se, but the feeling that we get when faced with possibility. It’s the unsettling feeling of freedom. Yes, freedom, that most precious of modern watchwords, is deeply unsettling.

What does this have to do with our demoniac? Everything. Kierkegaard explains that the demoniac reflects ‘an unfreedom that wants to close itself off’; when confronted with the possibility of being healed, he wants nothing to do with it. The free life that Jesus offers is, for the demoniac, pure torture. I’ve often thought that this is the fate of the characters in Jean-Paul Sartre’s play No Exit (1944): they are always free to leave, but leaving seems beyond impossible.

Yet Jesus manages to save the demoniac. And I wanted my pastor to tell me how. At the time, I chalked up most of the miracles from the Bible as exaggeration, or interpretation, or poetic licence. But the healing of the demoniac – unlike the bread and fish and resurrection – seemed really quite fantastic. So how did Jesus do it? I didn’t get a particularly good answer from my pastor, so I left the Church. And never came back.

Today, I still want to know.

I’m not here to explain the salvation of the demoniac. I’m here only to observe, as carefully as I can, that this demonic situation is a problem. Indeed, I suspect it is the problem for many, many readers. The demoniac reflects what theologians call the ‘religious paradox’, namely that it is impossible for fallen human beings – such craven creatures – to bootstrap themselves to heaven. Any redemptive resources at our disposal are probably exactly as botched as we are.

There are many ways to distract ourselves from this paradox – and we are very good at manufacturing them: movies and alcohol and Facebook and all the fixations and obsessions of modern life. But at the end of the day, these are pitifully little comfort.

So this year, as New Year’s Day recedes from memory and the winter darkness remains, I am making a resolution: I will try not to take all the usual escapes. Instead, I will try to simply sit with the plight of the demoniac, to ‘stew in it’ as my mother used to say, for a minute or two more. In his essay ‘Self-will’ (1919), the German author Hermann Hesse put it thus: ‘If you and you … are in pain, if you are sick in body or soul, if you are afraid and have a foreboding of danger – why not, if only to amuse yourselves … try to put the question in another way? Why not ask whether the source of your pain might not be you yourselves?’ I will not reach for my familiar demonic stones, blood-spattered yet comforting. I will ask why I need them in the first place. When I do this, and attempt to come to terms with the demoniac’s underlying suffering, I might notice that it is not unique to me.

When I do, when I let go of the things that I think are going to ease my suffering, I might have the chance to notice that I am not alone in my anxiety. And maybe this is recompense enough. Maybe this is freedom and the best that I can hope for.

John Kaag

This article was originally published at Aeon and has been republished under Creative Commons. Read the original article here.

Modern Technology is akin to the Metaphysics of Vedanta


Akhandadhi Das | Aeon Ideas

You might think that digital technologies, often considered a product of ‘the West’, would hasten the divergence of Eastern and Western philosophies. But within the study of Vedanta, an ancient Indian school of thought, I see the opposite effect at work. Thanks to our growing familiarity with computing, virtual reality (VR) and artificial intelligence (AI), ‘modern’ societies are now better placed than ever to grasp the insights of this tradition.

Vedanta summarises the metaphysics of the Upanishads, a clutch of Sanskrit religious texts, likely written between 800 and 500 BCE. They form the basis for the many philosophical, spiritual and mystical traditions of the Indian sub-continent. The Upanishads were also a source of inspiration for some modern scientists, including Albert Einstein, Erwin Schrödinger and Werner Heisenberg, as they struggled to comprehend the quantum physics of the 20th century.

The Vedantic quest for understanding begins from what it considers the logical starting point: our own consciousness. How can we trust conclusions about what we observe and analyse unless we understand what is doing the observation and analysis? The progress of AI, neural nets and deep learning has inclined some modern observers to claim that the human mind is merely an intricate organic processing machine – and consciousness, if it exists at all, might simply be a property that emerges from information complexity. However, this view fails to explain intractable issues such as the subjective self and our experience of qualia, those aspects of mental content such as ‘redness’ or ‘sweetness’ that we experience during conscious awareness. Figuring out how matter can produce phenomenal consciousness remains the so-called ‘hard problem’.

Vedanta offers a model to integrate subjective consciousness and the information-processing systems of our body and brains. Its theory separates the brain and the senses from the mind. But it also distinguishes the mind from the function of consciousness, which it defines as the ability to experience mental output. We’re familiar with this notion from our digital devices. A camera, microphone or other sensors linked to a computer gather information about the world, and convert the various forms of physical energy – light waves, air pressure-waves and so forth – into digital data, just as our bodily senses do. The central processing unit processes this data and produces relevant outputs. The same is true of our brain. In both contexts, there seems to be little scope for subjective experience to play a role within these mechanisms.

While computers can handle all sorts of processing without our help, we furnish them with a screen as an interface between the machine and ourselves. Similarly, Vedanta postulates that the conscious entity – something it terms the atma – is the observer of the output of the mind. The atma possesses, and is said to be composed of, the fundamental property of consciousness. The concept is explored in many of the meditative practices of Eastern traditions.

You might think of the atma like this. Imagine you’re watching a film in the cinema. It’s a thriller, and you’re anxious about the lead character, trapped in a room. Suddenly, the door in the movie crashes open and there stands… You jump, as if startled. But what is the real threat to you, other than maybe spilling your popcorn? By suspending an awareness of your body in the cinema, and identifying with the character on the screen, we are allowing our emotional state to be manipulated. Vedanta suggests that the atma, the conscious self, identifies with the physical world in a similar fashion.

This idea can also be explored in the all-consuming realm of VR. On entering a game, we might be asked to choose our character or avatar – originally a Sanskrit word, aptly enough, meaning ‘one who descends from a higher dimension’. In older texts, the term often refers to divine incarnations. However, the etymology suits the gamer, as he or she chooses to descend from ‘normal’ reality and enter the VR world. Having specified our avatar’s gender, bodily features, attributes and skills, next we learn how to control its limbs and tools. Soon, our awareness diverts from our physical self to the VR capabilities of the avatar.

In Vedanta psychology, this is akin to the atma adopting the psychological persona-self it calls the ahankara, or the ‘pseudo-ego’. Instead of a detached conscious observer, we choose to define ourselves in terms of our social connections and the physical characteristics of the body. Thus, I come to believe in myself with reference to my gender, race, size, age and so forth, along with the roles and responsibilities of family, work and community. Conditioned by such identification, I indulge in the relevant emotions – some happy, some challenging or distressing – produced by the circumstances I witness myself undergoing.

Within a VR game, our avatar represents a pale imitation of our actual self and its entanglements. In our interactions with the avatar-selves of others, we might reveal little about our true personality or feelings, and know correspondingly little about others’. Indeed, encounters among avatars – particularly when competitive or combative – are often vitriolic, seemingly unrestrained by concern for the feelings of the people behind the avatars. Connections made through online gaming aren’t a substitute for other relationships. Rather, as researchers at Johns Hopkins University have noted, gamers with strong real-world social lives are less likely to fall prey to gaming addiction and depression.

These observations mirror the Vedantic claim that our ability to form meaningful relationships is diminished by absorption in the ahankara, the pseudo-ego. The more I regard myself as a physical entity requiring various forms of sensual gratification, the more likely I am to objectify those who can satisfy my desires, and to forge relationships based on mutual selfishness. But Vedanta suggests that love should emanate from the deepest part of the self, not its assumed persona. Love, it claims, is a soul-to-soul experience. Interactions with others on the basis of the ahankara offer only a parody of affection.

As the atma, we remain the same subjective self throughout the whole of our life. Our body, mentality and personality change dramatically – but throughout it all, we know ourselves to be the constant observer. However, seeing everything shift and give way around us, we suspect that we’re also subject to change, ageing and heading for annihilation. Yoga, as systematised by Patanjali – an author or authors, like ‘Homer’, who lived in the 2nd century BCE – is intended as a practical method for freeing the atma from relentless mental tribulation, so that it can be properly situated in the reality of pure consciousness.

In VR, we’re often called upon to do battle with evil forces, confronting jeopardy and virtual mortality along the way. Despite our efforts, the inevitable almost always happens: our avatar is killed. Game over. Gamers, especially pathological gamers, are known to become deeply attached to their avatars, and can suffer distress when their avatars are harmed. Fortunately, we’re usually offered another chance: Do you want to play again? Sure enough, we do. Perhaps we create a new avatar, someone more adept, based on the lessons learned last time around. This mirrors the Vedantic concept of reincarnation, specifically in its form of metempsychosis: the transmigration of the conscious self into a new physical vehicle.

Some commentators interpret Vedanta as suggesting that there is no real world, and that all that exists is conscious awareness. However, a broader take on Vedantic texts is more akin to VR. The VR world is wholly data, but it becomes ‘real’ when that information manifests itself to our senses as imagery and sounds on the screen or through a headset. Similarly, for Vedanta, it is the external world’s transitory manifestation as observable objects that makes it less ‘real’ than the perpetual, unchanging nature of the consciousness that observes it.

To the sages of old, immersing ourselves in the ephemeral world means allowing the atma to succumb to an illusion: the illusion that our consciousness is somehow part of an external scene, and must suffer or enjoy along with it. It’s amusing to think what Patanjali and the Vedantic fathers would make of VR: an illusion within an illusion, perhaps, but one that might help us to grasp the potency of their message.

Akhandadhi Das

This article was originally published at Aeon and has been republished under Creative Commons.


How Al-Farabi drew on Plato to argue for censorship in Islam


Andrew Shiva / Wikipedia

Rashmee Roshan Lall | Aeon Ideas

You might not be familiar with the name Al-Farabi, a 10th-century thinker from Baghdad, but you know his work, or at least its results. Al-Farabi was, by all accounts, a man of steadfast Sufi persuasion and unvaryingly simple tastes. As a labourer in a Damascus vineyard before settling in Baghdad, he favoured a frugal diet of lambs’ hearts and water mixed with sweet basil juice. But in his political philosophy, Al-Farabi drew on a rich variety of Hellenic ideas, notably from Plato and Aristotle, adapting and extending them in order to respond to the flux of his times.

The situation in the mighty Abbasid empire in which Al-Farabi lived demanded a delicate balancing of conservatism with radical adaptation. Against the backdrop of growing dysfunction as the empire became a shrunken version of itself, Al-Farabi formulated a political philosophy conducive to civic virtue, justice, human happiness and social order.

But his real legacy might be the philosophical rationale that Al-Farabi provided for controlling creative expression in the Muslim world. In so doing, he completed the aniconic (antirepresentational) project begun in the late seventh century by a caliph of the Umayyads, the first Muslim dynasty. Caliph Abd al-Malik did it with nonfigurative images on coins and calligraphic inscriptions on the Dome of the Rock in Jerusalem, the first monument of the new Muslim faith. This heralded Islamic art’s break from the Greco-Roman representative tradition. A few centuries later, Al-Farabi took the notion of creative control to new heights by arguing for restrictions on representation through the word. He did it using solidly Platonic concepts, and can justifiably be said to have helped concretise the way Islam understands and responds to creative expression.

Word portrayals of Islam and its prophet can be deemed sacrilegious just as much as representational art. The consequences of Al-Farabi’s rationalisation of representational taboos are apparent in our times. In 1989, Iran’s Ayatollah Khomeini issued a fatwa sentencing Salman Rushdie to death for writing The Satanic Verses (1988). The book outraged Muslims for its fictionalised account of Prophet Muhammad’s life. In 2001, the Taliban blew up the sixth-century Bamiyan Buddhas in Afghanistan. In 2005, controversy erupted over the publication by the Danish newspaper Jyllands-Posten of cartoons depicting the Prophet. The cartoons continued to ignite fury in some way or other for at least a decade. There were protests across the Middle East, attacks on Western embassies after several European papers reprinted the cartoons, and in 2008 Osama bin Laden issued an incendiary warning to Europe of ‘grave punishment’ for its ‘new Crusade’ against Islam. In 2015, the offices of Charlie Hebdo, a satirical magazine in Paris that habitually offended Muslim sensibilities, were attacked by armed gunmen, who killed 12 people. The magazine had featured Michel Houellebecq’s novel Submission (2015), a futuristic vision of France under Islamic rule.

In a sense, the destruction of the Bamiyan Buddhas was no different from the Rushdie fatwa, which was like the Danish cartoons fallout and the violence wreaked on Charlie Hebdo’s editorial staff. All are linked by the desire to control representation, be it through imagery or the word.

Control of the word was something that Al-Farabi appeared to judge necessary if Islam’s biggest project – the multiethnic commonwealth that was the Abbasid empire – was to be preserved. Figural representation was pretty much settled as an issue for Muslims when Al-Farabi would have been pondering some of his key theories. Within 30 years of the Prophet’s death in 632, art and creative expression took two parallel paths depending on the context for which they were intended. There was art for the secular space, such as the palaces and bathhouses of the Umayyads (661-750). And there was the art considered appropriate for religious spaces – mosques and shrines such as the Dome of the Rock (completed in 691). Caliph Abd al-Malik had already engaged in what has been called a ‘polemic of images’ on coinage with his Byzantine counterpart, Emperor Justinian II. Ultimately, Abd al-Malik issued coins inscribed with the phrases ‘ruler of the orthodox’ and ‘representative [caliph] of Allah’ rather than his portrait. And the Dome of the Rock had script rather than representations of living creatures as a decoration. The lack of image had become an image. In fact, the word was now the image. That is why calligraphy became the greatest of Muslim art forms. The importance of the written word – its absorption and its meaning – was also exemplified by the Abbasids’ investment in the Greek-to-Arabic translation movement from the eighth to the 10th centuries.

Consequently, in Al-Farabi’s time, what was most important for Muslims was to control representation through the word. Christian iconophiles made their case for devotional images with the argument that words have the same representative power as paintings. Words are like icons, declared the iconophile Christian priest Theodore Abu Qurrah, who lived in dar-al Islam and wrote in Arabic in the ninth century. And images, he said, are the writing of the illiterate.

Al-Farabi was concerned about the power – for good or ill – of writings at a time when the Abbasid empire was in decline. He held creative individuals responsible for what they produced. Abbasid caliphs increasingly faced a crisis of authority, both moral and political. This led Al-Farabi – one of the Arab world’s most original thinkers – to extrapolate from topical temporal matters the key issues confronting Islam and its expanding and diverse dominions.

Al-Farabi fashioned a political philosophy that naturalised Plato’s imaginary ideal state for the world to which he belonged. He tackled the obvious issue of leadership, reminding Muslim readers of the need for a philosopher-king, a ‘virtuous ruler’ to preside over a ‘virtuous city’, which would be run on the principles of ‘virtuous religion’.

Like Plato, Al-Farabi suggested creative expression should support the ideal ruler, thus shoring up the virtuous city and the status quo. Just as Plato in the Republic demanded that poets in the ideal state tell stories of unvarying good, especially about the gods, Al-Farabi’s treatises mention ‘praiseworthy’ poems, melodies and songs for the virtuous city. Al-Farabi commended as ‘most venerable’ for the virtuous city the sorts of writing ‘used in the service of the supreme ruler and the virtuous king.’

It is this idea of writers following the approved narrative that most clearly joins Al-Farabi’s political philosophy to that of the man he called Plato the ‘Divine’. When Al-Farabi seized on Plato’s argument for ‘a censorship of the writers’ as a social good for Muslim society, he was making a case for managing the narrative by controlling the word. It would be important to the next phase of Islamic image-building.

Some of Al-Farabi’s ideas might have influenced other prominent Muslim thinkers, including the Persian polymath Ibn Sina, or Avicenna (c980-1037), and the Persian theologian Al-Ghazali (c1058-1111). Certainly, his rationalisation for controlling creative writing enabled a further move to deny legitimacy to new interpretation.

Rashmee Roshan Lall

This article was originally published at Aeon and has been republished under Creative Commons.

What Einstein Meant by ‘God Does Not Play Dice’

Einstein with his second wife Elsa, 1921. Wikipedia.

Jim Baggott | Aeon Ideas

‘The theory produces a good deal but hardly brings us closer to the secret of the Old One,’ wrote Albert Einstein in December 1926. ‘I am at all events convinced that He does not play dice.’

Einstein was responding to a letter from the German physicist Max Born. The heart of the new theory of quantum mechanics, Born had argued, beats randomly and uncertainly, as though suffering from arrhythmia. Whereas physics before the quantum had always been about doing this and getting that, the new quantum mechanics appeared to say that when we do this, we get that only with a certain probability. And in some circumstances we might get the other.

Einstein was having none of it, and his insistence that God does not play dice with the Universe has echoed down the decades, as familiar and yet as elusive in its meaning as E = mc2. What did Einstein mean by it? And how did Einstein conceive of God?

Hermann and Pauline Einstein were nonobservant Ashkenazi Jews. Despite his parents’ secularism, the nine-year-old Albert discovered and embraced Judaism with some considerable passion, and for a time he was a dutiful, observant Jew. Following Jewish custom, his parents would invite a poor scholar to share a meal with them each week, and from the impoverished medical student Max Talmud (later Talmey) the young and impressionable Einstein learned about mathematics and science. He consumed all 21 volumes of Aaron Bernstein’s joyful Popular Books on Natural Science (1880). Talmud then steered him in the direction of Immanuel Kant’s Critique of Pure Reason (1781), from which he migrated to the philosophy of David Hume. From Hume, it was a relatively short step to the Austrian physicist Ernst Mach, whose stridently empiricist, seeing-is-believing brand of philosophy demanded a complete rejection of metaphysics, including notions of absolute space and time, and the existence of atoms.

But this intellectual journey had mercilessly exposed the conflict between science and scripture. The now 12-year-old Einstein rebelled. He developed a deep aversion to the dogma of organised religion that would last for his lifetime, an aversion that extended to all forms of authoritarianism, including any kind of dogmatic atheism.

This youthful, heavy diet of empiricist philosophy would serve Einstein well some 14 years later. Mach’s rejection of absolute space and time helped to shape Einstein’s special theory of relativity (including the iconic equation E = mc2), which he formulated in 1905 while working as a ‘technical expert, third class’ at the Swiss Patent Office in Bern. Ten years later, Einstein would complete the transformation of our understanding of space and time with the formulation of his general theory of relativity, in which the force of gravity is replaced by curved spacetime. But as he grew older (and wiser), he came to reject Mach’s aggressive empiricism, and once declared that ‘Mach was as good at mechanics as he was wretched at philosophy.’

Over time, Einstein evolved a much more realist position. He preferred to accept the content of a scientific theory realistically, as a contingently ‘true’ representation of an objective physical reality. And, although he wanted no part of religion, the belief in God that he had carried with him from his brief flirtation with Judaism became the foundation on which he constructed his philosophy. When asked about the basis for his realist stance, he explained: ‘I have no better expression than the term “religious” for this trust in the rational character of reality and in its being accessible, at least to some extent, to human reason.’

But Einstein’s was a God of philosophy, not religion. When asked many years later whether he believed in God, he replied: ‘I believe in Spinoza’s God, who reveals himself in the lawful harmony of all that exists, but not in a God who concerns himself with the fate and the doings of mankind.’ Baruch Spinoza, a contemporary of Isaac Newton and Gottfried Leibniz, had conceived of God as identical with nature. For this, he was considered a dangerous heretic, and was excommunicated from the Jewish community in Amsterdam.

Einstein’s God is infinitely superior but impersonal and intangible, subtle but not malicious. He is also firmly determinist. As far as Einstein was concerned, God’s ‘lawful harmony’ is established throughout the cosmos by strict adherence to the physical principles of cause and effect. Thus, there is no room in Einstein’s philosophy for free will: ‘Everything is determined, the beginning as well as the end, by forces over which we have no control … we all dance to a mysterious tune, intoned in the distance by an invisible player.’

The special and general theories of relativity provided a radical new way of conceiving of space and time and their active interactions with matter and energy. These theories are entirely consistent with the ‘lawful harmony’ established by Einstein’s God. But the new theory of quantum mechanics, which Einstein had also helped to found in 1905, was telling a different story. Quantum mechanics is about interactions involving matter and radiation, at the scale of atoms and molecules, set against a passive background of space and time.

Earlier in 1926, the Austrian physicist Erwin Schrödinger had radically transformed the theory by formulating it in terms of rather obscure ‘wavefunctions’. Schrödinger himself preferred to interpret these realistically, as descriptive of ‘matter waves’. But a consensus was growing, strongly promoted by the Danish physicist Niels Bohr and the German physicist Werner Heisenberg, that the new quantum representation shouldn’t be taken too literally.

In essence, Bohr and Heisenberg argued that science had finally caught up with the conceptual problems involved in the description of reality that philosophers had been warning of for centuries. Bohr is quoted as saying: ‘There is no quantum world. There is only an abstract quantum physical description. It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature.’ This vaguely positivist statement was echoed by Heisenberg: ‘[W]e have to remember that what we observe is not nature in itself but nature exposed to our method of questioning.’ Their broadly antirealist ‘Copenhagen interpretation’ – denying that the wavefunction represents the real physical state of a quantum system – quickly became the dominant way of thinking about quantum mechanics. More recent variations of such antirealist interpretations suggest that the wavefunction is simply a way of ‘coding’ our experience, or our subjective beliefs derived from our experience of the physics, allowing us to use what we’ve learned in the past to predict the future.

But this was utterly inconsistent with Einstein’s philosophy. Einstein could not accept an interpretation in which the principal object of the representation – the wavefunction – is not ‘real’. He could not accept that his God would allow the ‘lawful harmony’ to unravel so completely at the atomic scale, bringing lawless indeterminism and uncertainty, with effects that can’t be entirely and unambiguously predicted from their causes.

The stage was thus set for one of the most remarkable debates in the entire history of science, as Bohr and Einstein went head-to-head on the interpretation of quantum mechanics. It was a clash of two philosophies, two conflicting sets of metaphysical preconceptions about the nature of reality and what we might expect from a scientific representation of this. The debate began in 1927, and although the protagonists are no longer with us, the debate is still very much alive.

And unresolved.

I don’t think Einstein would have been particularly surprised by this. In February 1954, just 14 months before he died, he wrote in a letter to the American physicist David Bohm: ‘If God created the world, his primary concern was certainly not to make its understanding easy for us.’

Jim Baggott

This article was originally published at Aeon and has been republished under Creative Commons.

Being ‘interesting’ is Not an Objective Feature of the World


Colour-composite image of the Carina Nebula. Courtesy ESO

Lorraine L Besser | Aeon Ideas

Most of us know and value pleasant experiences. We savour the taste of a freshly picked strawberry. We laugh more than an event warrants, just because laughing feels good. We might argue about the degree to which such pleasant experiences are valuable, and the extent to which they ought to shape our lives, but we can’t deny their value.

So pleasant experiences are necessarily valuable, but are there also valuable experiences that are not necessarily pleasant? It seems there are. Often, we have experiences that captivate us, that we cherish even though they are not entirely pleasant. We read a novel that leads us to feel both horror and awe. We binge-watch a TV show that explores the shocking course of moral corruption of someone who could be your neighbour, friend, even your spouse. The experience is both painful and horrifying, but we can’t turn it off.

These experiences seem intuitively valuable in the same way that pleasant experiences are intuitively valuable. But they are not valuable because they are pleasant – rather, they are valuable by virtue of being interesting.

What does it mean for an experience to be interesting? First, to say that something is interesting is to describe what the experience feels like to the person undergoing it. This is the phenomenological quality of the experience. When we study the phenomenology of something, we examine what it feels like, from the inside, to experience that thing. For instance, most of us would describe eating our favourite foods as a pleasurable experience: the food itself isn’t pleasurable, but the experience of eating it is. Similarly, when we talk about something being beautiful or awe-inspiring, we aren’t describing the thing itself, but rather our experience of it. We see the sunset and feel moved by it; the beauty is something we experience. Likewise the awe it inspires is a feature of our experiential reaction to it. The interesting is just like this. It is a feature of our experiential reaction, of our engagement.

We don’t always use the word ‘interesting’ in this way. In ordinary language, we often describe the objects of experience as interesting. We talk about interesting books, interesting people, and so on. When we say that a book is interesting, we more likely mean that the experience of reading the book is interesting. It just doesn’t make sense to describe a book as objectively interesting, independently of people experiencing it as interesting. How could a book be interesting without being read? And if a book is objectively interesting, shouldn’t we all find it interesting? We don’t all find the same things to be interesting. It is a common experience for something to be interesting to one person, yet not another. So while we might describe objects as interesting, we should recognise that this is a loose, and shorthand, way to describe what’s really interesting – our experience of them.

Another way in which we use the word ‘interesting’ is in the context of describing what a person is interested in: John is interested in Second World War novels, for example. This usage also differs from what I’m describing as the ‘interesting’. It describes a particular fit between one’s interests and the objects of one’s experiences. But notice that fitting with your interests, and being interested in something, is actually a different experience to finding something interesting. We’ve all been interested in things that turn out to be boring, and we’ve all found experiences interesting when we had no prior interest in them. The interesting is thus not an objective feature of an object, nor an experience that necessarily aligns or follows from your interests. It is rather a feature of our experiences.

To say that something is interesting is also to describe a particular kind of synthesis that arises within the experience. Whenever we engage in an activity, we bring to that experience some combination of expectations, likes/desires, beliefs, curiosity, and so forth. This package contributes to the activity delivering a particular subjective experience. There is a synthesis, specific to the individual’s engagement, that determines what her experience feels like – its phenomenological quality. It is within this synthesis that a person finds an experience interesting, or not. There is no one synthesis that makes an experience interesting. Sometimes, a clash of expectations and reality makes something interesting, sometimes someone’s curiosity allows one to notice features that make an activity interesting, and so on. Because the interesting lies within a synthesis between the individual and an activity, one individual can find something interesting (say, reading philosophy) that another person doesn’t.

The synthesis is complex, unique to the subject and the experience – and, in the end, unspecifiable. This is why we tend to overlook the interesting as a valuable feature of our experiences. Pleasure, by contrast, is a fairly uniform feature of experience. We know exactly what others are talking about when they talk about pleasurable experiences, and can relate to that experience in a personal way – even if it is something that we have not experienced as pleasurable. Our reactions to the experiences that others find interesting are often different. John finds reading Second World War novels to be an ongoing source of interest, yet Julia can’t imagine a more boring way of spending her time, and can’t understand how anyone would find them interesting. In such scenarios, we are more likely to discredit the value of John’s experience than to try to understand and appreciate it. Because the interesting is by nature a more complicated, harder-to-reach, harder-to-describe feature than others, we rarely stop to think about just what the interesting is.

While wrapping our head around the interesting might be challenging, it is important to acknowledge the value intrinsic to interesting experiences. Recognising it as valuable validates those who choose to pursue the interesting, and also opens up a new dimension of value that can enrich our lives. Most of us know there is more to life than pleasure, yet it is all too easy to choose our experiences for the sake of pleasure. For many of us, though, interesting experiences are more rewarding than pleasurable experiences, insofar as their intrinsic value is a product of multifaceted aspects of our engagement. Interesting experiences spark the mind in a way that stimulates and lingers. They can also be easy to come by – sometimes just a sense of curiosity is needed to make an activity interesting. Look around, feel the pull, and cherish the interesting.

Lorraine L Besser

This article was originally published at Aeon and has been republished under Creative Commons.

Why Atheists are Not as Rational as Some Like to Think

Richard Dawkins, author, evolutionary biologist and emeritus fellow of New College, University of Oxford, is one of the world’s most prominent atheists.
Fronteiras do Pensamento/Wikipedia, CC BY-SA

By Lois Lee, University of Kent

Many atheists think that their atheism is the product of rational thinking. They use arguments such as “I don’t believe in God, I believe in science” to explain that evidence and logic, rather than supernatural belief and dogma, underpin their thinking. But just because you believe in evidence-based, scientific research – which is subject to strict checks and procedures – doesn’t mean that your mind works in the same way.

When you ask atheists about why they became atheists (as I do for a living), they often point to eureka moments when they came to realise that religion simply doesn’t make sense.

Oddly perhaps, many religious people actually take a similar view of atheism. This comes out when theologians and other theists speculate that it must be rather sad to be an atheist, lacking (as they think atheists do) so much of the philosophical, ethical, mythical and aesthetic fulfilments that religious people have access to – stuck in a cold world of rationality only.

The Science of Atheism

The problem that any rational thinker needs to tackle, though, is that the science increasingly shows that atheists are no more rational than theists. Indeed, atheists are just as susceptible as the next person to “group-think” and other non-rational forms of cognition. For example, religious and nonreligious people alike can end up following charismatic individuals without questioning them. And our minds often prefer righteousness over truth, as the social psychologist Jonathan Haidt has explored.

Even atheist beliefs themselves have much less to do with rational inquiry than atheists often think. We now know, for example, that nonreligious children of religious parents cast off their beliefs for reasons that have little to do with intellectual reasoning. The latest cognitive research shows that the decisive factor is learning from what parents do rather than from what they say. So if a parent says that they’re Christian, but they’ve fallen out of the habit of doing the things they say should matter – such as praying or going to church – their kids simply don’t buy the idea that religion makes sense.

This is perfectly rational in a sense, but children aren’t processing this on a cognitive level. Throughout our evolutionary history, humans have often lacked the time to scrutinise and weigh up the evidence – needing to make quick assessments. That means that children to some extent just absorb the crucial information, which in this case is that religious belief doesn’t appear to matter in the way that parents are saying it does.

Children’s choices often aren’t based on rational thinking.
Anna Nahabed/Shutterstock

Even older children and adolescents who actually ponder the topic of religion may not be approaching it as independently as they think. Emerging research is demonstrating that atheist parents (and others) pass on their beliefs to their children in a similar way to religious parents – through sharing their culture as much as their arguments.

Some parents take the view that their children should choose their beliefs for themselves, but what they then do is pass on certain ways of thinking about religion, like the idea that religion is a matter of choice rather than divine truth. It’s not surprising that almost all of these children – 95% – end up “choosing” to be atheist.

Science versus Beliefs

But are atheists more likely to embrace science than religious people? Many belief systems can be more or less closely integrated with scientific knowledge. Some belief systems are openly critical of science, and think it has far too much sway over our lives, while other belief systems are hugely concerned to learn about and respond to scientific knowledge.

But this difference doesn’t neatly map onto whether you are religious or not. Some Protestant traditions, for example, see rationality or scientific thinking as central to their religious lives. Meanwhile, a new generation of postmodern atheists highlight the limits of human knowledge, and see scientific knowledge as hugely limited, problematic even, especially when it comes to existential and ethical questions. These atheists might, for example, follow thinkers like Charles Baudelaire in the view that true knowledge is only found in artistic expression.

Science can give us existential fulfilment, too.
Vladimir Pustovit/Flickr, CC BY-SA

And while many atheists do like to think of themselves as pro science, science and technology can themselves sometimes be the basis of religious thinking or beliefs, or something very much like them. The rise of the transhumanist movement, which centres on the belief that humans can and should transcend their current natural state and limitations through the use of technology, is an example of how technological innovation is driving the emergence of new movements that have much in common with religiosity.

Even for those atheists sceptical of transhumanism, the role of science isn’t only about rationality – it can provide the philosophical, ethical, mythical and aesthetic fulfilments that religious beliefs do for others. The science of the biological world, for example, is much more than a topic of intellectual curiosity – for some atheists, it provides meaning and comfort in much the same way that belief in God can for theists. Psychologists show that belief in science increases in the face of stress and existential anxiety, just as religious beliefs intensify for theists in these situations.

Clearly, the idea that being atheist is down to rationality alone is starting to look distinctly irrational. But the good news for all concerned is that rationality is overrated. Human ingenuity rests on a lot more than rational thinking. As Haidt says of "the righteous mind", we are actually "designed to 'do' morality" – even if we're not doing it in the rational way we think we are. The abilities to make quick decisions, follow our passions and act on intuition are also important human qualities and crucial to our success.

It is helpful that we have invented something that, unlike our minds, is rational and evidence-based: science. When we need proper evidence, science can very often provide it – as long as the topic is testable. Importantly, the scientific evidence does not tend to support the view that atheism is about rational thought and theism is about existential fulfilments. The truth is that humans are not like science – none of us get by without irrational action, nor without sources of existential meaning and comfort. Fortunately, though, nobody has to.

Lois Lee, Research Fellow, Department of Religious Studies, University of Kent

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Religion is About Emotion Regulation, and It’s Very Good at It


Stephen T Asma | Aeon Ideas

Religion does not help us to explain nature. It did what it could in pre-scientific times, but that job was properly unseated by science. Most religious laypeople and even clergy agree: Pope John Paul II declared in 1996 that evolution is a fact and Catholics should get over it. No doubt some extreme anti-scientific thinking lives on in such places as Ken Ham’s Creation Museum in Kentucky, but it has become a fringe position. Most mainstream religious people accept a version of Galileo’s division of labour: ‘The intention of the Holy Ghost is to teach us how one goes to heaven, not how heaven goes.’

Maybe, then, the heart of religion is not its ability to explain nature, but its moral power? Sigmund Freud, who referred to himself as a ‘godless Jew’, saw religion as delusional, but helpfully so. He argued that we humans are naturally awful creatures – aggressive, narcissistic wolves. Left to our own devices, we would rape, pillage and burn our way through life. Thankfully, we have the civilising influence of religion to steer us toward charity, compassion and cooperation by a system of carrots and sticks, otherwise known as heaven and hell.

The French sociologist Émile Durkheim, on the other hand, argued in The Elementary Forms of the Religious Life (1912) that the heart of religion was not its belief system or even its moral code, but its ability to generate collective effervescence: intense, shared experiences that unify individuals into cooperative social groups. Religion, Durkheim argued, is a kind of social glue, a view confirmed by recent interdisciplinary research.

While Freud and Durkheim were right about the important functions of religion, its true value lies in its therapeutic power, particularly its power to manage our emotions. How we feel is as important to our survival as how we think. Our species comes equipped with adaptive emotions, such as fear, rage, lust and so on: religion was (and is) the cultural system that dials these feelings and behaviours up or down. We see this clearly if we look at mainstream religion, rather than the deleterious forms of extremism. Mainstream religion reduces anxiety, stress and depression. It provides existential meaning and hope. It focuses aggression and fear against enemies. It domesticates lust, and it strengthens filial connections. Through story, it trains feelings of empathy and compassion for others. And it provides consolation for suffering.

Emotional therapy is the animating heart of religion. Social bonding happens not only when we agree to worship the same totems, but when we feel affection for each other. An affective community of mutual care emerges when groups share rituals, liturgy, song, dance, eating, grieving, comforting, tales of saints and heroes, hardships such as fasting and sacrifice. Theological beliefs are bloodless abstractions by comparison.

Emotional management is important because life is hard. The Buddha said: ‘All life is suffering’ and most of us past a certain age can only agree. Religion evolved to handle what I call the ‘vulnerability problem’. When we’re sick, we go to the doctor, not the priest. But when our child dies, or we lose our home in a fire, or we’re diagnosed with Stage-4 cancer, then religion is helpful because it provides some relief and some strength. It also gives us something to do, when there’s nothing we can do.

Consider how religion helps people after a death. Social mammals who have suffered separation distress are restored to health by touch, collective meals and grooming. Human grieving customs involve these same soothing prosocial mechanisms. We comfort-touch and embrace a person who has lost a loved one. Our bodies give ancient comfort directly to the grieving body. We provide the bereaved with food and drink, and we break bread with them (think of the Jewish tradition of shiva, or the visitation tradition of wakes in many cultures). We share stories about the loved one, and help the bereaved reframe their pain in larger optimistic narratives. Even music, in the form of consoling melodies and collective singing, helps to express shared sorrow and also transforms it from an unbearable and lonely experience to a bearable communal one. Social involvement from the community after a death can act as an antidepressant, boosting adaptive emotional changes in the bereaved.

Religion also helps to manage sorrow with something I’ll call ‘existential shaping’ or more precisely ‘existential debt’. It is common for Westerners to think of themselves as individuals first and as members of a community second, but our ideology of the lone protagonist fulfilling an individual destiny is more fiction than fact. Losing someone reminds us of our dependence on others and our deep vulnerability, and at such moments religion turns us toward the web of relations rather than away from it. Long after your parents have died, for example, religion helps you memorialise them and acknowledge your existential debt to them. Formalising the memory of the dead person, through funerary rites, or tomb-sweeping (Qingming) festivals in Asia, or the Day of the Dead in Mexico, or annual honorary masses in Catholicism, is important because it keeps reminding us, even through the sorrow, of the meaningful influence of these deceased loved ones. This is not a self-deception about the unreality of death, but an artful way of learning to live with it. The grief becomes transformed in the sincere acknowledgment of the value of the loved one, and religious rituals help people to set aside time and mental space for that acknowledgment.

An emotion such as grief has many ingredients. The physiological arousal of grief is accompanied by cognitive evaluations: ‘I will never see my friend again’; ‘I could have done something to prevent this’; ‘She was the love of my life’; and so on. Religions try to give the bereaved an alternative appraisal that reframes their tragedy as something more than just misery. Emotional appraisals are proactive, according to the psychologists Phoebe Ellsworth at the University of Michigan and Klaus Scherer at the University of Geneva, going beyond the immediate disaster to envision the possible solutions or responses. This is called ‘secondary appraisal’. After the primary appraisal (‘This is very sad’), the secondary appraisal assesses our ability to deal with the situation: ‘This is too much for me’ – or, positively: ‘I will survive this.’ Part of our ability to cope with suffering is our sense of power or agency: more power generally means better coping ability. If I acknowledge my own limitations when faced with unavoidable loss, but I feel that a powerful ally, God, is part of my agency or power, then I can be more resilient.

Because religious actions are often accompanied by magical thinking or supernatural beliefs, Christopher Hitchens argued in God Is Not Great (2007) that religion is 'false consolation'. Many critics of religion echo his condemnation. But there is no such thing as false consolation. Hitchens and fellow critics are making a category mistake, like saying: 'The colour green is sleepy.' Consolation or comfort is a feeling, and it can be weak or strong, but it can't be false or true. You can be false in your judgment of why you're feeling better, but feeling better is neither true nor false. True and false applies only if we're evaluating whether our propositions correspond with reality. And no doubt many factual claims of religion are false in that way – the world was not created in six days.

Religion is real consolation in the same way that music is real consolation. No one thinks that the pleasure of Mozart’s opera The Magic Flute is ‘false pleasure’ because singing flutes don’t really exist. It doesn’t need to correspond to reality. It’s true that some religious devotees, unlike music devotees, pin their consolation to additional metaphysical claims, but why should we trust them to know how religion works? Such believers do not recognise that their unthinking religious rituals and social activities are the true sources of their therapeutic healing. Meanwhile, Hitchens and other critics confuse the factual disappointments of religion with the value of religion generally, and thereby miss the heart of it.

'Why We Need Religion: An Agnostic Celebration of Spiritual Emotions' by Stephen Asma © 2018 is published by Oxford University Press.

Stephen T Asma

This article was originally published at Aeon and has been republished under Creative Commons.