I’ve always wondered what Catholic values were, and now I know. Covington Catholic school exemplifies them all: Disrespect. Contempt. Dogma. Oppression. Hatred. The students of that school made a spectacle of themselves demonstrating those values in Washington DC.

There was a lot going on in the Capitol recently. A “pro-life” demonstration was underway; Covington Catholic, an all-boys private school, sent a mob of its students there, which is a problem in itself. Why are boys trying to dictate what women are allowed to do with their bodies? Next problem: they all seem to be wearing MAGA hats, which tells me where their wealthy parents are coming from and what kind of indoctrination they received. And then there seems to be a definite lack of adult supervision for these kids.

The Catholic rabble then ran into another demonstration, the Indigenous People’s March. The Native Americans didn’t have a problem, they carried on with dignity…but the dreadful little Catholic children were something else again.

The elder is Nathan Phillips, of the Omaha people, a Vietnam veteran and former director of the Native Youth Alliance. He is also a keeper of a sacred pipe and holds an annual ceremony honoring Native American veterans at Arlington National Cemetery.

Jesus. I work at a university that was built on the site of a Catholic boarding school for Indians, where children were ripped from their families to learn white man’s ways and follow the Pope, and this is a history we earnestly feel here — we have reminders all over the school of that legacy. Seeing little Catholic assholes shitting all over other people fills me with anger.

And despair. Look at those faces. Someday you’ll see them again in Congress, and on the Supreme Court, and maybe even the presidency. Because that’s where they’re confident they deserve to go.

I’ve been naughty. I haven’t been keeping up with my intended schedule of one video per week. But finally I got something done.

There have been lots of distractions, but honestly? This is tough for me. There are days I don’t want to look at my face or hear my voice, and making these videos compels me to sit down and wrestle with my lack of charisma. I’ll keep plodding along, mainly as therapy — I do enjoy the process, it’s just that final step of subjecting it to the eyeballs of the world that is hard.

Support more videos like this at patreon.com/rebecca!

Transcript:

Late last year, a bold new venture was launched to finally clean the plastic out of our oceans. It’s called Ocean Cleanup, and it’s basically a giant net that floats around the ocean corralling trash for easier removal. Brilliant! And it was all designed and launched by a genius named Boyan Slat who gave a TEDx talk about it when he was only 18. Incredible!

Unfortunately, within months of its launch (from right here in San Francisco Bay, I should mention), it completely broke down and is now being towed to Hawaii where it will be repaired and/or reconsidered.

Oh gosh, if only someone had seen this disaster coming. If only someone, somewhere had pointed out that the Ocean Cleanup barrier “was designed based on mean ocean current speeds, and not maximum speeds, leaving the very real potential that this barrier could almost immediately upon deployment turn into the largest piece of ocean garbage in the world.”

Oh hold on….

Yeah, fucking everybody knew this was going to happen. As I pointed out way back in 2015, actual oceanographers Dr. Miriam Goldstein and Dr. Kim Martini delivered an extremely detailed report on the original feasibility study offered by Ocean Cleanup, clearly explaining the myriad ways that it just was not going to work. They were ignored.

Actually, scratch that, they weren’t just ignored, they were dismissed. The Washington Post pointed out that when an interviewer brought up their concerns to Slat, he said that they weren’t engineers so they didn’t know what they were talking about. In an odd twist, when a male oceanographer pointed out some of the problems with the design just this month, Slat personally thanked him on Twitter for offering “a constructive rather than emotive critique for once” and said his engineers would reach out to said male oceanographer. I’m sure there’s no sexism there. It’s just that when two extremely well-qualified female scientists write an in-depth, fact-based critical review that clearly states they want the project to succeed, and that lays out the barriers Ocean Cleanup would face on its way to success before the project even gets underway, it’s “emotive,” while when a male scientist writes a blog post with memes in it making similar points, months after the project failed, it’s “constructive.” Yeah, that sounds right.

Not to dismiss Clark Richards’s post on Ocean Cleanup’s failure — it is interesting if you’re into the finer details of ocean dynamics. It’s just…this is why it’s still hard to be a woman in science. Two women are completely dismissed for not being engineers and for being “emotive” while a man who does much, much less and much, much too late gets an actual meeting with the engineers in charge of the project. That’s fucked.

Unfortunately, Ocean Cleanup’s failure doesn’t mean the end of the project. They have too much money and too much ego to give it up now. They’re going to make small changes to their stupid design and try again, wasting more money and more time, and once again they’re just making more trash that will end up in the ocean. A 6,500-foot piece of trash.

That money could be going to stopping plastic from ending up in the ocean in the first place. You know, you generally don’t worry about what to do with all the sewage in your bathroom while it’s still gushing out of the toilet. First, you stop the shit pouring out. Then you clean it up. Luckily, smarter people than Boyan Slat are working on that by enacting plastic bag bans, cleaning up our beaches, and using ingenious wheels to stop trash from washing out to sea in the first place. I don’t expect Slat to pay attention to this video what with it being so “emotive”, but I hope that the rest of you know better now than to give your time or money to a doomed ego trip.

The post The Miserable, Misogynist Failure of Boyan Slat’s Ocean Cleanup appeared first on Skepchick.

By Hanneke Weitering SEATTLE — A cigar-shaped space rock named ‘Oumuamua caused quite a stir when it became the first interstellar visitor discovered in our solar system. Is it an asteroid, a comet or an alien spacecraft? While astronomers continue to work on answering these big questions, one thing has become certain: ‘Oumuamua probably isn’t that …
By Diana Kwon For the longest time the cerebellum, a dense, fist-size formation located at the base of the brain, never got much respect from neuroscientists. For about two centuries the scientific community believed the cerebellum (Latin for “little brain”), which contains approximately half of the brain’s neurons, was dedicated solely to the control of …
By Mark Silk At the beginning of the year, the Washington Post asked Jerry Falwell, Jr. whether there was anything Donald Trump could do that would endanger his support among evangelical leaders. “No,” Falwell replied. “I know anything he does, it may not be ideologically ‘conservative,’ but it’s going to be what’s best for this country, and I …
By Ashton Pittman JACKSON — Mississippi law would require schoolchildren to recite the Pledge of Allegiance and see the Ten Commandments displayed on public-school walls under new bills in the Legislature this session, requirements that may violate the Establishment Clause of the First Amendment. One would also require Mississippi teachers to teach Mississippi’s pledge glorifying …

Mikey Neumann is probably my favorite movie reviewer, but I’m only saying this because I agree with all 5 of his top movies for 2018.

Check it out: a lunar eclipse on Sunday night over a big chunk of the Earth.

The lunar event will last about four hours, beginning at 9:36 p.m. ET Sunday, Jan. 20, and ending about 1:50 a.m. ET Monday, Jan. 21. The beginning of the total eclipse phase will occur at 11:41 p.m. ET, according to NASA. The duration of totality will be 62 minutes.

Unfortunately for me, we’re predicted to have a couple of days of snow around that time. Question: if the sky is socked in with gray clouds everywhere, do I still get to turn into an extra-large, extra-vicious werewolf that night?

By Rachael Rettner A new list of top global health threats from the World Health Organization (WHO) reads like a “who’s who” of public health hazards: Pandemic flu. Ebola. Drug resistance. But tucked in this list of much-talked-about threats is one perhaps-surprising inclusion: the anti-vaccine movement. The list, released this week, highlighted “10 of the many …

The archaeological site of Göbekli Tepe in southeast Turkey, a settlement that may date back to the 10th millennium BCE

This article is a preview from the Winter 2018 edition of New Humanist

1. In the beginning was the word

For centuries, we have been telling ourselves a simple story about the origins of social inequality. For most of their history, humans lived in tiny egalitarian bands of hunter-gatherers. Then came farming, which brought with it private property, and then the rise of cities which meant the emergence of civilisation properly speaking. Civilisation meant many bad things (wars, taxes, bureaucracy, patriarchy, slavery) but also made possible written literature, science, philosophy and most other great human achievements.

Almost everyone knows this story in its broadest outlines. Since at least the days of the 18th-century philosopher Jean-Jacques Rousseau, it has framed what we think the overall shape and direction of human history to be. This is important because the narrative also defines our sense of political possibility. Most see civilisation, hence inequality, as a tragic necessity. Some dream of returning to a past utopia, of finding an industrial equivalent to “primitive communism”, or even, in extreme cases, of destroying everything, and going back to being foragers again. But no one challenges the basic structure of the story.

There is a fundamental problem with this narrative: it isn’t true. Overwhelming evidence from archaeology, anthropology and kindred disciplines is beginning to give us a fairly clear idea of what the last 40,000 years of human history really looked like, and in almost no way does it resemble the conventional narrative. Our species did not, in fact, spend most of its history in tiny bands; agriculture did not mark an irreversible threshold in social evolution; the first cities were often robustly egalitarian. Still, even as researchers have gradually come to a consensus on such questions, they remain strangely reluctant to announce their findings to the public – or even scholars in other disciplines – let alone reflect on the larger political implications. As a result, those writers who are reflecting on the “big questions” of human history – Jared Diamond, Francis Fukuyama, Ian Morris and others – still take Rousseau’s question (“what is the origin of social inequality?”) as their starting point, and assume the larger story will begin with some kind of fall from primordial innocence.

Simply framing the question this way means making a series of assumptions. First, that there is a thing called “inequality”; second, that it is a problem; and third, that there was a time it did not exist. Since the financial crash of 2008 and the upheavals that followed, the “problem of social inequality” has been at the centre of political debate. There seems to be a consensus, among the intellectual and political classes, that levels of social inequality have spiralled out of control, and that most of the world’s problems result from this, in one way or another. Pointing this out is seen as a challenge to global power structures, but compare this to the way similar issues might have been discussed a generation earlier. Unlike terms such as “capital” or “class power”, the word “equality” is practically designed to lead to half-measures and compromise. One can imagine overthrowing capitalism or breaking the power of the state, but it’s very difficult to imagine eliminating “inequality”. In fact, it’s not obvious what doing so would even mean, since people are not all the same and nobody would particularly want them to be.

“Inequality” is a way of framing social problems appropriate to technocratic reformers, the kind of people who assume from the outset that any real vision of social transformation has long since been taken off the political table. It allows one to tinker with the numbers, argue about Gini coefficients and thresholds of dysfunction, readjust tax regimes or social welfare mechanisms, even shock the public with figures showing just how bad things have become (“can you imagine? 0.1 per cent of the world’s population controls over 50 per cent of the wealth!”), all without addressing any of the factors that people actually object to about such “unequal” social arrangements. For instance, that some manage to turn their wealth into power over others; or that other people end up being told their needs are not important, and their lives have no intrinsic worth. The latter, we are supposed to believe, is just the inevitable effect of inequality, and inequality, the inevitable result of living in any large, complex, urban, technologically sophisticated society. That is the real political message conveyed by endless invocations of an imaginary age of innocence, before the invention of inequality: that if we want to get rid of such problems entirely, we’d have to somehow get rid of 99.9 per cent of the Earth’s population and go back to being tiny bands of foragers again. Otherwise, the best we can hope for is to adjust the size of the boot that will be stomping on our faces, for ever, or perhaps to wrangle a bit more wiggle room in which some of us can at least temporarily duck out of its way.

Mainstream social science now seems mobilised to reinforce this sense of hopelessness. Almost on a monthly basis we are confronted with publications trying to project the current obsession with property distribution back into the Stone Age, setting us on a false quest for “egalitarian societies” defined in such a way that they could not possibly exist outside some tiny band of foragers (and possibly not even then).

What we’re going to do in this essay, then, is two things. First, we will spend a bit of time picking through what passes for informed opinion on such matters, to reveal how the game is played, how even the most apparently sophisticated contemporary scholars end up reproducing conventional wisdom as it stood in France or Scotland in, say, 1760. Then we will attempt to lay down the initial foundations of an entirely different narrative. This is mostly ground-clearing work. The questions we are dealing with are so enormous, and the issues so important, that it will take years of research and debate to even begin to understand the full implications. But on one thing we insist. Abandoning the story of a fall from primordial innocence does not mean abandoning dreams of human emancipation – that is, of a society where no one can turn their rights in property into a means of enslaving others, and where no one can be told their lives and needs don’t matter. On the contrary. Human history becomes a far more interesting place, containing many more hopeful moments than we’ve been led to imagine, once we learn to throw off our conceptual shackles and perceive what’s really there.

2. The origins of social inequality

Let us begin by outlining received wisdom on the overall course of human history. It goes something like this: as the curtain goes up on human history – say, roughly 200,000 years ago, with the appearance of anatomically modern Homo sapiens – we find our species living in small and mobile bands ranging from 20 to 40 individuals. They seek out optimal hunting and foraging territories, following herds, gathering nuts and berries. If resources become scarce, or social tensions arise, they respond by moving on, and going someplace else. Life for these early humans – we can think of it as humanity’s childhood – is full of dangers, but also possibilities. Material possessions are few, but the world is an unspoiled and inviting place. Most work only a few hours a day, and the small size of social groups allows them to maintain a kind of easy-going camaraderie, without formal structures of domination. Rousseau referred to this as “the State of Nature”, but nowadays it is presumed to have encompassed most of our species’ actual history. It is also assumed to be the only era in which humans managed to live in genuine societies of equals, without classes, castes, hereditary leaders or centralised government. Alas, this happy state of affairs eventually had to end. Our conventional version of world history places this moment around 10,000 years ago, at the close of the last Ice Age.

At this point, we find our imaginary human actors scattered across the world’s continents, beginning to farm their own crops and raise their own herds. Whatever the local reasons (they are debated), the effects are momentous, and basically the same everywhere. Territorial attachments and private ownership of property become important in ways previously unknown, and with them, sporadic feuds and war. Farming grants a surplus of food, which allows some to accumulate wealth and influence beyond their immediate kin-group. Others use their freedom from the food-quest to develop new skills, like the invention of more sophisticated weapons, tools, vehicles and fortifications, or the pursuit of politics and organised religion. In consequence, these “Neolithic farmers” quickly get the measure of their hunter-gatherer neighbours, and set about eliminating or absorbing them into a new and superior – albeit less equal – way of life.

To make matters more difficult still, or so the story goes, farming ensures a global rise in population levels. As people move into ever larger concentrations, our unwitting ancestors take another irreversible step to inequality, and around 6,000 years ago, cities appear – and our fate is sealed. With cities comes the need for centralised government. New classes of bureaucrats, priests and warrior-politicians instal themselves in permanent office to keep order and ensure the smooth flow of supplies and public services. Women, having once enjoyed prominent roles in human affairs, are sequestered, or imprisoned in harems. War captives are reduced to slaves. Full-blown inequality has arrived, and there is no getting rid of it. Still, the storytellers always assure us, not everything about the rise of urban civilisation is bad. Writing is invented, at first to keep state accounts, but this allows terrific advances to take place in science, technology and the arts. At the price of innocence, we became our modern selves, and can now merely gaze with pity and jealousy at those few “traditional” or “primitive” societies that somehow missed the boat.

This is the story that, as we say, forms the foundation of all contemporary debate on inequality. If say, an expert in international relations, or a clinical psychologist, wishes to reflect on such matters, they are likely to simply take it for granted that, for most of human history, we lived in tiny egalitarian bands, or that the rise of cities also meant the rise of the state. The same is true of most recent books that try to look at the broad sweep of prehistory, in order to draw political conclusions relevant to contemporary life. Consider Francis Fukuyama’s 2011 book The Origins of Political Order: From Prehuman Times to the French Revolution (Profile):

In its early stages, human political organisation is similar to the band-level society observed in higher primates like chimpanzees. This may be regarded as a default form of social organisation. ... Rousseau pointed out that the origin of political inequality lay in the development of agriculture, and in this he was largely correct. Since band-level societies are preagricultural, there is no private property in any modern sense. Like chimp bands, hunter-gatherers inhabit a territorial range that they guard and occasionally fight over. But they have a lesser incentive than agriculturalists to mark out a piece of land and say “this is mine”. If their territory is invaded by another group, or if it is infiltrated by dangerous predators, band-level societies may have the option of simply moving somewhere else due to low population densities. Band-level societies are highly egalitarian ... Leadership is vested in individuals based on qualities like strength, intelligence, and trustworthiness, but it tends to migrate from one individual to another.

Jared Diamond, in The World Until Yesterday: What Can We Learn from Traditional Societies? (Allen Lane, 2012), suggests such bands – in which he believes humans still lived “as recently as 11,000 years ago” – comprised “just a few dozen individuals”, most biologically related. They led a fairly meagre existence, “hunting and gathering whatever wild animal and plant species happen to live in an acre of forest”. (Why just an acre, he never explains.) And their social lives, according to Diamond, were enviably simple. Decisions were reached through “face-to-face discussion”; there were “few personal possessions” and “no formal political leadership or strong economic specialisation”. Diamond concludes that, sadly, it is only within such primordial groupings that humans have ever achieved a significant degree of social equality.

For Diamond and Fukuyama, as for Rousseau some centuries earlier, what put an end to that equality – everywhere and for ever – was the invention of agriculture and the higher population levels it sustained. Agriculture brought about a transition from “bands” to “tribes”. Accumulation of food surplus fed population growth, leading some “tribes” to develop into ranked societies known as “chiefdoms”. Fukuyama paints an almost biblical picture, a departure from Eden: “As little bands of human beings migrated and adapted to different environments, they began their exit out of the state of nature by developing new social institutions”. They fought wars over resources. Gangly and pubescent, these societies were headed for trouble.

It was time to grow up, time to appoint some proper leadership. Before long, chiefs had declared themselves kings, even emperors. There was no point in resisting. All this was inevitable once humans adopted large, complex forms of organisation. Even when the leaders began acting badly – creaming off agricultural surplus to promote their flunkies and relatives, making status permanent and hereditary, collecting trophy skulls and harems of slave-girls, or tearing out rivals’ hearts with obsidian knives – there could be no going back. “Large populations,” Diamond opines, “can’t function without leaders who make the decisions, executives who carry out the decisions, and bureaucrats who administer the decisions and laws. Alas for all of you readers who are anarchists and dream of living without any state government, those are the reasons why your dream is unrealistic: you’ll have to find some tiny band or tribe willing to accept you, where no one is a stranger, and where kings, presidents, and bureaucrats are unnecessary.”

A dismal conclusion, not just for anarchists, but for anybody who ever wondered if there might be some viable alternative to the status quo. But the remarkable thing is that, despite the smug tone, such pronouncements are not actually based on any kind of scientific evidence. There is no reason to believe that small-scale groups are especially likely to be egalitarian, or that large ones must necessarily have kings, presidents or bureaucracies. These are just prejudices stated as facts.

In the case of Fukuyama and Diamond one can, at least, note they were never trained in the relevant disciplines (the first is a political scientist, the other has a PhD on the physiology of the gall bladder). Still, even when anthropologists and archaeologists try their hand at “big picture” narratives, they have an odd tendency to end up with some similarly minor variation on Rousseau. In The Creation of Inequality: How our Prehistoric Ancestors Set the Stage for Monarchy, Slavery, and Empire (Harvard University Press, 2012), Kent Flannery and Joyce Marcus, two eminently qualified scholars, lay out some 500 pages of ethnographic and archaeological case studies to try and solve the puzzle. They admit our Ice Age forebears were not entirely unfamiliar with institutions of hierarchy and servitude, but insist they experienced these mainly in their dealings with the supernatural (ancestral spirits and the like). The invention of farming, they propose, led to the emergence of demographically extended “clans” or “descent groups”, and as it did so, access to spirits and the dead became a route to earthly power (how, exactly, is not made clear). According to Flannery and Marcus, the next major step on the road to inequality came when certain clansmen of unusual talent or renown – expert healers, warriors and other over-achievers – were granted the right to transmit status to their descendants, regardless of the latter’s talents or abilities. That pretty much sowed the seeds, and meant from then on, it was just a matter of time before the arrival of cities, monarchy, slavery and empire.

The curious thing about Flannery and Marcus’s book is that only with the birth of states and empires do they really bring in any archaeological evidence. All the key moments in their account of the “creation of inequality” rely instead on relatively recent descriptions of small-scale foragers, herders and cultivators like the Hadza of the East African Rift, or Nambikwara of the Amazonian rainforest. Accounts of such “traditional societies” are treated as if they were windows onto the Palaeolithic or Neolithic past. The problem is that they are nothing of the kind. The Hadza or Nambikwara are not living fossils. They have been in contact with agrarian states and empires, raiders and traders, for millennia, and their social institutions were decisively shaped through attempts to engage with or avoid them. Only archaeology can tell us what, if anything, they have in common with prehistoric societies. So, while Flannery and Marcus provide all sorts of interesting insights into how inequalities might emerge in human societies, they give us little reason to believe that this was how they actually did.

3. Did we really run headlong for our chains?

The really odd thing about these endless evocations of Rousseau’s innocent State of Nature, and the fall from grace, is that Rousseau himself never claimed the State of Nature really happened. It was all a thought-experiment. In his 1754 Discourse on the Origin and the Foundation of Inequality Among Mankind, where most of the story we’ve been telling originates, he wrote:

… the researches, in which we may engage on this occasion, are not to be taken for historical truths, but merely as hypothetical and conditional reasonings, fitter to illustrate the nature of things, than to show their true origin.

Rousseau’s State of Nature was never intended as a stage of development. It was not supposed to be an equivalent to the phase of “Savagery”, which opens the evolutionary schemes of Scottish philosophers such as Adam Smith, Ferguson, Millar or, later, Lewis Henry Morgan. These others were interested in defining levels of social and moral development, corresponding to historical changes in modes of production: foraging, pastoralism, farming, industry. What Rousseau presented is, by contrast, more of a parable. As emphasised by the Harvard political theorist Judith Shklar, Rousseau was really trying to explore what he considered the fundamental paradox of human politics: that our innate drive for freedom somehow leads us, time and again, on a “spontaneous march to inequality”. In Rousseau’s own words: “All ran headlong for their chains in the belief that they were securing their liberty; for although they had enough reason to see the advantages of political institutions, they did not have enough experience to foresee the dangers.” The imaginary State of Nature is just a way of illustrating the point.

We must conclude that revolutionaries, for all their visionary ideals, have not tended to be particularly imaginative, especially when it comes to linking past, present and future. Everyone keeps telling the same story. It’s probably no coincidence that today, the most vital and creative revolutionary movements at the dawn of this new millennium – the Zapatistas of Chiapas and Kurds of Rojava being the most obvious examples – are those that simultaneously root themselves in a deep traditional past. Instead of imagining some primordial utopia, they can draw on a more mixed and complicated narrative. Indeed, there seems to be a growing recognition, in revolutionary circles, that freedom, tradition and the imagination have always been, and will always be, entangled, in ways we do not completely understand. It’s about time the rest of us catch up, and start to consider what a non-Biblical version of human history might be like.


4. Changing the course of history

So, what has archaeological and anthropological research really taught us, since the time of Rousseau?

Well, the first thing is that asking about the “origins of social inequality” is probably the wrong place to start. True, before the beginning of what’s called the Upper Palaeolithic we really have no idea what most human social life was like. Much of our evidence consists of scattered fragments of worked stone, bone and a few other durable materials. Different hominin species coexisted; it’s not clear if any ethnographic analogy might apply. Things only begin to come into any kind of focus in the Upper Palaeolithic itself, which begins around 45,000 years ago, and encompasses the peak of glaciation and global cooling (c. 20,000 years ago) known as the Last Glacial Maximum. This last great Ice Age was then followed by the onset of warmer conditions and gradual retreat of the ice sheets, leading to our current geological epoch, the Holocene. More clement conditions followed, creating the stage on which Homo sapiens – having already colonised much of the Old World – completed its march into the New, reaching the southern shores of the Americas by around 15,000 years ago.

So, what do we actually know about this period of human history? Much of the earliest substantial evidence for human social organisation in the Palaeolithic derives from Europe, where our species became established alongside Homo neanderthalensis, prior to the latter’s extinction around 40,000 BC. (The concentration of data in this part of the world most likely reflects a historical bias of archaeological investigation, rather than anything unusual about Europe itself.) At that time, and through the Last Glacial Maximum, the habitable parts of Ice Age Europe looked more like Serengeti Park in Tanzania than any present-day European habitat. South of the ice sheets, between the tundra and the forested shorelines of the Mediterranean, the continent was divided into game-rich valleys and steppe, seasonally traversed by migrating herds of deer, bison and woolly mammoth. Prehistorians have pointed out for some decades – to little apparent effect – that the human groups inhabiting these environments had nothing in common with those blissfully simple, egalitarian bands of hunter-gatherers still routinely imagined to be our remote ancestors.

To begin with, there is the undisputed existence of rich burials, extending back in time to the depths of the Ice Age. Some of these, such as the 25,000-year-old graves from Sungir, east of Moscow, have been known for many decades and are justly famous. Felipe Fernández-Armesto, who reviewed The Creation of Inequality for The Wall Street Journal, expresses his reasonable amazement at their omission: “Though they know that the hereditary principle predated agriculture, Mr. Flannery and Ms. Marcus cannot quite shed the Rousseauian illusion that it started with sedentary life. Therefore they depict a world without inherited power until about 15,000 B.C. while ignoring one of the most important archaeological sites for their purpose.” Dug into the permafrost beneath the Palaeolithic settlement at Sungir was the grave of a middle-aged man buried, as Fernández-Armesto observes, with “stunning signs of honor: bracelets of polished mammoth-ivory, a diadem or cap of fox’s teeth, and nearly 3,000 laboriously carved and polished ivory beads.” And a few feet away, in an identical grave, “lay two children, of about 10 and 13 years respectively, adorned with comparable grave-gifts – including, in the case of the elder, some 5,000 beads as fine as the adult’s (although slightly smaller) and a massive lance carved from ivory.”

Such findings appear to have no significant place in any of the books so far considered. Downplaying them, or reducing them to footnotes, might be easier to forgive were Sungir an isolated find. It is not. Comparably rich burials are by now attested from Upper Palaeolithic rock shelters and open-air settlements across much of western Eurasia, from the Don to the Dordogne. Among them we find, for example, the 16,000-year-old “Lady of Saint-Germain-la-Rivière”, bedecked with ornaments made of the teeth of young stags hunted 300 km away, in the Spanish Basque country; and the burials of the Ligurian coast – as ancient as Sungir – including “Il Principe”, a young man whose regalia included a sceptre of exotic flint, elk antler batons and an ornate headdress of perforated shells and deer teeth. Such findings pose stimulating challenges of interpretation. Is Fernández-Armesto right to say these are proofs of “inherited power”? What was the status of such individuals in life?

No less intriguing is the sporadic but compelling evidence for monumental architecture, stretching back to the Last Glacial Maximum. The idea that one could measure “monumentality” in absolute terms is of course as silly as the idea of quantifying Ice Age expenditure in dollars and cents. It is a relative concept, which makes sense only within a particular scale of values and prior experiences. The Pleistocene has no direct equivalents in scale to the Pyramids of Giza or the Roman Colosseum. But it does have buildings that, by the standards of the time, could only have been considered public works, implying sophisticated design and the coordination of labour on an impressive scale. Among them are the startling “mammoth houses”, built of hides stretched over a frame of tusks, examples of which – dating to around 15,000 years ago – can be found along a transect of the glacial fringe reaching from modern-day Kraków all the way to Kiev.

Still more astonishing are the stone temples of Göbekli Tepe, excavated over 20 years ago on the Turkish-Syrian border, and still the subject of vociferous scientific debate. Dating to around 11,000 years ago, the very end of the last Ice Age, they comprise at least 20 megalithic enclosures raised high above the now barren flanks of the Harran Plain. Each was made up of limestone pillars over 5m in height and weighing up to a ton (respectable by Stonehenge standards, and some 6,000 years before it). Almost every pillar at Göbekli Tepe is a remarkable work of art, with relief carvings of menacing animals projecting from the surface, their male genitalia fiercely displayed. Sculpted raptors appear in combination with images of severed human heads. The carvings attest to sculptural skills, no doubt honed in the more pliable medium of wood (once widely available on the foothills of the Taurus Mountains), before being applied to the bedrock of the Harran. Intriguingly, and despite their size, each of these massive structures had a relatively short lifespan, ending with a great feast and the rapid infilling of its walls: hierarchies raised to the sky, only to be swiftly torn down again. And the protagonists in this prehistoric pageant-play of feasting, building and destruction were, to the best of our knowledge, hunter-foragers, living by wild resources alone.

What, then, are we to make of all of this? One scholarly response has been to abandon the idea of an egalitarian Golden Age entirely, and conclude that rational self-interest and accumulation of power are the enduring forces behind human social development. But this doesn’t really work either. Evidence for institutional inequality in Ice Age societies, whether in the form of grand burials or monumental buildings, is nothing if not sporadic. Burials appear literally centuries, and often hundreds of kilometres, apart. Even if we put this down to the patchiness of the evidence, we still have to ask why the evidence is so patchy: after all, if any of these Ice Age “princes” had behaved anything like, say, Bronze Age princes, we’d also be finding fortifications, storehouses, palaces – all the usual trappings of emergent states. Instead, over tens of thousands of years, we see monuments and magnificent burials, but little else to indicate the growth of ranked societies. Then there are other, even stranger factors, such as the fact that most of the “princely” burials consist of individuals with striking physical anomalies, who today would be considered giants, hunchbacks or dwarfs.

A wider look at the archaeological evidence suggests a key to resolving the dilemma. It lies in the seasonal rhythms of prehistoric social life. Most of the Palaeolithic sites discussed so far are associated with evidence for annual or biennial periods of aggregation, linked to the migrations of game herds – whether woolly mammoth, steppe bison, reindeer or (in the case of Göbekli Tepe) gazelle – as well as cyclical fish-runs and nut harvests. At less favourable times of year, at least some of our Ice Age ancestors no doubt really did live and forage in tiny bands. But there is overwhelming evidence to show that at others they congregated en masse within the kind of “micro-cities” found at Dolní Věstonice, in the Moravian basin south of Brno, Czech Republic, feasting on a superabundance of wild resources, engaging in complex rituals and ambitious artistic enterprises, and trading minerals, marine shells and animal pelts over striking distances. Western European equivalents of these seasonal aggregation sites would be the great rock shelters of the French Périgord and Spain’s Cantabrian coast, with their famous paintings and carvings, which similarly formed part of an annual round of congregation and dispersal.

Such seasonal patterns of social life endured, long after the “invention of agriculture” is supposed to have changed everything. New evidence shows that alternations of this kind may be key to understanding the famous Neolithic monuments of Salisbury Plain, and not just in terms of calendric symbolism. Stonehenge, it turns out, was only the latest in a very long sequence of ritual structures, erected in timber as well as stone, as people converged on the plain from remote corners of the British Isles, at significant times of year.

Careful excavation has shown that many of these structures – now plausibly interpreted as monuments to the progenitors of powerful Neolithic dynasties – were dismantled just a few generations after their construction. Still more strikingly, this practice of erecting and dismantling grand monuments coincides with a period when the peoples of Britain, having adopted the Neolithic farming economy from continental Europe, appear to have turned their backs on at least one crucial aspect of it, abandoning cereal farming and reverting – around 3300 BC – to the collection of hazelnuts as a staple food source. Keeping their herds of cattle, on which they feasted seasonally at nearby Durrington Walls, the builders of Stonehenge seem likely to have been neither foragers nor farmers, but something in between. And if anything like a royal court did hold sway in the festive season, when they gathered in great numbers, then it could only have dissolved away for most of the year, when the same people scattered back out across the island.

Why are these seasonal variations important? Because they reveal that from the very beginning, human beings were self-consciously experimenting with different social possibilities. Anthropologists describe societies of this sort as possessing a “double morphology”. Marcel Mauss, writing in the early 20th century, observed that the circumpolar Inuit, “and likewise many other societies . . . have two social structures, one in summer and one in winter, and that in parallel they have two systems of law and religion”. In the summer months, Inuit dispersed into small patriarchal bands in pursuit of freshwater fish, caribou and reindeer, each under the authority of a single male elder. Property was possessively marked and patriarchs exercised coercive, sometimes even tyrannical power over their kin. But in the long winter months, when seals and walrus flocked to the Arctic shore, another social structure entirely took over as Inuit gathered together to build great meeting houses of wood, whale-rib and stone. Within them, the virtues of equality, altruism and collective life prevailed; wealth was shared; husbands and wives exchanged partners under the aegis of Sedna, the Goddess of the Seals.

Perhaps most striking, in terms of political reversals, were the seasonal practices of 19th-century tribal confederacies on the American Great Plains – sometime or one-time farmers who had adopted a nomadic hunting life. In the late summer, small and highly mobile bands of Cheyenne and Lakota would congregate in large settlements to make logistical preparations for the buffalo hunt. At this most sensitive time of year they appointed a police force that exercised full coercive powers, including the right to imprison, whip or fine any offender who endangered the proceedings. Yet as the anthropologist Robert Lowie observed, this “unequivocal authoritarianism” operated on a strictly seasonal and temporary basis, giving way to more “anarchic” forms of organisation once the hunting season and the collective rituals that followed were complete.

Scholarship does not always advance. Sometimes it slides backwards. A hundred years ago, most anthropologists understood that those who lived mainly from wild resources were not, normally, restricted to tiny “bands”. That idea is really a product of the 1960s, when Kalahari Bushmen and Mbuti Pygmies became the preferred image of primordial humanity for TV audiences and researchers alike. As a result we’ve seen a return of evolutionary stages, really not all that different from the tradition of the Scottish Enlightenment: this is what Fukuyama, for instance, is drawing on, when he writes of society evolving steadily from “bands” to “tribes” to “chiefdoms”, then finally, the kind of complex and stratified “states” we live in today – usually defined by their monopoly of “the legitimate use of coercive force”. By this logic, however, the Cheyenne or Lakota would have had to be “evolving” from bands directly to states roughly every November, and then “devolving” back again come spring. Most anthropologists now recognise that these categories are hopelessly inadequate, yet nobody has proposed an alternative way of thinking about world history in the broadest terms.

Quite independently, archaeological evidence suggests that in the highly seasonal environments of the last Ice Age, our remote ancestors were behaving in broadly similar ways: shifting back and forth between alternative social arrangements, permitting the rise of authoritarian structures during certain times of year, on the proviso that they could not last; on the understanding that no particular social order was ever fixed or immutable.

Within the same population, one could live sometimes in what looks, from a distance, like a band, sometimes a tribe, and sometimes a society with many of the features we now identify with states. With such institutional flexibility comes the capacity to step outside the boundaries of any given social structure and reflect; to both make and unmake the political worlds we live in. If nothing else, this explains the “princes” and “princesses” of the last Ice Age, who appear to show up, in such magnificent isolation, like characters in some kind of fairy-tale or costume drama. Maybe they were almost literally so. If they reigned at all, then perhaps it was, like the kings and queens of Stonehenge, just for a season.

5. Time for a rethink?

Modern authors have a tendency to use prehistory as a canvas for working out philosophical problems: are humans fundamentally good or evil, cooperative or competitive, egalitarian or hierarchical? As a result, they also tend to write as if for 95 per cent of our species’ history, human societies were all much the same. But even 40,000 years is a very, very long period of time. It seems inherently likely, and the evidence confirms, that those same pioneering humans who colonised much of the planet also experimented with an enormous variety of social arrangements. As the anthropologist Claude Lévi-Strauss often pointed out, early Homo sapiens were not just physically the same as modern humans, they were our intellectual peers as well. In fact, most were probably more conscious of society’s potential than people generally are today, switching back and forth between different forms of organisation every year. Rather than idling in some primordial innocence, until the genie of inequality was somehow uncorked, our prehistoric ancestors seem to have successfully opened and shut the bottle on a regular basis, confining inequality to ritual costume dramas, constructing gods and kingdoms as they did their monuments, then cheerfully disassembling them once again.

If so, then the real question is not “what are the origins of social inequality?” but, having lived so much of our history moving back and forth between different political systems, “how did we get so stuck?” All this is very far from the notion of prehistoric societies drifting blindly towards the institutional chains that bind them. It is also far from the dismal prophecies of Fukuyama, Diamond et al. where any “complex” form of social organisation necessarily means that tiny elites take charge of key resources, and begin to trample everyone else underfoot. Most social science treats these grim prognostications as self-evident truths. But clearly, they are baseless. So, we might reasonably ask, what other cherished truths must now be cast on the dust-heap of history?

Quite a number, actually. The first bombshell on our list concerns the origins and spread of agriculture. There is no longer any support for the view that it marked a major transition in human societies. In those parts of the world where animals and plants were first domesticated, there actually was no discernible “switch” from Palaeolithic Forager to Neolithic Farmer. The “transition” from living mainly on wild resources to a life based on food production typically took something in the order of 3000 years. While agriculture allowed for the possibility of more unequal concentrations of wealth, in most cases this only began to happen millennia after its inception.

In the time between, people in areas as far removed as Amazonia and the Fertile Crescent of the Middle East were trying farming on for size, “play farming” if you like, switching annually between modes of production, much as they switched their social structures back and forth. Moreover, the “spread of farming” to secondary areas, such as Europe – so often described in triumphalist terms, as the start of an inevitable decline in hunting and gathering – turns out to have been a highly tenuous process, which sometimes failed, leading to demographic collapse for the farmers, not the foragers.

Clearly, it no longer makes any sense to use phrases like “the agricultural revolution” when dealing with processes of such inordinate length and complexity. Since there was no Eden-like state, from which the first farmers could take their first steps on the road to inequality, it makes even less sense to talk about agriculture as marking the origins of rank or private property. If anything, it is among those populations – the “Mesolithic” peoples – who refused farming through the warming centuries of the early Holocene that we find stratification becoming more entrenched; at least, if opulent burial, predatory warfare and monumental buildings are anything to go by. In at least some cases, like the Middle East, the first farmers seem to have consciously developed alternative forms of community, to go along with their more labour-intensive way of life. These Neolithic societies look strikingly egalitarian when compared to their hunter-gatherer neighbours, with a dramatic increase in the economic and social importance of women, clearly reflected in their art and ritual life (contrast here the female figurines of Jericho or Çatalhöyük with the hyper-masculine sculpture of Göbekli Tepe).

Another bombshell: “civilisation” does not come as a package. The world’s first cities did not just emerge in a handful of locations, together with systems of centralised government and bureaucratic control. In China, for instance, we are now aware that by 2500 BC, settlements of 300 hectares or more existed on the lower reaches of the Yellow River, over a thousand years before the foundation of the earliest (Shang) royal dynasty.

On the other side of the Pacific, and at around the same time, ceremonial centres of striking magnitude have been discovered in the valley of Peru’s Río Supe, notably at the site of Caral: enigmatic remains of sunken plazas and monumental platforms, four millennia older than the Inca Empire.

Such recent discoveries indicate how little is yet truly known about the distribution and origin of the first cities, and just how much older these cities may be than the systems of authoritarian government and literate administration that were once assumed necessary for their foundation. And in the more established heartlands of urbanisation – Mesopotamia, the Indus Valley, the Basin of Mexico – there is mounting evidence that the first cities were organised on self-consciously egalitarian lines, municipal councils retaining significant autonomy from central government. In the first two cases, cities with sophisticated civic infrastructures flourished for over half a millennium with no trace of royal burials or monuments, no standing armies or other means of large-scale coercion, nor any hint of direct bureaucratic control over most citizens’ lives.

Jared Diamond notwithstanding, there is absolutely no evidence that top-down structures of rule are the necessary consequence of large-scale organisation. Walter Scheidel notwithstanding, it is simply not true that ruling classes, once established, cannot be gotten rid of except by general catastrophe.

To take just one well-documented example: around 200 AD, the city of Teotihuacan in the Valley of Mexico, with a population of 120,000 (one of the largest in the world at the time), appears to have undergone a profound transformation, turning its back on pyramid-temples and human sacrifice, and reconstructing itself as a vast collection of comfortable villas, all almost exactly the same size. It remained so for perhaps 400 years. Even in Cortés’s day, Central Mexico was still home to cities like Tlaxcala, run by an elected council whose members were periodically whipped by their constituents to remind them who was ultimately in charge.

The pieces are all there to create an entirely different world history. For the most part, we’re just too blinded by our prejudices to see the implications. For instance, almost everyone nowadays insists that participatory democracy, or social equality, can work in a small community or activist group, but cannot possibly “scale up” to anything like a city, a region or a nation-state. But the evidence before our eyes, if we choose to look at it, suggests the opposite. Egalitarian cities, even regional confederacies, are historically quite commonplace. Egalitarian families and households are not.

Once the historical verdict is in, we will see that the most painful loss of human freedoms began at the small scale – the level of gender relations, age groups and domestic servitude – the kind of relationships that contain at once the greatest intimacy and the deepest forms of structural violence. If we really want to understand how it first became acceptable for some to turn wealth into power, and for others to end up being told their needs and lives don’t count, it is here that we should look. Here too, we predict, is where the most difficult work of creating a free society will have to take place.

This piece appears courtesy of Eurozine


India has experienced an explosive economic rise, one that has increased its power on the world stage while driving inequality to new extremes. Even as tycoons exert huge power over business and politics, millions remain trapped in slums. Corruption is endemic. Can India become the world's next great superpower? In his book "The Billionaire Raj: A Journey Through India's New Gilded Age" (Oneworld), journalist James Crabtree, who spent five years as Mumbai bureau chief for the Financial Times, explores this question.

Why did you write this book?

I used to joke that what I was doing was hardly unprecedented, given it's another book about India from a white British journalist, of which there have been a fair number.

Typically the way that people from outside view India is through the political capital, not the financial capital, and so in five years based in Mumbai rather than Delhi, I saw a different kind of India – one that was about wealth and finance and the power that comes from money, as opposed to the power that comes from votes. As a Western journalist for a global publication, I was lucky enough to get very good access to what in the book I call the Bollygarchs – the new super-rich stratum that has grown in India over the last 20 years. And so I thought I had a perspective that, even as an outsider, would be a useful contribution to the wider debate about India's future.

We all know in some basic sense that India is important, but we don't think about it very much. We think a lot about China. We spend a lot of time thinking about really quite small countries like Hungary, Turkey, Italy. India is underappreciated for its enormous significance for the remainder of the century, at the moment in which China is degrading into a form of Leninist autocracy. We all have an enormous stake in the success of India as a society, not simply from the basic point that there are 1.3 billion people who live there, and therefore at any level it is significant what happens there, but also because those of us in the West who are democrats and liberals hope that the Indian tradition of liberalism, tolerance and secularism will continue to flourish. At the moment there's good news and bad news on that front. My argument in the book is that India is going to play a critical role in the future of the world over the next century.

Why do you think India is underappreciated?

There is a lot going on in the world at the moment, and I'm not saying this is anybody's fault. You have Trump, the rise of China, war in Syria. My argument is that over the long term, the success of Indian democracy and whether or not India becomes a tolerant free market parliamentary democracy by the middle of this century, or whether it could take another path, is one of the most consequential things for the future of the world. And so understanding the stage that India is going through is something that we should all be interested in, in much the same way that we are interested in the internal politics of America or China. These things have enormous consequences far beyond their borders.

How does the Indian tycoon class interact with politics?

The tycoon class and the political class are intimately wrapped up with one another. If you are an industrialist, you need politicians to do almost anything - if you want to build a steel plant, a road, an iron ore mine, you need things that only the government can provide. If you're a politician, you need to win elections, which are very expensive. There's no state funding of political parties. According to one estimate, the last election cost $5 billion. Almost all of that money was provided illegally under the table by large businesses, given to the political parties' elite - a quasi-corrupt system. One way of describing this is crony capitalism. It is writ large in the relationship between politics and capital. But you also have individual cases where the business and political elite will collude with one another in order to enrich themselves, not to deliver the public good. This was at its zenith maybe 10 years ago; there was a series of jaw-dropping corruption scandals. It was called the “season of scandal”, in which telecom spectrum or mining rights, whatever, were gifted to the tycoons who owned the big conglomerates.

Under the current Prime Minister Narendra Modi, elected in 2014, the most eye-catching scandals have stopped. He has mostly put a stop to the worst of it. But India still has big problems of corruption that it needs to tackle in all areas of public life.

Modi was elected promising to end corruption. What is his response to this crony capitalism?

He has stopped the most egregious crony capitalism in Delhi, which was happening almost literally around the Cabinet table. No one accuses Modi himself of being corrupt (much as with the last prime minister). He is an honest man, and to some degree he has read the riot act to these tycoons. Some of them have been frozen out. But Modi's progress in cleaning up corruption has been mixed. He introduced an initiative called demonetisation, in which most of the country's banknotes were scrapped. This was a crazy initiative that in theory was designed to combat corruption but which did nothing of the sort. Underlying all this you have the problem that I mentioned before about the nature of political funding, which remains unsolved.

The rise of the super rich has gone hand-in-hand with growing economic inequality.

This is a story about globalisation. India became independent from Britain in 1947 and then for about 40 years closed itself off from the world under the socialist planned economy which they called a “license raj”, because it was controlled by licenses, quotas and permits which decided how much you could produce if you were a business. They literally said you could only produce ten cars, or whatever it was. That only ended in 1991, when India opened itself up to the world. In a very simple sense, money came in, India re-globalised, it re-entered the global economy, and that played havoc with all sorts of things - the regulatory system was not able to cope. India's economy started growing very quickly, which meant that things became more valuable, which meant that the returns for corruption were much higher. It also meant an extraordinary explosion of wealth at the very top of Indian society. The top 10 per cent of Indian society have done very well over the last 20 years. Although in absolute terms the other 90 per cent have done much better too, in relative terms they have fallen farther and farther behind. And that gets more profound the further down you go.

India has always been a very unequal country. You have the caste system which everybody knows about. But you also have inequalities based upon religion, language and region. Everyone knows India is an unequal society but people haven't noticed how much more unequal it has become.

This problem of inequality itself is a profound one. If India wants to follow the countries of East Asia down a successful development path, moving from poverty to middle income and eventually to becoming a rich country sometime in the second half of the century, it is unlikely to be able to do that unless it fixes three big problems: the rise of the super-rich and the inequality that comes with it, crony capitalism, and the dysfunctional investment model that underpins its growth.

Modi’s background is in the Hindu nationalist movement. How does he negotiate his relationship with the more extreme elements of the movement?

You have to start from the position that Modi’s background is as a member of the more extreme elements of the Hindu right. That was his route into politics. He comes from a poor background; in this sense he's a model of Indian social mobility. It's very unusual for a lower-caste son of a tea seller to become a senior politician, let alone prime minister. He's only the second lower-caste Indian prime minister ever. So he had this background in the Hindu religious right, and that is part of what makes him popular. He appeals very much to a constituency in India which believes in Hindu nationalism. He is a Hindu nationalist.

On the other hand he also believes in economic development, and these are the two, to some degree contradictory, parts of his belief system. In a sense it's a bit like the Republican Party in America – you have the Christian right and the pro-business people. Modi has a similar coalition that he has to try to keep on board – the cultural and religious conservatives, and the more pro-business types. It's a balance as to which of these is going to be in the lead.

It tells you something about India that at the moment, Modi isn't seen as an extremist anymore, because the Hindu right has moved even farther to the right.

When he was elected, there was anxiety about what it might mean for communal tension. Has that been borne out?

Modi is unpopular amongst the extremes of the Hindu nationalist right because he's not carrying out enough of their agenda - rewriting textbooks, changing the law so that Hindus are recognised as the primary group in Indian society, particularly to the detriment of Muslims. At its most extreme there is a Hindu chauvinist tradition that is analogous to Islamism, although it doesn't tend to be violent in the same way. Modi arose in an earlier variant of that tradition, but he has since moderated, at least to the extent that it's in abeyance because he also believes in the church of economic development.

But I think it's clear that Hindu nationalism is gaining in strength in India and secularism is almost a defeated force. It's not impossible that it may be revived, but the old secularist tradition is in a pretty weakened state. That is also wrapped up in the broader story of globalisation. India has been globalised. This has caused economic prosperity, it has lifted hundreds of millions of people out of poverty, but it has also created a world that is alien and confusing, a world in which women are entering the workforce in a traditionally patriarchal society, in which people who previously didn't have money suddenly do. In those circumstances, people often seek solace in traditional identities, and therefore you've seen this upswing in people interested in the heritage of Hinduism. Some take that to an extremist point of view.

Is there a risk of increased authoritarianism in India?

I think Modi is likely to win a second term in 2019, but he is unlikely to win such a strong majority as he did last time - he won a hugely unusual majority. In that sense, the odds of him becoming a kind of Erdogan figure lessen. I don't think this is likely, but the nightmare scenario is that Modi or the person who comes after him decides to follow a more overtly anti-democratic path. There certainly are a lot of liberals in India who worry about mirroring Turkey or Hungary. That is more to do with the health of the country's liberal institutions. The court system is in a great mess. Many of the institutions that you would expect to protect liberal rights do not appear to be doing so. In addition to the fear that Modi himself may have a populist authoritarian tinge, there is a worry about the state of India's constitutional apparatus and its ability to protect minorities in particular.

Support more videos like this at patreon.com/rebecca!

Transcript:

New year, new you! Old you was lazy and dumb. New you is productive and smart! And of course, old you was fat while new you is sexxxy. According to surveys, the top resolutions this year are the same as every year: eat healthier, exercise more, lose weight. And for most Americans, those are really good goals, because we are very fat and unhealthy. Yes, myself included — I’ve gained about 15 pounds since Trump was elected two years ago, and while I don’t have problems with my cholesterol and I’m nowhere near to being at risk for Type II diabetes or other problems linked with obesity, I do have a degenerative spinal disc that I’ve dealt with since I was a teen, and when my weight creeps up, it literally cripples me. So yeah, I have to join the boring masses with the same boring resolution: be less fat.

I say that it happened since the election because I’ve noticed that I’ve been eating more and drinking more alcohol since then, and alcohol has shitloads of calories. It’s easy to drink and drink without filling up, and once you’re drunk it’s easy to raid the fridge for any fatty, salty, delicious thing you can find. If you drink enough whiskey and then eat enough mozzarella sticks, you may be able to forget about Donald Trump for up to four hours. It’s remarkably effective, but yeah, not great for your body.

That’s why cutting out alcohol is a good first step if you want to lose weight and be healthier, and for several of my friends out here in California, they’ve found it pretty easy to drop alcohol — and replace it with weed. Cannabis is easy and legal to get here, and if you drink alcohol to be more social or to relax at the end of the day, weed can be a good replacement for some people. And here’s the kicker: weed can help you lose weight.

Biologists at South Bend University have just published a meta-analysis in Cannabis and Cannabinoid Research, the premier journal for scientists who are also giant potheads, showing clear evidence that people who use cannabis have lower BMIs than non-users. Digging further into the data allowed the researchers to offer some hypotheses about what might be causing that link.

One hypothesis is the one that I stated: potheads, particularly younger people, may be drinking less alcohol, and also just straight up being too lazy to make mozzarella sticks. On the other end, older people who start using cannabis may actually become more active, since it can help with pain relief and other medical issues.

But there is one confounding factor: the researchers report that in several studies, frequent cannabis users had lower BMI but higher caloric intake compared to non-users. And not a small amount — we’re talking 500 to 600 calories, like an extra giant donut a day, and in one study up to 1,000 calories more. That’s tough to work out in the gym — as they say, you can’t outrun your fork, so the researchers looked at other possible explanations.
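
(To put rough numbers on “you can’t outrun your fork”, here’s a back-of-the-envelope sketch in Python. The 3,500-calories-per-pound figure is the usual crude rule of thumb, not something taken from the meta-analysis.)

    # Rough arithmetic only; 3,500 kcal per pound of body fat is a crude rule of thumb.
    KCAL_PER_POUND = 3500

    for daily_surplus in (500, 600, 1000):  # the extra intakes reported in the studies
        pounds_per_week = daily_surplus * 7 / KCAL_PER_POUND
        print(f"{daily_surplus} extra kcal/day is roughly {pounds_per_week:.1f} lb gained per week, all else equal")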

Their main hypothesis is that cannabis stimulates a receptor in your brain that can, basically, amp up your metabolism and make you less likely to absorb the calories from the typical processed-food Western diet. Their paper includes some observations that suggest this might be true, but it’s worth noting that there is no direct, clinical evidence.

But more concerning is the initial assumption that all of this conjecture is based upon: the idea that if you use cannabis, you consume more calories but have a lower BMI. The researchers base this statement on four studies they examined. Let’s go through them one by one:

In “Cannabis use in relation to obesity and insulin resistance in the Inuit population”, the researchers note that the subjects who used marijuana had higher caloric intake, but it wasn’t statistically significant. Honestly, if that’s the case it should not have even been mentioned. Statistical significance is there for a reason — it helps us tell what is likely to be a real effect and what is more likely to just be statistical noise, or random chance. Not statistically significant? Throw it in the trash.
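
(If you want to see what “statistically significant” is doing there, here is a toy sketch in Python with made-up numbers, not data from any of the studies discussed, showing how a two-sample t-test decides whether an observed difference is distinguishable from random noise.)

    # Toy illustration only: simulated calorie intakes, not data from the Inuit study.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    # Two groups drawn from the SAME distribution (~2,500 kcal/day, lots of person-to-person noise).
    users = rng.normal(loc=2500, scale=600, size=50)
    non_users = rng.normal(loc=2500, scale=600, size=50)

    t_stat, p_value = stats.ttest_ind(users, non_users)
    diff = users.mean() - non_users.mean()
    print(f"observed difference: {diff:.0f} kcal/day, p = {p_value:.2f}")
    # A gap of tens or even hundreds of calories can still come with p > 0.05,
    # i.e. the kind of difference you would expect from chance alone, which is
    # why a non-significant intake difference should not be treated as a real effect.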

In “Marijuana use, diet, body mass index, and cardiovascular risk factors,” they found cannabis users to have an increased caloric intake of 600 calories but about the same BMI as non-users; however, the calories were self-reported. No one in a lab was determining how many calories each of 3,600 people was consuming every day. So really all we can say is that cannabis users think they eat more than non-users think they eat, but also they’re fucking high, so if they’re anything like me when I’m high, they’re not really the most trustworthy sources of information. The same is true of the next study: in “Dietary intake and nutritional status of US adult marijuana users: results from the Third National Health and Nutrition Examination Survey,” they used a survey, aka self-reported data, to determine caloric intake. They found that potheads said they ate 500 calories more than non-potheads but had slightly lower BMI. Did they? Maybe! But to be sure we’d need to put people in a clinical setting where their food intake is strictly monitored along with their weight, and then we can know for sure.

Luckily, that’s exactly what happened in the fourth study cited by these researchers! In “Effects of smoked marijuana on food intake and body weight of humans living in a residential laboratory,” they put six men in a laboratory for about two weeks and monitored everything they ate while giving them two joints per day to smoke. Yes, six men isn’t a great sample size but it IS a good start and I do find it a bit more convincing than asking people to self-report their calories.

Sure enough, in this lab setting the men did consume more calories — not during mealtimes, but during snacktimes, taking in a whopping extra 1,000 calories per day. And how much weight did these guys lose? Well, let me just quote the authors: “Increases in body weight during periods of active marijuana smoking were greater than predicted by caloric intake alone.” Increases in weight were greater than predicted by calories alone. They gained weight, and they gained more weight than the researchers thought possible by how many calories they were taking in. That’s…that’s not what the meta-analysis seemed to take away from that study.

So sadly I’m not sure that there’s really anything there with regards to cannabis helping you lose weight, though I will state with a high (haha) degree of confidence that if you are a heavy drinker and you swap alcohol for weed, and if you try to limit your snacking while high, you’re gonna lose weight. But it may not be the magical weight-loss cure the meta-analysis authors were hoping for.

The post New Report Claims Weed Helps You Lose Weight! Not So Fast… appeared first on Skepchick.

Boys learning about population growth at a Yorkshire school, 1922

This article is a preview from the Winter 2018 edition of New Humanist

In the film The Man Who Knew Infinity (2015) the young Indian prodigy Srinivasa Ramanujan travels from his home in Madras to Cambridge to work with some of the brightest minds in pure maths. All men, of course, but then this was 1916. One hundred years on, Cambridge University’s Department of Pure Mathematics is still the desired destination for many brilliant young mathematicians. Amongst the renowned algebraists, geometers and analysts on its permanent academic staff, there are now four women. The other 39 are men. Other centres of mathematical excellence across the world have similar ratios – or worse. Mathematics in 2018 is not an equal opportunities profession nor, at the present rate of progression, does it seem likely to be.

This summer in Brazil, four winners of Fields Medals were announced at the International Congress of Mathematics. The Medals, seen as the Nobel Prize of mathematics, are awarded every four years. Amongst the dozen mathematicians being talked about as possible winners this year were two women: the Ukrainian Maryna Viazovska, known for her work in the geometry of sphere-packing, and Sophie Morel, a French mathematician who specialises in number theory. Neither won; the medals were awarded to Peter Scholze, Alessio Figalli, Akshay Venkatesh and Caucher Birkar. There have now been 62 recipients of Fields Medals since they were first awarded in 1936. Of these, 61 have been men.

Over a dozen years ago, one of the most prominent British Fields Medallists, Sir Timothy Gowers, argued that the small number of top female mathematicians was both “puzzling” and “regrettable”. Despite Gowers’s concern, and despite initiatives in the UK like the Equality Challenge Unit’s SWAN programme, established in 2005 to encourage women’s careers in science and mathematics, change is frustratingly slow. The London Mathematical Society’s most recent report into gender and mathematics concludes that there is still a “surprisingly small proportion of women studying for a PhD and worryingly few promoted to professor”. The latest research suggests only one in ten maths professors in Britain is female.

This does not appear to be due to direct discrimination. In her 18 years as a research mathematician, Colva Roney-Dougal, Professor of Pure Maths at St Andrews University, cannot think of any occasion when she was treated differently because of her gender, at least not on the surface. “I have got used to being the only woman in the room, and am comfortable with that, to be honest,” she says.

But Roney-Dougal wonders if she really did have the same chances as male colleagues. “The common criticism of female mathematicians is that we’re quite good at all the plodding stuff but we don’t come up with the big connecting, surprising ideas,” she says. “In my first few years as a mathematician, I did quite a lot of what is seen as classic female work, building databases, which you have to get right. It’s not easy but if you work hard you will get there in the end. Had I been more arrogant about my ability I might not have spent all those years on those projects.”

This lack of professional confidence seems to be crucial. A recent American study showed that women are still under-represented in academic disciplines where people believe innate talent or brilliance is required to succeed. And there is no academic subject that values innate talent and natural brilliance more than mathematics.

“These days there are hardly any tangible obstacles to a woman’s career in maths,” says Julia Wolf, reader in Pure Maths at Bristol University, “but a lot of small things come together to make it more difficult for women to succeed.”

One of those “small” things that young male students seem to have, and young women don’t, is bluster: the courage to take risks, the confidence to bluff. “I’m a personal tutor for first-year students, and when something is hard, the boys say, ‘It’s fine’, whereas girls say, ‘I don’t think I can do this’. Yet there’s no difference in their ability,” Wolf continues. “When I was an undergraduate at Cambridge, we had around 25 per cent women in the first year in maths. But in the third year, on courses that were considered difficult, all of a sudden, I was on my own. All of the girls had disappeared to easier options. I found myself alone amongst dozens of men in a lecture about Hilbert spaces.”

A Hilbert space is a vector space equipped with some notion of length and angle that is complete in the sense that a Cauchy sequence of vectors always converges to a well-defined limit within that space. To understand all that, of course, you need to know how the words “vector”, “limit” and “space” are used in mathematics and what is so crucial about a “Cauchy sequence”. That’s why contemporary mathematics is so difficult. To grasp one concept, you have to know many, many others.
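
(For anyone who wants to see what that paragraph is describing in symbols, here is a minimal textbook-style sketch in standard notation; it is not taken from the article.)

    A Hilbert space $H$ is a vector space with an inner product
    $\langle \cdot,\cdot \rangle \colon H \times H \to \mathbb{C}$,
    which supplies a length $\|x\| = \sqrt{\langle x, x \rangle}$ and,
    in the real case, an angle via $\cos\theta = \langle x, y \rangle / (\|x\|\,\|y\|)$.
    Completeness means that every Cauchy sequence $(x_n)$ in $H$, i.e. one with
    $\|x_m - x_n\| \to 0$ as $m, n \to \infty$, converges to some limit $x \in H$.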

Hilbert spaces are one of many daunting peaks in an undergraduate degree in maths and they make studying the subject at university as precarious as rock climbing. Many male students, with the learned arrogance of their gender, saunter into this treacherous terrain and assume they will find the right path as they go along. The best of them scramble up with an effortless cleverness, throwing down a challenge to those below. Female students often appear to conclude they don’t have the same talent to survive. What they don’t recognise is that being able to scale peaks with little effort is not always a predictor of future success. “People who came across as the smartest students when I was studying for my first degree at Cambridge didn’t necessarily go on to become the best research mathematicians,” Wolf points out.

To push the boundaries of mathematical research requires patience as much as cleverness, determination as well as technical skill, the ability to collaborate as well as the concentration to work out answers on your own. Yet these qualities are down-played in a mathematical culture that can be as vicious as a boxing match, even if the struggle is between competing minds rather than bodies.

This competition starts early – the junior Mathematical Olympiad in Britain is open to 11-year-olds – and never really stops. It is not just number theorists who rank their colleagues in descending order of merit. “Mathematicians I know say there are Fields Medallists and there are good Fields Medallists,” says Alison Etheridge, a professor at Oxford University. “Some years ago I was at a meeting with people I hadn’t seen for 10-15 years and I was struck by how unpleasant the behaviour of some of them was. I’m now tough enough to deal with it but, when I was starting out, it wasn’t so easy to cope with this very aggressive, very competitive style.”

This style can be traced back to the real hero of The Man Who Knew Infinity – not Srinivasa Ramanujan but Godfrey Harold Hardy, the maths professor who brought Ramanujan to Cambridge. Hardy, still a towering influence in British mathematics even though he died in 1947, lived in a world of men. He viewed the ideal mathematician as a mental athlete who does his best work in his 20s and 30s. “More than any other art or science,” he wrote, “mathematics is a young man’s game.” It still is. Fields Medals are only awarded to mathematicians under the age of 40.

The obsession with youth in mathematics works against women. The American mathematician Claudia Henrion collected studies and personal testimony that suggest that women do their best maths after the age of 40, not least because many take time off to bring up children and cannot complete the three post-doc research posts around the world which are almost a requirement today. “You finish your PhD and then you are moving around from continent to continent,” says Roney-Dougal. “Some mathematicians I have heard of have had to work on three continents in five years, and to still be doing that in your 30s is remarkably tough. It’s not too bad for me because I haven’t got kids and don’t want them, but I’ve known female friends with family commitments who couldn’t do all that moving, and so they gave up maths. We are losing many of our best mathematicians.” Indeed, research in America which looked at the hiring of staff in maths-based sciences showed that, if anything, women were more likely to be successful at interview than men. The problem was that fewer women applied for jobs in the first place.

The main challenge to achieving a healthier gender balance in mathematics is no longer the need to persuade more girls to study maths at university, but to find ways to ensure the brighter female maths students stick with the subject and make it their profession. One obvious reform is to develop career paths in mathematics that take children into account. Another is to break with the expectation that post-docs must turn themselves into intellectual nomads roaming the mathematical world before they are rewarded with a secure job. Yet another is to arrange more female-only gatherings in areas of maths where there are few or no women. All these are important not just for the future health of mathematics but because the existence of this discrimination at the heart of British science and life allows us all to think there are some things that men are just better at than women.

But there is something else that is as crucial and without which these practical remedies will never succeed. It is to start changing the culture in which mathematics is practised. The world of elite mathematics still prizes youthful brilliance above slow, steady development and does not seem to mind that this brilliance often brings with it an emotional immaturity that is expressed in lack of concern for colleagues. It is an immaturity that would not be entertained in other academic disciplines but is seen in mathematics as the price of genius. And it is glorified by the veneration of Hardy and films like The Man Who Knew Infinity whose message could be read as “if you haven’t done your best work when you are young, you might as well give up”.

The result is a culture that, from competing for an Olympiad medal at 11 to achieving that elusive Fields Medal before you are 40, rewards male competitiveness and tolerates what often comes with it: rudeness and lack of empathy. And it says to many bright young women that a career in mathematics is not for the likes of you.

Support more videos like this at patreon.com/rebecca!

Transcript:

Big astronomy news, everyone: NASA is currently exploring the most distant object humans have ever reached! It’s a rock located about a billion miles past Pluto, and it’s shaped like a peanut because it’s actually two rocks that smashed together and are now spinning around as one, held together by their own gravity. This lovely little peanut was designated 2014 MU69. NICE.

I’m not sure why, but the scientists involved in the mission got tired of calling it MU69, which again is a perfectly sweet name. So instead they named it Ultima Thule, which of course is the mythical Arctic landmass thought by Nazi mystics to be the home of giant ubermensch. Wait, what? Ah shit.

To be completely honest, I had never heard of the Nazi legend of a place where enormous honkies strode the frozen landscape, so my first thought upon finding out that NASA chose this name was, “Oh, they probably didn’t know either.” So before I pulled out my pitchfork, I decided to do a little thing called research, unlike all those triggered SJWs on Twitter.

I quickly found a very informative Newsweek article describing the issue in a fair and balanced way. Sure enough, they quote Mark Showalter, who is an investigator on the mission and who led the naming process, as saying, “I had never heard the term Ultima Thule before.” He saw that it meant “beyond the limits of the known world” and thought it was good so they made that the name, and now they see it’s bad so they’ll probably change it. Okay, case closed!

Oh heck let’s keep reading, just for fun. Ultima Thule “was one of about 34,000 names submitted by an online nomination process.” Oh! So, they didn’t just research some names and then pick one. They let random people submit names. On the Internet. The Internet, which is currently flooded with Nazis. I mean, okay, they might not realize how many Nazis are on the Internet so they probably still didn’t realize a name like that might be related to Nazis. Let’s read on.

“As he and his colleagues began narrowing down a list of final contenders, Showalter did stumble on the less palatable meaning of Ultima Thule, which was appropriated in the 19th century to refer to the mythological homeland of the Aryan race.”

Wait, so they did know it was a Nazi thing before they made that the official name. And yes, being the specific favored mythical land of Nazis is less palatable than simply being a general term for an undiscovered place. So wait, why did they pick it? Let’s see…it says here that the original meaning wasn’t “place that Nazis love” and so “Showalter said that NASA…balanced the term’s more recent past against its original meaning.” If only there was some other thing that used to be fine but then got appropriated by Nazis and now we don’t use it anymore, like an ancient Hindu spiritual symbol known as the swastika, or the name Adolf. “I was hesitant at first to name my baby Adolf Hitler Showalter but then I balanced the name’s original Latin meaning of “Noble Wolf Hitler” against the name’s more recent past and thought it would be fine.”

They do point out that, like me, a lot of people aren’t familiar with the Nazi version of Ultima Thule, which probably made them think it was more okay than naming the object “2014 Adolf Hitler.” At first that seemed to make sense to me, until I remembered that in the past few years I’ve learned so much about Nazis, both old and new, and particularly about how the new ones have their own distinct language and codes that they use to communicate without being immediately noticed as Nazis by general society. Like, certain hand signals, and loads of tattoos with various crosses and wolfheads. Or 1488! I had no idea what that was until a few years ago. “14” refers to the 14 words that make up a much-loved-by-Nazis statement about white supremacy, and 88 refers to Heil Hitler, since “H” is the 8th letter in the alphabet. If you accidentally named an asteroid “1488” you might be forgiven, but if you chose 1488 from a list of things submitted by randos online, and then you Googled it and saw that it was often used by Nazis, why would you still use it???

To answer that, let’s keep reading Newsweek. “In the end, (NASA) decided to include it in the popular vote…” Oh! They put up the top names to popular online vote and that one won. Well, I suppose I could fault them for putting a known Nazi-related name into the popular vote and then letting the Internet, again a place absolutely infested with Nazis, decide, but hey, if Ultima Thule won the popular vote then I guess they have to abide by that. Don’t want another Boaty McBoatface situation.

Oh wait, that sentence wasn’t over. Let me read the entire thing aloud. “In the end, they decided to include it in the popular vote, where it fared quite well, coming in seventh out of the 37 options.”

SEVENTH? The Nazi name came in SEVENTH and you still picked it? I actually started out assuming the best of Showalter and the rest of the New Horizons staff but holy fuck guys I think you might have a Nazi on your team. IT CAME IN SEVENTH! What were one through six? Cunty McCuntface? Kitten Killer 6969? Donald Trump’s Gooch Hair? Seriously, fuck you guys.

The good news is that it’s only a temporary name. The better news is that the guys who chose a Nazi name that came in seventh in an online poll don’t get to choose its permanent name — that has to go through the International Astronomical Union, who hopefully have higher standards than “came in seventh in an online poll.” Fucking Christ. Allow me to end with the Showalter quote that Newsweek ended with: “We’re very, very tired of talking about 2014 MU69,” Showalter said. “Any name is better than 2014 MU69.”

Is it though?

The post Why Nazis are Now Super Into NASA appeared first on Skepchick.

Moscow cathedral

This article is a preview from the Winter 2018 edition of New Humanist

Coverage of the 100th anniversary of the Russian revolution last year suggested its meaning and significance could be reduced to a few simplistic stances. There were “celebrations” by some left-wing groups and networks. Mainstream commentators explained, once again, that 1917 was a “wrong turning” which led inevitably to the horrors of Stalin’s Gulag prison camps. Perhaps the most common claim was that Marxism is irrelevant now, a set of 19th-century ideas that may have resonated in their time, but can now be consigned to “history”.

This last position denies that communism aimed to address problems of inequality and oppression which are still very much with us. But all the key responses to the 1917 centenary obscured how communist politics were always complex and plural. The radical left-wing parties which formed around the world after the Russian revolution tried a great variety of strategies and tactics in their pursuit of revolutionary change in the 20th century. This is underlined by the various different approaches which Marxists took to religion – and it is worth looking again at this record. In both negative and positive ways, the communist experience can suggest effective steps for those who want to promote the cause of reason and apply secular principles in political and social life today.

It is widely held that the Bolsheviks promoted atheism from the moment they took power. However, this is not the case. Lenin’s revolutionary colleagues saw themselves as “rationalist” and “scientific”, to be sure, and thorough-going criticism of religious beliefs and of the reactionary social influence of the Russian Orthodox Church was part of this. But atheism was not a significant aspect of their political programme, and there were divisions between Bolshevik leaders about what approach to take to religious believers.

During the Civil War which immediately followed the revolution, there were serious moves to reduce the church’s authority and social position. Bishops and priests were identified as enemies of socialist change in these years of sharp upheaval. Their institutions had, after all, supported the cruel autocracy of the Tsars. But even when there were arrests and executions of religious officials judged to be supporting counter-revolution, these steps were not an attempt to impose atheism. The 1918 legislation separating church and state did not prohibit public worship as such, or specify that churchgoers should be persecuted on account of their beliefs.

Some revolutionary communists actually appealed to people’s religious values with the aim of mobilising them in support of Bolshevik initiatives. There were initial successes: most of the groupings which made up the Jadid Islamic reform movement in the former Tsarist empire decided to support Lenin’s government. But such potential for alliance-building was undermined by insincere and cynical attempts to engage the faithful. At the Congress of Eastern Peoples, held in the Azerbaijani capital Baku in 1920, the atheist and Communist International leader Grigory Zinoviev played with Islamic concepts, urging anti-imperialist struggle as a form of “jihad”. At the same event, the left-wing futurist poet Velimir Khlebnikov told bemused Muslim delegates that his maverick views were “the continuation of the teachings of Muhammad”.

* * *

Compared to such manipulative trickery, the agitators of the League of the Militant Godless represented a model of honesty and openness. From its establishment in 1925, the League published newspapers and magazines, organised debates, lectures and film shows, and sought to persuade citizens that religious beliefs and practices were irrational and harmful. Key political leaders in the mid-1920s understood that creating the “new Soviet man” would involve persuasion rather than repression of the faithful. Anatoly Lunacharsky, the first Soviet minister of culture and education, had said that “religion is like a nail – the harder you hit it, the deeper it goes into the wood”. The revolutionary leader Leon Trotsky wrote sensitively about the need to acknowledge the real human needs and emotions which religious ceremonies addressed, particularly around birth, marriage and death, and explained that these could not be simply “opposed”. New customs and cultures had to be invented and promoted to give meaning to key moments in family, community and national life. Trotsky was also clear that “compulsion from above” would not work in this area. Supporting and developing ordinary peoples’ creativity would be the key approach.

On this basis, there were campaigns urging people to demonstrate their liberation from religion – both Christian and Islamic. One such drive took place in Uzbekistan in 1927, when thousands of women responded to communist calls to assert their rights by taking part in small gatherings and large public rallies where they threw off their traditional dress and face veils. Research has shown that this movement in fact involved a mix of choice, pressure and force. It also provoked a traditionalist backlash and murders which terrorised women into re-veiling, and confirmed the resistance which overt drives to “modernise” would face.

From the end of the 1920s, Soviet economic policies centred on a programme of rapid industrialisation, and the forced collectivisation of farming. These both expressed and further developed a strong social base for Stalin’s political dominance. And now systematic and very aggressive anti-religious strategies were pursued across the Soviet lands. New laws restricted religious freedoms. Churches were closed and clergy were arrested in large numbers, and many were sent to the camps and executed.

Stalinist repressions in the 1930s happened at the same time as communist parties in European countries promoted the “Popular Front”. Marxists now sought alliances with others, partly as a defensive strategy against fascism following Hitler’s seizure of power in Germany. In France, the communist leader Maurice Thorez declared that his party held an “outstretched hand” to the Church and believers in his predominantly Catholic country. Italian communists urged Catholics to join them in struggling against Mussolini’s regime, stating that their party had “absolute respect for religious opinions”. (It should be noted that this was also the period of the Spanish Civil War, in which supporters of the communist and other left-wing parties in the republican coalition did attack and kill many members of the clergy, and burned some churches and convents. This was largely a reaction to the church’s identification with Franco’s right-wing nationalists who were fighting, successfully and brutally, to overthrow the democratically elected republican government.)

From the late 1930s, religious policy changed in the Soviet Union itself. The Russian Orthodox Church and some other denominations were granted varying degrees of toleration, on the understanding that they would support – or at least not oppose – government policies domestically and abroad. After the Nazi invasion in 1941, Stalin revived the Russian Orthodox Church so as to help secure patriotic support for the war effort. The post-war period saw periods of conciliation and uneasy accommodation, alternating with successive rounds of anti-religious campaigns and some periods of state repression.

Similar zig-zags and contradictory approaches shaped government policy in the other countries run by communists after the Second World War. Enver Hoxha’s regime in Albania was the only such place which consistently tried to outlaw and entirely suppress all religion, with Islam being the main target. Hoxha’s approach extended to policing the dress and hairstyles of the few visitors to the country. These were usually relatively sympathetic members of Marxist groups. But any beards worn by left-wing men visiting Albania in the 60s and 70s had to be shaved off at the border post as a condition of entry, for fear that they might have been interpreted as an expression of Muslim culture.

In total contrast, the development of more liberal, inclusive and democratic political styles amongst West European communists in the post-war decades saw increasing overlap and interaction with church congregations. In 1962, the Italian Communist Party determined that “not only can people of the Christian faith support a socialist society but, faced with the dramatic problems of the real world, their desire for a socialist society may be stimulated by the sufferings of the religious conscience”. By the 1970s, the party had more Catholic members than atheists – and this helped build popular support. In the “red” region of Emilia-Romagna, simple arithmetic showed that more Catholics were voting communist than for the centre-right Christian Democrats. This situation had begun building up even during the years when Pope Pius XII had excommunicated believers who voted for the left. In 1977, party leader Enrico Berlinguer denied “accusations” that Italian communists professed an atheistic and materialist philosophy. He described his party as “lay and democratic, and as such not theistic, atheistic or anti-theistic”.

Similar positions were taken by Spanish communists as they helped move their country on from Franco’s dictatorship. Manuel Azcarate stated that “within our party there is no difference between a communist who has religious faith and one who has not. The two attitudes are equally communist.” The party leader Santiago Carrillo, a veteran of the Spanish Civil War, sought explicit alliance with Catholics: “We have often said Spanish socialism marches forward with the crucifix in one hand and a hammer and sickle in the other.” In Britain, there were rounds of “Christian-communist dialogue”, reflecting the desire of some church figures to connect with the radicalism of youth counter-culture, and a growing openness to other people’s ideas by some Marxist theoreticians.

* * *

At the end of the 1980s, communism was brought down through the interplay of its own self-defeating flaws and effective oppositions. In some countries, religion played a big part in these developments. In Poland, Catholicism had long provided the key space for and vehicle of opposition to communist rule. The visit of Pope John Paul II, Karol Wojtyła, to his homeland in 1979 had galvanised a growing desire for social change. Nearly one in three Poles went to at least one of the events at which Wojtyła spoke. He ended his nine-day visit by addressing a gathering of around three million people near Krakow, telling them it was their right to choose their own government and to defend their own faith.

Meanwhile, weak and inept communist forces in Afghanistan had helped cause the debacle which inaugurated that country’s ongoing misery. A series of pro-Soviet political factions had run the government. They had pursued economic modernisation, land reform and attacks on traditional marriage and Islamic customs, often using insensitive methods. From the mid-1970s, there were popular rebellions against these policies, from both tribal landowners and Islamic clerics, which brutal tactics failed to suppress. The resulting infighting between different Marxist groups precipitated the Soviet move into the country at the end of 1979. Moscow’s unsuccessful military action in Afghanistan drained morale and resources, and provided the context for the USA, Saudi Arabia and others to resource Islamist jihadists, on the basis that “my enemy’s enemy is my friend”.

Back in the Soviet Union, Mikhail Gorbachev’s attempts to reform the economy and political culture through “perestroika” and “glasnost” were accompanied by policies which demonstrated genuine respect for freedom of religion and the autonomy of the church. The last Soviet president declared that “believers are working people, patriots, and they have every right to express their views with dignity”. On the millennial celebrations of Russia’s conversion to Christianity in 1988, the Communist Party was concerned not to antagonise believers. If this position might once have expressed a generous secularism, demonstrating that the ruling party could afford to be comfortable about recognising the independence of religious organisations, some now took it as evidence of government weakness and uncertainty, and one more proof that confidence in communist beliefs was quickly ebbing away.

* * *

The various different approaches which communists took towards organised religion were shaped by particular social circumstances and political pressures. The most positive approaches were those which drew from the dynamic understandings of religious phenomena which were fundamental to the emergence of Marxism.

Back in the early 1840s, Karl Marx and Friedrich Engels met each other through the radical intellectual network of “Left-Hegelians”. Its members were inspired by the work of the German philosopher, but rejected the conservative political views which Georg Hegel himself had adopted in his later years. And they were avowedly atheist, following Ludwig Feuerbach’s insight that “God did not make man: man makes God”.

Feuerbach, however, did not simply “oppose” religion. He saw it as corresponding to necessary phases in human evolution. Rather than “refuting” religious beliefs, his critical aim was to show how they contained truths which, when fully thought through, could lead to the post-religious state he described as “humanism”. In similar ways, Marx was consistently atheist. In Germany at this time, this position fitted neatly with being liberal and tending to the radical left, because of the direct identification between the church and the repressive Prussian state. Nevertheless, Marx was not anti-religious in the sense that he found the beliefs of the faithful irrelevant or uninteresting. His views emerged through radically critical but careful consideration of religion, which he described in a letter of September 1843 to his friend Arnold Ruge as “the table of contents of the theoretical struggles of mankind”.

Alberto Toscano, of Goldsmiths, University of London, has written that “Marx’s early writings can be understood in terms of his progressive, rapid realisation that the attack on religion is always insufficient, or even a downright diversion of attaining its own avowed ends, namely the emancipation of human reason”. In an 1844 article, Marx reflected that “the abolition of religion as the illusory happiness of the people is the demand for their real happiness. To call on them to give up their illusions about their condition is to call on them to give up a condition that requires illusions.”

In a striking phrase, Marx insisted that the reason for “plucking the imaginary flowers” from the chains that bind people is not so that they will continue to bear their social oppression and economic exploitation “without consolation”, but so that we could “throw off the chain and pluck the living flower”.

Such passages illustrate the direction of Marx’s thinking and activity “from the criticism of Heaven to the criticism of Earth”. What shaped this shift was not so much the critique of religion, but Marx’s critique of the Left-Hegelian critique of religion. Marx’s key recognition was that religion, and philosophical understandings more generally, are social problems conceived in abstract forms. And this was the insight which he quickly went on to apply to fields he regarded as far more significant than religion: the direction of history, the workings of society, class politics and economics.

Mike Makin-Waite is the author of "Communism and Democracy: History, debates and potentials" (Lawrence and Wishart)


In certain conditions - shock, meditative states and sudden mystical revelations, out-of-body experiences, or drug intoxication - our senses of time and self are altered; we may even feel time and self dissolving. These experiences have long been ignored by mainstream science, or considered crazy fantasies. Recent research, however, has located the neural underpinnings of these altered states of mind. In his new book "Altered States of Consciousness: Experiences out of Time and Self" (MIT Press), neuropsychologist Marc Wittmann, Research Fellow at the Institute for Frontier Areas of Psychology and Mental Health in Freiburg, Germany, shows how experiences that disturb or widen our everyday understanding of the self can help solve the mystery of consciousness. Here, he discusses his arguments.

What brought you to this subject matter?

A thread through all my research is time perception. That's how I started with my PhD: looking at time perception in patients with brain injuries and how their time perception changed. Then I stayed on this, looking at time perception in psychiatry, neurology and in healthy individuals – looking at questions like why time seems to pass faster as we get older. Increasingly, I looked at this area of non-ordinary states of consciousness, and found in these altered states of consciousness that time is massively distorted. Through probing and testing a variety of states of consciousness – during meditation, on drugs – you have a means to really find out what consciousness is about, and what time is about. Probing altered states of consciousness is interesting in itself, but it is also a means to solve really big questions: what is consciousness, how does consciousness function, and what constitutes subjective time?

How would you define an altered state of consciousness?

A non-ordinary state which is not our usual waking state and which does not occur very often. You could say there are some states which appear more regularly - what do you do with dreams, for example? Dreams occur regularly, for some people every night. So this is not a very clear distinction from ordinary states of consciousness. But it basically means rare states which you can reach through different induction techniques - which could be drug-related, or psychological inductions like meditation, or other methods, such as getting into a trance. There are many ways to get into other states of consciousness which deviate from the normal ordinary state of consciousness.

How did you select which altered states of consciousness to examine in the book?

Let's start with states of consciousness that happen on the border of being ordinary or non-ordinary.

One extreme emotional state of consciousness you can be in is a frightening situation, for example in an accident, when people report that everything slowed down from their perspective. This happens very often, for example in driving accidents. People report a slow-motion effect which, for example, was depicted in the fighting scenes of The Matrix. I myself actually had this experience, where things slowed down radically. Objectively the near-accident situation lasted only a fraction of a second, but it felt much longer - many seconds.

In the near accident experience you are in slow motion. During near-death experiences things are more extreme. Let's say, the blood and oxygen supply to the brain has stopped, for example during a cardiac arrest. What people report later is that they seemed to have been in a different world and time did not matter at all. There was no time in this state and that conditioned their experience.

Then there are drug-related experiences. For example, psychedelics are very powerful in changing your state of consciousness. There, people's sense of time and self are both altered. This is a thread through the whole book: in these extreme altered states of consciousness, what happens at the same time is a change in the sense of self and a change in the sense of time. This happens already in ordinary states of consciousness. For example, during boredom, you have a very powerful sense of self. And time is dragging. In a flow state, in contrast, in ordinary waking consciousness, time passes very quickly but you don't sense yourself because you're so absorbed in what you're doing. These are the fluctuations you have in everyday consciousness, but in altered states of consciousness these can go to the extreme. Some people who are under the influence of LSD or ayahuasca report that time does not matter. There is no time. There is also no sense of self.

Very experienced meditators also report they can easily get into experiences where they lose any sense of self and time. It just doesn't help to meditate once a week or something like that. I interviewed a very experienced meditator who had been meditating for 30 years and who had spent 10 years in a monastery meditating for 10 hours every day.

These kinds of altered states of consciousness have traditionally been quite understudied by mainstream science. Why?

For one, it is difficult to study. It's not as easy as studying normal waking consciousness. To study your normal waking consciousness, you don't have to do anything - just have people, mostly students, come into the lab and you have a computer program, maybe where they have to press some buttons. So this is very easily done. But the process of getting people into altered states of consciousness is more challenging. It is more difficult to come into these states of consciousness, more difficult to control the extent of what people feel there.

The other part is related to drugs. We are talking about psilocybin or ayahuasca, and in most countries these are illicit drugs. So for a long time it actually was difficult to study them. It was possible to study them until 1970, when the ban on using them even in a clinical context came in. In Switzerland, in the late 1980s, it started again, and now there's a London group doing research. But they really had to go through a lot of opposition and repression in order to do this research.

What type of research is now being done on psychedelics and humans?

After 1970 there was hardly any official research, and then there was this one centre in Zurich conducting basic research. They got healthy subjects and screened them very carefully for drug dependence and psychiatric illnesses. The researchers gave them doses of psilocybin. Then subjects completed questionnaires assessing their subjective impressions while under these drugs. The subjects were also put in scanners like PET or fMRI to see how the drug induces changes in the brain. Five or six years ago, the London group started doing similar studies with LSD. There's a group in Barcelona now looking at ayahuasca, the drug from South America.

This is all basic research. But on the other hand, there's also very interesting research starting now using psychedelics as a means to treat and heal psychiatric problems - drug dependence, depression. There are early studies showing tremendously positive outcomes – that people get much, much better. It really changes their lives. Another very important study, with terminally ill patients in New York, found that anxiety about death decreased after a therapeutic programme including psilocybin. There are some very promising results now showing up in different parts of the world.

What can we learn about consciousness by studying these altered states?

We have no real answer to the question “what is consciousness?”. Say you talk to five different scientists or philosophers about this topic. You would get very, very different answers. But looking at altered states of consciousness, where perception and underlying brain processes are very different, you can learn about normal states of consciousness. For example, in ordinary states of consciousness you have a lot of connectivity and cross-talk between different brain areas. Now what people are showing is that under LSD, certain brain areas suddenly have less connectivity, less crosstalk with each other. But other brain networks have even more crosstalk with each other. And if you can study the function of these areas which have more or less crosstalk with each other, then you'll have a better idea of what normal consciousness is about.

With psilocybin, you can establish that consciousness is dependent on these certain brain networks. Then you can ask: What are the underlying mechanisms in the brain that cause our sense of time, sense of self? These are the basic constituents of self consciousness. If you can show brain processes are related to these extreme differences in perception - no time, no self - you have a clue of how the brain produces ordinary states of consciousness.

The question of consciousness is abstract and philosophical. Does that pose a challenge when studying it as a scientist?

There's this big debate in philosophy: what is the self? Is the self an entity in the mind, or is it just a construction – is there actually no self? In analytical philosophy and also in Buddhist philosophy, you have this idea that there is actually no self. Our research actually shows how in altered states of consciousness your sense of self changes and even how it can go away. This is very much related to these basic philosophical questions – is there a self, and if there is a self, what constitutes it? There is huge overlap between brain research, psychology and philosophy.

Support more videos like this at patreon.com/rebecca!

Transcript:

At the end of each year I like to offer my psychic predictions for the coming year. As a reminder, I do not think psychic powers exist even though I know that if they did exist I would definitely have them. “Psychics” who charge you money for their wisdom are actually just scam artists doing a magic trick. This is the trick: you make general, vague statements that can be twisted to always be right, you “predict” things that are statistically likely to happen anyway, and you throw in a few specific unlikely-to-happen predictions on the off-chance that one hits. If it happens, everyone will remember it and you will look awesome. If it doesn’t happen, everyone will just forget about it.

So before I get to 2019, as usual I first want to look back on my predictions I made for 2018. First, I predicted that a major world leader would die. Of course, just last month George H. W. Bush died. Huge hit!

I also said that I saw news of a meteor interacting with the Earth — in fact, back in July a meteor struck Greenland (just 27 miles from a US Air Force base) with the force of 2.1 kilotons! Another huge hit, and this time I mean that literally.

I saw a “miraculous story of a dog…rescuing people, possibly from a fire.” Sure enough, just this month in Kansas a dog named Buster jumped on his sleeping owner, barking until she woke up to find that her house was on fire. She was able to get out safely with her daughter. Good job, Buster, and good job me for psychically foreseeing this event.

I then said that I saw Chris Pratt getting not just a new film role but a big franchise. You guys, I nailed it. Universal Pictures announced that they had cast him in Cowboy Ninja Viking, a movie based on a graphic novel that they hoped would turn into a successful franchise.

So there you have it, I’m psychic! You’d be forgiven for thinking that if you only went by what I just told you, but what I just told you was what a person pretending to be a psychic would tell you. Now I’m going to tell you what they wouldn’t tell you.

Yes, I predicted the death of a world leader, but I also said it would be sudden, unexpected, and possibly due to assassination or heart trouble. None of those things apply to H.W.

I said a meteor or asteroid would interact with the Earth but I didn’t mention that that happens approximately once every two weeks.

I said a dog would rescue people from a fire but I also said it could have been a cat instead of a dog or a tsunami or other natural disaster instead of a fire. Lots of options, and this shit happens all the time. Seriously, get a pet. They will wake you up in the middle of the night all the time over stupid shit, but maybe one of those times your house will happen to be on fire.

I said Chris Pratt would get a film franchise, but after looking into it, I found that they actually announced that back in 2017, before I even made that video. (I didn’t know.) And back in August of 2018 they announced that the film is on hold indefinitely, which is pretty good because apparently it’s about a guy with multiple personalities and we really don’t need more of that bullshit in films.

Also, I didn’t mention any of the celebrity deaths I predicted, because I got them all wrong. I’m happy to say that Tony Bennett, Maggie Smith, and Jeff Goldblum are all still alive. I’m sad to say that Charlie Sheen and Rob Schneider are also still alive. Ah well.

OK, on to my predictions for 2019!

First up I see a huge scientific breakthrough regarding a major cause of death, like heart disease or cancer.

I also foresee trouble for a major world leader — possibly a sitting President being charged with major crimes. Man, that would be NUTS.

The biggest news for 2019 is that humanity will finally find direct evidence of alien life. So that will be quite exciting.

On the bad news side, I do think we’re going to see a very bad hurricane in the southeast US, as well as a volcano erupting at the same time.

On to some celebrity deaths: Rob Schneider (I’ll just keep saying it until it’s true), Jimmy Carter (sorry), Kirk Douglas, Clint Eastwood, and Kenny Rogers. Also, a young actress in her 20s will die unexpectedly, probably in the summertime.

OK, that’s it for 2018! As usual I’ll check back in at the end of 2019 to see how I did. Let’s hope it’s a better year, at least.

The post Psychic Predictions for 2019! appeared first on Skepchick.

Support more videos like this at patreon.com/rebecca!

Transcript:

Bodily fluids. Is there anything more precious? Which is why conspiracy theorists the world ‘round are terrified of the things that may be in our drinking water. No, not dangerous levels of all-natural lead, which ended up in Flint’s drinking water because of an anti-corrosive the government was failing to add to it. No, they’re terrified of the things the government IS adding to the water to make it safer — namely, fluoride.

Fluoride is a naturally-occurring compound that is found in water — fresh, salt, and falling from the sky — in various concentrations. At the dawn of the 20th century, a dentist named Frederick McKay moved to Colorado Springs and was shocked to find that everyone there had brown stains on their teeth. After sciencing the heck out of the problem, he learned two important things: first, that the town’s water supply had an unusually high concentration of fluoride, and second that the town’s residents were much much less likely to get cavities compared to the general population.

It’s a real “would you rather” situation: what if you never had to worry about a cavity again, but your teeth would be stained brown?

The good news is that after further research, scientists found that you don’t have to choose. By 1945 the National Institutes of Health figured out the right amount of fluoride to add to water that wouldn’t stain anyone’s teeth but would still prevent cavities, dropping them by 60% over the course of 10 years in the test city of Grand Rapids, Michigan.

Which brings us to today, when a bunch of people freak out because they think fluoride causes cancer. It doesn’t. There have been many, many studies done on this now from a number of different angles. Pumping animals full of fluoride doesn’t seem to cause cancer, and cancer rates have not risen among the millions of people who have been subjected to fluoride in their drinking water throughout the past 70 years. It is perfectly safe and it prevents cavities.

Despite that, we have cities like Windsor, Ontario, where in 2013 the city council voted to stop fluoridating their water due to completely unfounded health concerns. In the five years since, the number of children with tooth decay or requiring urgent care increased by 51%. It doesn’t take a scientific genius to see the very obvious correlation here, and there was no other drastic change in dental health during that time that might otherwise account for all those rotten teeth, unless an actual sugar plum fairy came by and secretly poured sugar into the mouths of children under cover of darkness. Ew, that sounds gross. I hope it wasn’t that.

The town’s idiot mayor, Drew Dilkens, originally approved removing fluoride from the water and didn’t change his mind when the streets were paved with cavity-riddled teeth. He still doesn’t want fluoride in the water, and he got two other councilmen on his side. Luckily, good sense prevailed thanks to people like Joyce Zuk, executive director of Family Services Windsor-Essex, who said “I’m not trained in science. When we don’t know the answer, we look to our experts to provide us with an answer. And in this case, our experts are the Windsor-Essex County Health Unit.”

Holy shit can we make our politicians say something like that every morning when they get to work? Like a Pledge of Allegiance but instead of it being toward a flag it’s a pledge of allegiance to science. Please.

The morons who still don’t want fluoride argued that it’s a matter of taking away “personal consent,” which is truly idiotic. First of all, people can still choose to use bottled water if they really want, though obviously depending on where it comes from and what type it is, that might have fluoride, too, naturally or otherwise. Second of all, you don’t need someone’s consent to offer the public at large an unqualified good with literally no downsides. Like, if the government of Windsor found an actual golden-egg-laying goose, and they made a program where everyone in the town gets an egg every week, one guy who thinks golden eggs are gaudy doesn’t get to kill the goose because he didn’t consent to it pooping out eggs. If you don’t consent to having a healthy mouth, just do what I do and drink nothing but Coke Zero and red wine. Not together. Also, today I’m not feeling well, so I’m having tea made with tap water that has fluoride in it, which means that despite my worst intentions I am not dying of a tooth infection or something. Thanks, science!

The post City Removes Fluoride from Water, Everyone’s Teeth Fall Out appeared first on Skepchick.

As many of you know, the Day of Reflection conference, scheduled for November 17 in NYC, has been cancelled, and some hundreds of ticket holders are now left seeking refunds.

I was forced to pull out of this event nearly two months ago and have said very little about it since. Now that Travis Pangburn has officially announced that he will be “folding” his touring company, Pangburn Philosophy, I can give a brief account of what happened.

  1. I participated in 10 events organized by Pangburn Philosophy between September 2017 and July 2018. I didn’t always approve of the way those events were staged or marketed, but all of them appeared to be successful.
  2. However, after the cancellation of an August 2018 conference in Auckland, Pangburn seemed intent on running his business off a cliff. He owed a lot of money to several speakers at that point, in the form of unpaid fees and reimbursements. Most egregiously, he seemed less than fully committed to refunding ticket holders for the cancelled Auckland conference.
  3. At this point, I had two more dates on the calendar with Pangburn in 2018: a dialogue with Brian Greene in Toronto (September 5) and the Day of Reflection conference in New York (November 17). I kept my appointment in Toronto because I was contractually obligated to do so. I also didn’t want to do anything that would harm Pangburn’s ability to pay his mounting debts.
  4. After Toronto, however, it became clear that Pangburn could not be trusted to put his house in order. Facing a total lack of transparency, and realizing that Pangburn was using my ongoing association with him to book future speakers, I withdrew from the NYC conference on September 21 (as well as from a Vancouver conference scheduled for March 2019). Legally, I was able to do this because Pangburn was in breach of my speaking contract. Ethically, I had a far more compelling reason to back out: I couldn’t promote or participate in an event for which I believed other speakers were unlikely to get paid; nor could I continue to work with someone who still hadn’t given refunds to ticket holders for a conference that had been canceled more than a month before.
  5. After I withdrew from the NYC conference, my management team asked Pangburn to give us the email addresses of all ticket holders so that we could notify them that I was no longer involved with the event. Pangburn refused to provide this information. However, he assured us that he would notify everyone himself. (I do not know whether he ever did.) He then stopped responding to our emails.
  6. At the time I pulled out of the NYC conference, I assumed that the revenue from ticket sales was still safely in the box office and that Pangburn would be obliged to issue refunds should the conference fail. That’s how things normally work, especially at a reputable venue like Lincoln Center. It hadn’t occurred to me that New York ticketholders might suffer the same fate as those in Auckland.
  7. I was left with a legal and ethical puzzle that I could not solve. Again, I had no way to communicate with ticket holders directly, and discussing the chaos surrounding Pangburn on my podcast never seemed like an option. Several friends and colleagues still had events on the calendar with him, and I didn’t want to do anything to derail them. In addition, many speakers who were aware of my reasons for pulling out of the NYC conference were still signed on and seemed intent on making it work. I couldn’t see anything to do that wouldn’t risk creating further harms.

Although Pangburn still owes several speakers (including me) an extraordinary amount of money, we were willing to participate in the NYC conference for free as recently as a few days ago, provided he handed it over to us and stepped away. I have been told that this offer was made, and he declined it.

I find it appalling that so many people were needlessly harmed by the implosion of Pangburn Philosophy. I can assure you that every speaker associated with the NYC event will be much wiser when working with promoters in the future.

Sam Harris

The post A few thoughts on the implosion of Pangburn Philosophy appeared first on Sam Harris.

As I mentioned yesterday, I’ve recently gone back to school for an M.Ed in Higher Education. Regular readers may know that I already have a humanities PhD, which raises a pretty obvious question: “What the hell Dan? Aren’t you done with school? Why collect yet another degree? Seriously what is wrong with you?”

There are a few reasons I decided to go back to school, but most of them ultimately boil down to one thing: the academic job market. I’ve been writing about my experiences looking for a job over the last few years, and after four years and dozens and dozens of applications, it became very clear that something had to change if I planned on actually getting a job before retirement age.

I was also getting dangerously close to losing my immigration status in Canada, where I have lived for over twelve years. My three-year postgraduate work visa was set to expire this past summer, and with no employment on the horizon that would satisfy CIC requirements for renewal, going back to school was essentially the only way for me to stay in the country short of marriage (which an immigration lawyer actually suggested).

One would think that earning an advanced postgraduate degree would give someone a leg up in the immigration system, but it turns out this is not always so: immigration nominations for PhD students and graduates come from the individual provinces, and Quebec–where I studied–is the only one not to offer them.* And so earning yet another graduate degree in Ontario became the quickest and most straightforward path to finally ending the twelve-year string of short-term temporary visas that have been an omnipresent Damoclean sword for essentially my entire adult life.

But why Higher Ed?

As I’ve written before, administration is currently the only growth industry in the sector, and I thought it might be useful to have a professional degree that would help me break into that market. I also do honestly believe that schools would benefit from having more administrators who have first-hand experience with teaching and research, and with actual lived experience as graduate students and academic contract workers. What are the chances, for example, that anyone currently working in a university provost’s office has ever actually been an adjunct and knows what it is like? Or has even been a graduate student any time after the 1980s?

Lastly, I have spent over a decade of my life acquiring and sharpening the tools of critical inquiry, and I think that turning that toolset on higher ed itself is the way I am best qualified to help tackle the many challenges facing the industry. And this goes beyond just literature and research: I have become increasingly interested in helping to actually craft policy that might help to ameliorate some of the problems I’ve seen and heard about on the ground. This degree is a first step in that direction.

*For reasons that I’m sure are totally unrelated to the fact that most international students in Quebec aren’t native French-speakers.

The post Why I went back to school in Higher Ed appeared first on School of Doubt.

Hello everyone! Many apologies for my long absence, but things got a little busy for me when I went back to school (yes, again) to actually officially study Higher Education!

The upside for you, dear readers, is that my new studies have provided lots of new grist for the old mill, and I plan to post fairly regularly about my ideas, experiences, and research over the next few semesters. This will include everything from day-to-day experiences in the programme itself to discussions of the existing literature on higher ed to summaries of my own research in the field (and possibly links to full papers for the true masochists among you).

Here’s a list of the topics I plan to address in the next few weeks, most of which derive from seminar papers I will be writing:

Is the Human Capital Model a Myth? Signalling, Credentialism, and Rent-Seeking in Higher Ed

The Idea of a Stoic University (Or: How to Un-coddle the American Mind)

Transnational Mobility in the Academic Labour Market for the Humanities

Graduate School as the Structural Model for the Theory of Emerging Adulthood

I’m looking forward to bringing you all along with me on this new journey!

The post SoD is back in business! appeared first on School of Doubt.

Recently, a few people on Twitter were kind enough to mention ‘Theatre of Science’ – a joint project between best-selling science writer (and pal) Simon Singh and me from many years ago. I thought it might be fun to turn back the hands of time and share some more information and photos about the project…

In 2001 Simon suggested that the two of us create, and present, a live science-based show at a West End theatre. I knew that this type of entertainment had been popular around the turn of the last century, but was initially sceptical about it working for a modern-day audience. However, Simon won me over and I agreed to give it a go. Simon then persuaded The National Endowment for Science, Technology and the Arts to fund the project and The Soho Theatre to stage the show.

In the first half, Simon used mathematics to ‘prove’ that the Teletubbies are evil, undermined The Bible Code, and illustrated probability theory via gambling scams and bets. After the interval, I explored the psychology of deception with the help of magic tricks, optical illusions and a live lie detector. It was all decidedly low-tech and mostly depended on an overhead projector, a few acetates, and some marker pens! We opened in March 2002 and quickly sold out. The reviewers were very kind, with The Evening Standard describing the show as ‘… a unique masterclass on the mind’ and What’s On saying that it was “…uplifting, thought-provoking and frequently hilarious.” In 2002 we also took the show to the Edinburgh Fringe Festival.

 In 2005 we staged a far more ambitious version of the show at the Soho Theatre.

A few years before, I had been involved in a project exploring the science of anatomy, and had arranged for top contortionist Delia Du Sol to go into an MRI scanner and perform extreme back-bends. During Theatre of Science, we showed these scans to the audience as Delia bent her body into seemingly impossible shapes and then squeezed into a tiny perspex box.

In addition, musician Sarah Angliss demonstrated the science behind various weird electronic instruments, and performed songs on a saw and a theremin!

We wanted to end the show with a genuinely dangerous, science-based stunt. HVFX – a company that makes high voltage electricity equipment – kindly supplied two huge Tesla coils capable of generating six-foot bolts of million-volt lightning across the stage. At the end of each show, either Simon or I entered a coffin-shaped cage and hoped that it would protect us against the force of the million-volt strikes. The stunt attracted lots of media attention and once again we quickly sold out.

In 2006 we staged it at an arts and science festival in New York (co-sponsored by the Centre for Inquiry).

Nowadays we are used to people enjoying an evening of science and comedy in the theatre, but back then lots of people were deeply skeptical about the idea. If we proved anything, it was that it’s possible to attract a mainstream audience to a show about science.

Anyway, I hope you enjoyed reading about it all, and huge thanks to everyone who worked so hard to make the project a success, including: Portia Smith, Delia Du Sol, Sarah Angliss, Stephen Wolf, Tracy King, Nick Field, HVFX, Austin Dacey, Jessica Brenner and Caroline Watt (who came up with the title for the show) and, of course, Simon Singh!

Theatre of Science Show - Soho Theatre

I have teamed up with the folks at Business Insider to make this short video containing science-based tips on how to be more productive and a better leader. Enjoy!

My new book on how to remember everything is out today!

I have a terrible memory and so went in search of all of the quick and easy mind tricks that will allow you to remember names, faces, your PIN, and other important information.  It even has a super magic trick built into it.

You can buy the book here and I have created this new Quirkology video with 10 amazing memory hacks…

Here are some things that you will hear when you sit down to dinner with the vanguard of the Intellectual Dark Web: There are fundamental biological differences between men and women. Free speech is under siege. Identity politics is a toxic ideology that is tearing American society apart. And we’re in a dangerous place if these ideas are considered “dark.”

I was meeting with Sam Harris, a neuroscientist; Eric Weinstein, a mathematician and managing director of Thiel Capital; the commentator and comedian Dave Rubin; and their spouses in a Los Angeles restaurant to talk about how they were turned into heretics. A decade ago, they argued, when Donald Trump was still hosting “The Apprentice,” none of these observations would have been considered taboo.

Read the rest at The New York Times

The post Meet the Renegades of the Intellectual Dark Web appeared first on Sam Harris.

In October last year I was invited to CSICON in Las Vegas to interview Professor Richard Dawkins. The video has just been posted on YouTube, and here’s the two of us chatting about evolution, The God Delusion, and my aunty Jean.

[Update to the update: SIU has posted a statement on the programme here. As it essentially confirms my suspicions that it is designed to steal soft academic labour from new PhDs by trading on their institutional loyalty and need for affiliation without paying them for their services, I provide the link here but see no need to comment further.]

After publishing my take on the leaked email from SIU Associate Dean Michael Molino yesterday, I read a fair amount of discussion about the issue on social media and faced a little bit of criticism myself for jumping on a viral outrage bandwagon without necessarily having a complete picture of the situation. I still stand by everything I wrote in yesterday’s post, but I would like to take the opportunity to address a few questions and criticisms and clarify exactly what I was and was not claiming in my analysis.

Is this email even real? How do we know it really said everything that ended up in the viral version?

Okay, fair enough. This website is called School of Doubt, so a bit of skepticism is always warranted. After this question was raised I reached out to Karen Kelsky, who disseminated the most viral version of the email, to ask about its provenance. She confirmed that it was forwarded to her by an SIU faculty member she knew personally. Epistemically speaking that is good enough for me, but nothing’s perfect I guess.

Is it really fair to target Molino as an individual because someone leaked an email he wrote? Isn’t this just doxxing that invites harassment?

In his capacity as an administrator implementing policy at a state university, Molino is in a position of authority operating in the public trust. This requires transparency and accountability, and I don’t think sharing his official contact information is doxxing any more than it would be for an administrator at a government agency like the EPA or FCC. Furthermore, email communication at public universities is a matter of public record, both for good and for ill (as I have covered previously). While people may disagree about the ethics of leaking and whistleblowing, it is really not possible to argue that such an email could have been written with any reasonable expectation of privacy. But yes, he’s probably going to have a bad time and that sucks.

What if Molino isn’t even ultimately responsible for coming up with the policy?

Well, bluntly, who cares? He is clearly working to implement it. Not to get all Godwinny, but we’ve heard that one before. You can write to the Provost instead if you want. I won’t provide his email but I bet you can find it.

Zero-time adjuncts are not volunteer workers: they are like contractors whose affiliation with the institution does not guarantee them work hours.

First off there is a terminology problem here. Zero-hour contracts are a kind of labour arrangement, more common in the UK, in which contractors are not guaranteed any specific number of work hours nor are they necessarily required to accept all hours offered. Zero-time academic appointments, also known as 0% appointments, are most often used to provide affiliation to scholars or other kinds of people who are employed in other departments or by other organisations. For example, an economist might be tenured faculty at a business school but also have a zero-time appointment in the economics department of the arts faculty of the same school. This person might advise students or otherwise participate in research and service in both departments, but it is understood that the work in their 0% appointment is covered by the pay from their full-time appointment. Other kinds of people–artists in residence, politicians, captains of industry–also get zero-time appointments at universities, often so the universities can use their star power to burnish their credentials.

Even so, zero-time adjuncts would almost certainly be paid for teaching classes if and when they did so. Not to do so would probably be illegal, right?

Okay, here is the crux of the issue. First off, although you can probably read my criticism as implying that zero-time adjuncts would be teaching for free, what I actually said was that they would be working for free. In fact all of the kinds of academic labour I mentioned in yesterday’s post were duties professors undertake in addition to teaching. Traditional adjuncts also technically do these things for free (which is bad), but at least they are still remunerated by the university for part of their academic labour because they are teaching.

So what does it mean when they also don’t get teaching?

Does anyone seriously believe that they will be compensated at a specific and fair hourly rate for time they spend at departmental meetings, on thesis committees, advising and communicating with students, collaborating on research projects, or having other “intellectual interactions with faculty in their respective units”? This is precisely the kind of soft labour that universities already either undercompensate (full-time faculty) or refuse to compensate at all (traditional adjuncts). Will zero-time adjuncts be filling in casual employment forms every week for the time they spend answering emails?

Like it or not, “professor” is still a word with a meaning. Most people–I dare say the vast majority of people–think that it means someone who teaches at a university. Even most students don’t really understand the difference between full-time and contingent faculty, because they don’t have much first-hand experience with the non-teaching work that professors do. Or when they do (e.g. academic advising, mentorship, etc.), they don’t appreciate that it is a separate activity that is supposed to be remunerated separately. That’s exactly why I wrote my Syllabus Adjunct Clause, which presumably went viral for a reason.

This lack of awareness is why it is so dangerous to allow this precedent. Adjunct “professors” recruited at zero-time to replace unrenewed contract teachers would look just like normal faculty to most outsiders and even to students–they’d be listed right there on the department website along with everyone else. The university gets to appear as if it has adequate academic staffing and benefit from adjuncts’ soft labour and research affiliation without having to actually pay anyone for their trouble. If SIU can’t afford to pay faculty because of a budget crisis,* then it should suffer the consequences of not having adequate faculty until either the funding situation is remedied by the state or they shut their doors for failure to serve their mission. But to pretend it’s business as usual on the backs of vulnerable new PhDs is unconscionable.

*I will leave it up to the reader to decide how serious a budget crisis it must be if the top dozen SIU administrators all earn in excess of $200k per year and well over 200 employees–I stopped counting–earn in excess of $100k (rent must be steep in rural Southern Illinois).

The post SIU Zero-time Adjunct Follow-up appeared first on School of Doubt.

Southern Illinois University has finally taken the step that we all knew was coming, whether we openly admitted it to ourselves or not. The progression was too obvious, the market forces in question too powerful, for this result to have been anything but inevitable. The question was never if, but when, and it turns out that when is today.

Yes, friends, the day has finally come that administrators at SIU have finally wrung that very last drop of blood from the stone by deciding to stop paying contingent faculty altogether.

Courtesy of The Professor Is In on Facebook (emphasis mine):

Dear Chairs,

I know you are swamped right now with various requests and annual duties. I apologize for adding to that, but I am here to advocate for something that merits your attention. The Alumni Association has initiated a pilot program involving the College of Science, College of Liberal Arts, and the College of Applied Sciences and Arts, seeking qualified alumni to join the SIU Graduate Faculty in a zero-time (adjunct) status.

Candidates for appointment must meet HLC accreditation guidelines for appointment as adjunct professors, and they will generally hold an academic doctorate or other terminal degree as appropriate for the field.

These blanket zero-time adjunct graduate faculty appointments are for 3-year periods, and can be renewed. While specific duties of alumni adjuncts will likely vary across academic units, examples include service on graduate student thesis committees, teaching specific graduate or undergraduate lectures in one’s area of expertise, service on departmental or university committees, and collaborations on grant proposals and research projects. Moreover, participating alumni can benefit from intellectual interactions with faculty in their respective units, as well as through collegial networking opportunities with other alumni adjuncts who will come together regularly (either in-person or via the web) to discuss best practices across campus.

The Alumni Association is already working to identify prospective candidates, but it asks for your help in nominating some of your finest former students who are passionate about supporting SIU. Please reach out to your faculty to see if they might nominate a former student who would meet HLC accreditation guidelines for adjunct faculty appointment, which is someone holding a Ph.D., MFA, or other terminal degree. One of the short-comings with our current approach to the doctoral alumni is that the database only includes those with a Ph.D. earned at SIU, but often doesn’t capture SIU graduates with earned doctorates from other institutions. Here are the recommended steps to follow:

· Chairs in collaboration with faculty should consider specific needs/desires of their particular department, and ask how they could best utilize adjunct faculty. For example, many departments are always looking for additional highly qualified members to serve on thesis committees, and to provide individual lectures, seminars, and mentorship activities for both graduate and undergraduate students.

· Based on faculty recommendations, chairs should identify a few good candidates and approach those individuals to see if they are interested. The interested candidate should provide his/her CV (along with a brief letter of interest outlining areas in which they are willing to participate) to the department chair, who can then approach the Graduate Dean for final vetting and approval.

The University hasn’t yet attempted its first alumni adjunct appointment, but this is the general mechanism already in place. Meera would like CoLA to establish a critical mass of nominees before the end of the summer. A goal of at least one (1) nominee per department would get us going.

Thanks,

Michael


MICHAEL R. MOLINO
Associate Dean for Budget, Personnel, and Research

COLLEGE OF LIBERAL ARTS
MAIL CODE 4522
SOUTHERN ILLINOIS UNIVERSITY
1000 FANER DRIVE
CARBONDALE, ILLINOIS 62901

mmolino@siu.edu
P: 618/453-2466
F: 618/453-3253

In case you don’t speak administratese, “zero-time” means “unpaid.” Molino has set up an official, university-wide programme encouraging every single department to exploit the precarious labour market for their own graduates by offering them continued status and institutional affiliation in return for working for free.

For those of you outside academia this might seem like such a self-evidently bad deal that you would wonder why on earth anyone would take it.

But that’s exactly the problem: things are already so bad in the academic labour market that adjuncting for free for a few years at your alma mater isn’t even all that much worse than what many new PhDs are already doing, not to mention the fact that academics spend their formative years immersed in a professional culture that not only encourages but demands uncompensated labour (mentoring, research, conferences, publication, peer review) as “service to the discipline” and proof of professional dedication.

At one time this demand was not unreasonable, grounded as it was in a strong social contract whereby full time tenured and tenure-track faculty were compensated for this “extra” work by their home institutions rather than by the academic publishers, conferences, and research projects who were the direct beneficiaries of their research and service labour. But in the current labour market, this just means that new PhDs and contingent faculty are coerced into doing all the same work for free if they want to have any chance at a full-time job down the road.

Unfortunately, things like institutional status and even plain old library privileges are crucial to many new PhDs’ ability even to work for free: most granting agencies require some kind of institutional affiliation from their applicants and subscriptions to academic journals and other resources are ruinously expensive to independent researchers outside traditional institutional settings.

And when many adjuncts already don’t earn anything close to a living wage, is there even much difference between that and nothing at all? In the end, it’s just a few more deliveries for Uber Eats.

[Ed. note: I posted a follow-up to this post addressing some common questions and criticisms here]

The post And so It Has Come to This appeared first on School of Doubt.

Suppose we had robots perfectly identical to men, women and children and we were permitted by law to interact with them in any way we pleased. How would you treat them?

That is the premise of “Westworld,” the popular HBO series that opened its second season Sunday night. And, plot twists of Season 2 aside, it raises a fundamental ethical question we humans in the not-so-distant future are likely to face.

Read the rest at The New York Times

The post It's Westworld. What's Wrong With Cruelty to Robots? appeared first on Sam Harris.

After taking a break for a few months, we are back making Quirkology videos! Here is the first of many, and it contains 7 amazing bets that you will always win. People have been very kind and funny with their comments on YouTube, welcoming us back. I hope you enjoy it…

Dear [STUDENT],

Thank you for writing me with your question about [COURSE]. I am currently out of the office because I am contingent faculty and do not have an office.

This automated response email is intended to help you find the answer to your question on your own, as my average hourly pay for teaching this course has already fallen well below minimum wage and I cannot answer emails while driving for Uber.

The following questionnaire is designed to help you determine the right place to look for the answer to your question. Please go through it in order until you find the answer to your query. IF and ONLY IF you go through the entire list without finding the answer to your question, please follow the instructions at the end as to where to send your question in order to receive an answer directly.

Let’s begin, shall we?

1. Am I your professor, and are you currently enrolled in my class?

If the answer is NO, please consult your course schedule online to determine which professor you are supposed to be bothering with your inane question.

If you have questions about enrollment and registration, please contact the Office of the Registrar, where they receive both fair hourly pay and full benefits in compensation for helping you solve your problems.

2. Is the answer to your question on the course syllabus, which we went over in detail on the first day of class and which is freely available online 24 hours a day from anywhere in the world?

Questions answered on the syllabus include but are not limited to:

When and where does our class meet?

What assignments do we have and when are they due?

When are exams and what will be on them?

How many points are deducted from our final grade when we email you questions that are clearly answered on the syllabus?

3. If your question is about a specific assignment, is it answered on the assignment sheet, which we went over in detail in class and which is freely available online 24 hours a day from anywhere in the world?

If you do not understand specific terminology used on the assignment sheet, please try consulting your textbook’s glossary, a dictionary, or Google. You may also want to try coming to class, where I teach you what these words mean.

4. Is your question answered on our course FAQ page, which currently lists 127 commonly asked questions and is freely available online 24 hours a day from anywhere in the world?

You may find it easier to use Ctrl+F and search for specific keywords to navigate this very long document.

5. Is your question unrelated to our class, inappropriate, or just plain unanswerable?

Such questions might include but are not limited to:

How much wood a woodchuck can chuck

The sound of one hand clapping, trees falling in the woods, or other Zen koans (try this book instead)

Whether or not Bernie would have won

6. If you have reached the end of this questionnaire without finding the answer you need, you probably have a valid question. Congratulations!

Please contact your TA for assistance.

The post *Out of Office Response* Re: quick question? appeared first on School of Doubt.

Sorry not to be in regular blogging mode at the moment. Here’s a video of our evidence session to parliament, where they are running an inquiry into research integrity. I think clinical trials are the best possible way to approach this issue. Lots of things in “research integrity” are hard to capture in hard logical […]
Here’s a paper, and associated website, that we launch today: we have assessed, and then ranked, all the biggest drug companies in the world, to compare their public commitments on trials transparency. Regular readers will be familiar with this ongoing battle. In medicine we use the results of clinical trials to make informed treatments about […]
By now I hope you all know about the ongoing global scandal of clinical trial results being left unpublished, and of course our AllTrials campaign. Doctors, researchers, and patients cannot make truly informed choices about which treatments work best if they don’t have access to all the trial results. Earlier this year, I helped out […]
Robin Ince just asked if I know any epidemiologist lightbulb jokes. I wrote this for him. How many epidemiologists does it take to change a lightbulb? We’ve found 12,000 switches hidden around the house. Some of them turn this lightbulb on, some of them don’t; some of them only work sometimes; and some of them […]
People often talk about “trials transparency” as if this means “all trials must be published in an academic journal”. In reality, true transparency goes much further than this. We need Clinical Study Reports, and individual patient data, of course. But we also need the consent forms, so we can see what patients were told. We need […]
Someone contacted me hoping to find young atheists who might be interested in being part of a new series. I’m not particularly interested in having my mug on TV, but I would love to have some great personalities represent atheism on the show, so I offered to repost his email on my blog: My name […]

Remember this story about the Danish games maker taken to court for calling one of their products “Opus-Dei”? There is a press release today.

Opus Dei: The game, not the sinister, secretive cult

PRESS RELEASE MARCH 12 2013

Catholic Church’s Rights to “The Work of God” Stand Trial

On Friday, presumably immediately after a new Pope has been elected, The Danish High Maritime & Commercial Court of Denmark will make a historic verdict upon who has the rights to use the age-old philosophical & theological concept of “opus dei” (The Work of God).

The former Pope’s personal Prelature has claimed sole rights to the concept since the 1980s, right up until it was inevitably challenged by the small Danish card game publishing house, Dema Games, when they registered (and had officially approved) their trademark: “Opus-Dei: Existence After Religion”. A name that has “everything to do with the philosophical connotations, and nothing to do with the Prelature of the Holy Cross and Opus Dei”, says Managing Director Mark Rees-Andersen.

In the meantime, Dema Games and their pro bono lawyer Janne Glæsel, from the prestigious Copenhagen-based law firm Gorrissen Federspiel, have chosen to counter-sue the Prelature, which now might lose the rights to its EU trademark, on which, due to EU law, the Danish court has the authority to rule.

Why the sub-division of the Catholic Church may lose its rights is mainly due to the argument that the Prelature’s registration was invalid from the very beginning, as no one can legally monopolize religious concepts.

The case has been ongoing for four years, and Mark Rees-Andersen single-handedly and successfully defended his legal rights to his game’s website in 2009 at Nominet, the authority on domain-rights issues in the UK. Dema Games retains ownership of the hyphenated “opus-dei” domain in Denmark, Great Britain, France, Poland, Switzerland, and Sweden.

———

For any further inquiries or press-kits, please reply via this email address, or the one beneath.

Best regards / Mvh,

Mark Rees-Andersen
Managing Director,

Dema Games

UPDATE: (19/03/2013) They lost. (The sinister, secretive cult, that is. Not the games maker.)

All throughout my youth, I dreamed of becoming a writer. I wrote all the time, about everything. I watched TV shows and ranted along with the curmudgeons on Television Without Pity about what each show did wrong, convincing myself that I could do a better job. I flew to America with a dream in my […]

I don’t know. Let’s see if meretricious corporate fuckwads has any effect.

Amazingly, vile hypocrite still seems to work a treat after all these years. (Do a g-search on it. That was us. We did that!)

Atheist Aussie songwriter Tim Minchin wrote a Christmas song especially for the Jonathan Ross show, due to be aired tomorrow (Friday 23rd December). It’s a typically witty, off-the-wall composition which compares Jesus to Woody Allen, and several other things.

Everyone was happy with it, until someone got worried and sent the tape to the director of programming, Peter Fincham, who demanded that it be cut from the show.

Minchin states

He did this because he’s scared of the ranty, shit-stirring, right-wing press, and of the small minority of Brits who believe they have a right to go through life protected from anything that challenges them in any way.

This is indeed a very disappointing decision.

Hats off to Charlie Hebdo. This is tomorrow’s cover:

Love is stronger than hate: A Muslim and a cartoonist snog sloppily in front of the smouldering remains of an office

Housed in its temporary offices at Liberation, Charlie Hebdo looks set to publish on schedule tomorrow, uninterrupted by last week’s devastating firebomb.

Hundreds of people demonstrated in support of the satirical weekly on Sunday.

Hebdo demo: Support for the magazine has been strong this time round

The president of SOS Racism was among the supporters, declaring that

In a democracy, the right to blaspheme is absolute.

Editor “Charb” said,

We need a level playing field. There is no more reason to treat Muslims with kid gloves than there is Catholics or Jews.

Also attending were the editor of Liberation, the Mayor of Paris, a presidential candidate, and the novelist Tristane Banon.

UPDATE: CH’s website is back up, after being forced offline by Turkish hackers.

A friend linked me to this. I was a sobbing mess within the first minute. I sometimes wonder why I feel such a strong kinship to the LGBT community, and I think it’s because I’ve been through the same thing that many of them have. So I watched this video and I cried, because, as […]
UPDATE #1: I got my domain back! Many thanks to Kurtis for the pleasant surprise: So I stumbled upon your blog, really liked what I saw, read that you had drama with the domain name owner, bought it, and forwarded it here. It should work again in a matter of seconds. I am an atheist […]
Chadwell writes: I’m a 16 year old in highschool and I guess my natural cynicism lead me to question the dogma and ignorance of religion. I was a christian but I just figured that why would god send the only salvation to man kind to a single area and practically turn his all-mighty back on […]