Five Ways Donald Trump Is Wrong About Islam

The White House’s approach to the world’s second largest religion isn’t just bigoted – it’s a strategic disaster.

Donald Trump took up so much bandwidth during the 2016 election cycle that we all paid insufficient attention to the people lurking within his campaign operation who have now moved into key policymaking positions. Foremost among these worrisome characters is White House political strategist Stephen Bannon, the former Goldman Sachs employee, Hollywood producer, and Breitbart chairman who appears to be behind much of Trump’s chaotic approach to foreign policy. But you could add self-promoting national security “experts” like Sebastian Gorka (who falsely claimed to have been an expert witness at the Boston Marathon bombing trial) and nutcase Islamophobe Frank Gaffney, whose extreme views were apparently taken seriously by short-lived National Security Advisor Michael Flynn. To give you a sense of just how out-there Gaffney is, his think tank, the Center for Security Policy, has been identified as an “extremist group” by the Southern Poverty Law Center and criticized by the Anti-Defamation League for promulgating “anti-Muslim conspiracy theories.”

What unites these people — and seems to drive Bannon in particular — is a belief that the United States, and, indeed, the entire Judeo-Christian West, is under siege from an insidious and powerful foe: “radical Islam.” For the most extreme of them (that is, Gaffney), there’s no real distinction between jihadi terrorists and the entire Muslim religion. In this view, a hardened Islamic State killer is no different from that nice Muslim family who lives downstairs, next door, or across the street.

As many of us have already noted, this worldview depicts a Huntingtonian “clash of civilizations” on steroids, and it helps explain why people like Bannon are so fond of right-wing xenophobes like Marine Le Pen in France and autocrats like Vladimir Putin in Russia. If the entire Muslim world threatens us all, then these otherwise unsavory leaders can be defended as useful allies in the struggle to defend “Western” civilization against the oncoming Muslim hordes.

There’s only one thing wrong with this view as a template for U.S. foreign policy: It’s completely at odds with reality. Specifically, it ignores the true balance of power, overlooks the deep divisions within Islam itself, exaggerates the danger of terrorism, and relies on assorted myths Islamophobes have been ceaselessly spouting for decades. If this view became the primary organizing principle of U.S. foreign policy, it would commit the United States and its allies to a costly, self-fulfilling, and counterproductive crusade that would play right into the hands of the handful of genuine extremists who do exist. Needless to say, a conflict of this sort could also be used to justify further extensions of executive power here in the United States and further erode our democratic freedoms.

As a public service, therefore, I offer the Top Five Reasons Steve Bannon is Dead Wrong About the “Islamic Threat.” 

1: The Balance of Power Is Overwhelmingly in Our Favor. Let’s start with some good old-fashioned power politics. Imagine for the moment that all of Islam were in fact united in an effort to overwhelm the United States and the rest of the West. Would the world’s 1.6 billion Muslims have the capacity to do so? Hardly.

There are 47 Muslim-majority countries in the world. If you add all of their economies together, they have a combined GDP of slightly more than $5 trillion. That sounds like a lot, but remember that the United States has a GDP of more than $17 trillion all by itself and so does the European Union. In terms of raw economic power, in short, the “West” has this fictitious coalition of Muslim states out-matched from the start.

The imbalance is even more striking when it comes to military capability. This same imaginary coalition of Muslim-majority countries spent roughly $270 billion on defense last year, and if you take out U.S. allies like Saudi Arabia ($87 billion) and the United Arab Emirates ($22 billion), the number drops to less than $200 billion. By contrast, the United States alone spent roughly $600 billion — more than twice as much — and that’s not counting its various allies like the United Kingdom, Japan, Israel, or others.
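The spending comparison above is simple arithmetic; a quick sketch (using the article’s approximate figures, in billions of U.S. dollars) confirms both claims:

```python
# Rough arithmetic behind the spending comparison above (all figures
# approximate, in billions of USD, as cited in the text).
muslim_majority_total = 270  # combined defense spending of Muslim-majority states
us_aligned = {"Saudi Arabia": 87, "United Arab Emirates": 22}

# Subtracting the close U.S. allies leaves well under $200 billion.
remainder = muslim_majority_total - sum(us_aligned.values())
print(remainder)  # 161

# U.S. spending alone is more than twice the entire coalition's total.
us_spending = 600
print(us_spending / muslim_majority_total > 2)  # True
```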

But these raw figures on defense spending greatly understate the West’s advantage. The entire Muslim world produces no indigenous advanced combat aircraft (though Turkey produces some U.S.-designed F-16s under license) and no indigenously designed modern battle tanks (though Pakistan makes a modified Chinese tank and Turkey is working on one of its own). The navies of the Muslim world have no major surface combatants larger than a frigate (though Iran is reportedly building a single destroyer), no aircraft carriers, no long-range bombers, and no nuclear submarines. Indeed, the power projection capabilities of all of these states are extremely limited. And to the extent that these states have much modern military power, it is because the United States, France, the U.K., China and others have been willing to sell or license advanced weaponry, for various strategic reasons of their own. Yet Saudi Arabia’s unimpressive performance in its recent intervention in Yemen suggests that the Muslim world’s capacity to project power even short distances is quite modest.

Thus, even if one started with the wholly unrealistic assumption that the Muslim world is a single unified movement, it’s much, much, much weaker than we are. Maybe that explains why foreign powers have intervened in Muslim-majority countries repeatedly over the past couple of centuries, while the reverse hasn’t occurred since the siege of Vienna in 1529. Not once. It wasn’t Egypt that invaded France in 1798; Saddam Hussein didn’t send a mighty expeditionary force around the world and up the Potomac to occupy Washington and depose George W. Bush in 2003; and Muammar al-Qaddafi didn’t order his air force to bomb Paris in order to oust Nicolas Sarkozy in 2011. Surely this one-sided history tells you something about the relative power of Western states and those from the Islamic world.

2: Islam Is, in Fact, Deeply Divided. From time immemorial, threat inflators like Bannon & Co. have portrayed adversaries as part of some grand unified coalition. Remember the “communist monolith” or the “axis of evil”? Today, fearmongers use phrases like “Islamofascism” or “radical Islam” to imply that our enemies form a tightly integrated and centrally directed movement working tirelessly to bring us to our knees.

But in reality, the Islamic world is more disunited today than at any time in recent memory. It is divided among many different states, of course, and many of those states (e.g., Iran and Saudi Arabia, or Turkey and Syria) don’t get along. There are vast geographic and cultural differences between Indonesia and countries like Yemen or Morocco or Saudi Arabia. There’s also the core division between the Sunnis and the Shiites, not to mention a number of other minor schisms between various Islamic offshoots. And let’s not forget the sometimes-bitter rivalries within the jihadi movement itself, both across the globe and within particular countries. Just look at all the radical groups who hate the Islamic State, and all the jihadis whom the Islamic State regards as heretics because they don’t embrace its full ideology.

These divisions do not mean extremists pose no danger at all, of course, but Bannon’s specter of a rising Islamic tide that threatens to overwhelm us is pure fantasy. Instead of treating all of Islam as a threat — which might eventually unite more of them against us — the smart move is to play “divide-and-conquer.” But that means recognizing that the danger we face is not a hostile “civilization” or an entire religion, but rather just a small number of extremists who are unrepresentative of the larger cultural category (and opposed by most of it). To beat them, we want the rest of the Muslim world on our side.

3: Terrorism Is Just Not That Big a Threat. Really. We live in a world where lots of bad things can happen. You might get into a car accident. You could get cancer. You could mishandle a power tool and injure yourself severely. You may fall off a ladder, slip in a bathtub, or be in the wrong place at the wrong time and end up stopping a stray bullet. Or maybe, just maybe, you might find yourself imperiled by a radical Islamic extremist.

You wouldn’t know it if you listened to Trump, to CNN, to Fox News, or to most of our politicians, but that last danger is minuscule. Not zero, but really, really small. We’ve been obsessed with terrorism ever since 9/11, but the reality is that the risk it poses is way, way, way down the list of possible harms that might befall us.

For example, based on the evidence since 9/11 (and including that attack), the likelihood that an American will be killed by a terrorist is less than 1 in 3 million per year, and the lifetime risk is about 1 in 45,000. Those are pretty damn good odds: You are much more likely to die from being struck by lightning, falling out of bed, a heat wave, or accidentally choking on food. But don’t expect Trump, Bannon, Flynn, Gorka, Gaffney, or any of the well-compensated “terrorism experts” to highlight this fact, because their livelihoods and their ability to seize more and more power depend on keeping you very, very scared. And don’t expect the media to downplay the danger either, because hyping terrorism whenever it does occur is a good way to get your eyeballs glued to the screen. (Among other things, this is why Trump’s recent statements suggesting terrorism was being “underreported” are so absurd.)
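The two figures quoted above are mutually consistent: compounding an annual risk of roughly 1 in 3 million over an average lifespan lands near the quoted lifetime odds. A quick sanity check (the ~70-year exposure window is my assumption, not the article’s):

```python
# Sanity check: does an annual risk of ~1 in 3 million imply a lifetime
# risk of roughly 1 in 45,000? (The 70-year window is an assumption.)
annual_risk = 1 / 3_000_000
lifespan_years = 70

# Probability of being killed at least once over the whole lifespan.
lifetime_risk = 1 - (1 - annual_risk) ** lifespan_years
print(round(1 / lifetime_risk))  # ~42,858, close to the quoted 1 in 45,000
```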

In some ways, in fact, terrorism remains the perfect bogeyman. It’s easy to hype the threat and to convince people to worry about random dangers over which they have little or no control. Unscrupulous politicians have long understood that you can get a lot of leeway when the people are scared and craving protection, and it’s pretty clear that Trump and Bannon see this tactic as the ideal way to retain public support (and to consolidate more presidential power). The specter of terrorism serves them well because it scares people without being an existential threat that might require a serious, sensible, and well-thought-out strategic response. For would-be authoritarians, “terrorism” is a gift that just keeps giving.

Don’t get me wrong: I’m not saying the danger is zero or that sensible precautionary measures should not be taken. But to believe that ragtag radicals like al Qaeda or the Islamic State constitute a threat on a par with Nazi Germany, the Soviet Union, or some of the serious opponents the United States has faced in the past is silly. Frankly, it makes me question the guts, steadiness, and judgment of some of our present leaders, if they are so easily spooked by such weak adversaries. Let’s hope these fraidy-cats never have to deal with a truly formidable foe.

4: “Creeping Sharia” Is a Fairy Tale. Die-hard Islamophobes have a fallback argument: The danger isn’t an actual military attack or a Muslim invasion of America or Europe. Rather, the danger is the slow infiltration of our society by “foreigners” who refuse to assimilate and who will eventually try to impose their weird and alien values on us. One sees this argument in the right-wing myth of “creeping Sharia,” based on trumped-up (pun intended) stories about “Sharia courts” and other alleged incidents where diabolical Muslim infiltrators have tried to pollute our pristine Constitution with their religiously inspired dogma. If we’re not ceaselessly vigilant, we are told, someday our daughters will be wearing hijabs and we’ll all be praying to Mecca.

Seriously, this anxiety sounds like something straight out of Dr. Strangelove, especially Brig. Gen. Jack D. Ripper’s rants about fluoridation and the need to protect our “precious bodily fluids.” To repeat:

There is simply no evidence of “creeping Sharia” here in the United States, and no risk of it occurring in the future. Not only do we still have formal separation of church and state here (at least so far!), but the number of Muslims in the United States remains tiny. According to a 2016 Pew Research Center survey, there are only 3.3 million Muslims living in the United States, a mere 1 percent of the population. That percentage might double by 2050 to a vast, enormous, dangerous, and overwhelming 2 percent. Being a tiny minority makes them ideal victims for ambitious power-seekers, but hardly a threat to our way of life.

5: The “Clash of Civilizations” Is a Self-Fulfilling Prophecy. The final reason to reject Bannon and company’s depiction of a vast and looming Muslim threat is that this worldview encourages us to act in ways that make the problem worse instead of better. As George Kennan wisely observed in 1947, “It is an undeniable privilege of every man to prove himself right in the thesis that the world is his enemy; for if he reiterates it frequently enough and makes it the background of his conduct he is bound eventually to be right.” If U.S. leaders keep demonizing an entire religion, imposing ill-considered bans on Muslim refugees, and, most important of all, continuing to intervene throughout the Arab and Islamic world with military force, they will convince more and more people that Osama bin Laden, Khalid Sheikh Mohammed, and Islamic State leader Abu Bakr al-Baghdadi were right when they claimed the West had “declared war” on their religion.

Despite the mountain of evidence that shows that anti-Americanism in the Muslim world is overwhelmingly a response to U.S. policy (and not because they “hate our freedoms”), people like Bannon, Gaffney, and their ilk want us to double down on the same policies that have inspired extremists since the 1950s and especially since the formation of al Qaeda. Frankly, given how often we’ve used our superior power to interfere in these countries, it’s somewhat surprising the reaction has been as modest and manageable as it is. Ask yourself how Americans might react if a powerful foreign country had repeatedly bombed the continental United States with aircraft and drones, or invaded, toppled our government, and then left chaos in its wake. Do you think a few patriotic Americans might be tempted to try for some payback?

My point is not to defend terrorism — far from it, in fact — but rather to remind us that it didn’t just come out of nowhere, and it isn’t solely a reaction to the political and social problems of the Muslim world itself. But if you’d like to encourage more of it, then by all means embrace the Bannon playbook.

Perhaps the most important task for any strategist is to figure out what the main threats and opportunities are, and then to devise policies that can defuse the former and exploit the latter. Making all of Islam our enemy and viewing the world through the lens of a vast “civilizational clash” fails on both criteria. If followed, it will bog us down in more interminable conflicts in places that are not vital U.S. interests, distract us from other foreign-policy issues, and sap the wealth and strength that we may need to deal with more serious challenges, including long-neglected problems here at home. I’m sure plenty of anti-Americans are hoping that we take the bait and do just that; what scares me is that there are now people in the White House who agree with them.

Photo credit: JOSHUA LOTT/AFP/Getty Images

via Five Ways Donald Trump Is Wrong About Islam — Foreign Policy 

The judgment of Mr Justice Cobb in AZ -v- Kirklees Council [2017] EWFC 11 contains much of interest to the legal profession generally. It shows the dangers of failing to comply with court directions, of failing to make or respond to appropriate offers of settlement, and of litigating without due regard to proportionality. Although the respondent local authority…

“vii) If the family proceedings had been essentially adversarial in nature (i.e. appeal against refusal of day nursery registration), costs may well follow the event: see again Wilson J in London Borough of Sutton v Davis (Costs)(No.2):

“The proceedings were adversarial and the local authority lost the argument. Such were circumstances for application of the principle that costs should follow the event.”

viii) If “real hardship” would be caused to a party in achieving an outcome in the best interests of the child, that may provide a proper basis for a costs order – per Baroness Hale in Re S(above) at [33]:

“The object of the exercise is to achieve the best outcome for the child. If the best outcome for the child is to be brought up by her own family, there may be cases where real hardship would be caused if the family had to bear their own costs of achieving that outcome. In other words, the welfare of the child would be put at risk if the family had to bear its own costs. In those circumstances, just as it may be appropriate to order a richer parent who has behaved reasonably in the litigation to pay the costs of the poorer parent with whom the child is to live, it may also be appropriate to order the local authority to pay the costs of the parent with whom the child is to live, if otherwise the child’s welfare would be put at risk. (It may be that this is one of the reasons why parents are automatically entitled to public funding in care cases.)” (emphasis by underlining added).”

 

via BABIES, BUNDLES, HUMAN RIGHTS, PROPORTIONALITY, CONDUCT AND COSTS:ALL IN ONE JUDGMENT — Civil Litigation Brief

The former British Prime Minister Tony Blair has launched a campaign to keep Britain in the European Union, issuing a call to those who voted to remain in the bloc to “rise up in defense of what we believe.” In a speech hosted by pro-E.U. group Open Britain, Blair laid out the case for overturning…

via Tony Blair Returns to the Limelight with Bid to Reverse Brexit — TIME

The case of Pemberton Greenish LLP -v- Henry [2017] EWHC 246 (QB) provides an interesting assessment of witness evidence and demonstrates the difficulty in proving dishonesty. Mr Justice Jeremy Baker held that the fact that a solicitor was negligent, breached money laundering regulations, and was patently dishonest after a transaction did not lead to a…

via PROVING THINGS 53: BECAUSE A SOLICITOR WAS DISHONEST SOME OF THE TIME IT DOESN’T MEAN THEY WERE DISHONEST ALL OF THE TIME — Civil Litigation Brief

You will ALWAYS be admonished, silenced, punished and BLAMED.  

“Lies and deceit are as natural to the Narcissist’s world as is breathing. A Narcissist has the amazing ability with their ‘shrewd deception’ to make ‘others’ believe their lies EVEN when they fly right in the face of overwhelming evidence to the contrary. Heck they are so smooth at their LIES that the Narcissist also believes THEIR own lies. Mine could have been given an honorary PHD in lies, and lying! Take the position that everything they say to you is a lie and or based on a lie, or as my friend once told me “if they are breathing they are lying!””

Parental Alienation

From my Book – From Charm to Harm and Everything else in Between with a Narcissist! @ Being a victim of Narcissistic abuse is a severely traumatic life experience that you are dealing with by…

Source: When you have a relationship with a Narcissist you will ALWAYS…


The cost of fraud

“How to combat fraud:

As a starter: A business should seek to reduce the opportunity to commit fraud. They should know their staff, and know their strengths and weaknesses. The employee with debt and alcohol dependency should be distanced from cash and cheque validation, to protect both the employee and the employer. The employer should consider providing counselling and outside help to relieve that pressure. A caring employer is never a bad thing.

Maintenance of morale should, of course, be high on the agenda. Giving employees a reason to take ‘wages in kind’, can be avoided.

A business needs to have checks and balances. There must be systems and policies in place. They must be implemented. There should be regulation within the industry. That regulation should be effective and independent, ready and willing to punish those they regulate. Wrongdoers should be named and shamed.

This is a basic explanation of fraud and its measurement, and of how to reduce the opportunity, rationalisation, and pressure to commit it. Combating fraud is a far more complicated subject. This blog post is intended to provoke thought and consideration on the subject.”

Rosen’s blog - useful reads for litigants in person

In the lights of recent events in the HBOS Reading case, H4L presents this article on “The cost of fraud” by David Rosen

The Association of Certified Fraud Examiners (‘the ACFE’), based in Texas, USA, published its biennial report early last year: the Report to the Nations on Occupational Fraud and Abuse, the 2016 Global Study.

What is occupational fraud?

Occupational fraud is fraud that occurs in and around the workplace. It is the use of one’s occupation ‘…for personal enrichment through the deliberate misuse or misapplication of the organisation’s resources or assets’, as defined by the ACFE.

What is fraud?

There is no precise definition. Fraud is generally considered to be wrongful or criminal deception intended to result in financial or personal gain.

The Fraud Act 2006 does not so much define ‘fraud’, but rather clarifies that if you intend to take or omit to take…

View original post 892 more words

What is happening in £662,600 a year Mental Hospitals – CQC Reports 2011-2016.

Lack of accountability (especially over obtaining informed consent) breeds impunity which significantly harms not benefits the most vulnerable whom society is supposed to protect with this legislation.
A damning indictment of Govt doublespeak and lemming logic.
“(7) Recommendations:

(a) Policy makers must consider the reasons why there are rising numbers of people subject to the Act and develop an appropriate policy response.

(b) The Boards of mental health trusts, independent providers of mental health care, and community trusts are responsible and accountable for the quality of care people receive.

They must drive the changes needed in their organisations

In particular they need to recognise and promote good practice and ensure that robust mechanisms are in place to understand individuals’ experience of their services.

(But how, when board members are being made directors of the private companies that run the organisation, creating a conflict of interest between their duty to make profits, and their own salaries, and their duty to make changes?)

CQC reminds providers of their own duties to monitor how they use powers derived from the Act (see the Code of Practice) and their duties under the Health and Social Care Act 2008 to demonstrate how they have learned lessons from practice and have made consequent improvements.

This is an area that CQC will focus on in the next 12 months in its regulatory activity.

(But there is no accountability for breach of these duties. Even where breaches are so severe that they warrant a public inquiry, as with Staffordshire Health Trust, the trust will simply be fined and go into administration, with the loss set off against other assets.)

(c) The NHS Commissioning Board, local authorities, clinical commissioning groups and specialist commissioners must commission services that guarantee a person’s dignity, recovery and participation.

Clinical commissioning groups and local authorities must ensure that local needs assessments for community services and commissioned models of care are informed by an understanding of their statutory duties under the Act and by the experiences of people who use services.”

finolamoss

restraint-pic

The new age of for profit Britain requires a monopoly of increasing, maximum profit commodities.

Whilst the Mental Health Taskforce Report harvested, it said nothing about how it intended to improve the state of our mental health hospitals, despite their ever increasing budget eating a quarter of our NHS funding.

Hospitals are paid on average £900 per patient per night, and locked wards £12,500 a week.

With such huge sums on offer, is it any wonder that we have the highest number of MHA detentions and retentions ever known?

And the highest number of detentions after s2 MHA assessment stay.

The only rights a patient has are to information about their treatment and to appeal to a Tribunal, but patients are not being informed of them, despite access to IMCAs.

Neither families nor patients are being involved in ‘treatment’, the CQC stating:

The biggest issue we found for patients who were…


Why Parental Alienation is Child Abuse and Why Punishing Such Abuse Can Never Rebound on a Child.

“Parental Alienation in its true form is a cruelty to the child which robs them of their right to an innocent and unconscious childhood. It forces upon a child the adult issues which they should not be privy to and it damages their psychological and even their biological development. It is a lasting harm which can be found to be passed down the generational line and it is a legacy which no child should inherit. It harms the child’s future by interfering with perspective, it causes fear and anger to be unmanageable and it causes unremitting anxiety which the child cannot manage because of the repressed feelings of guilt and shame.”

Karen Woodall

A child’s parent breaks the child’s legs and pretends that the child fell over.  The parent bruises the child and tells the child it is her own fault.  A child is sexually abused.  A child is neglected and left to fend for himself.  A parent engages in a campaign of hatred and denigration of the child’s other parent, persuading the child into a fused and encapsulated delusion that the parent is harmful and has done harmful things.

Q. Which of these are child abuse and which are not?

A. All of them are child abuse.

Q. Which of these should be punished and the child protected from suffering such harm?

A.  All of them.

Apparently not, according to the head of CAFCASS, who, in a somewhat bewildering statement to the Telegraph this week, tells us that parental alienation IS child abuse but that such abuse cannot be punished because doing so…


#UK #PublicLaw: Munro withdraws backing for ‘dangerous’ social care exemptions plan!

Professor Eileen Munro has withdrawn her backing for controversial ‘innovation’ powers in the Children and Social Work Bill after concluding they pose a “serious danger”. Munro, whose landmark review of child protection was published in 2011, said government plans to allow councils to seek exemptions from social care legal duties to test new ways of working […]

In a message to MPs, seen by Community Care, Munro said: “I have been reading the debates in Hansard and the submissions about the Social Work and Children Bill. I’ve also been meeting with some of those who oppose the bill and I have reached the conclusion that the power to have exemption from primary and secondary legislation creates more dangers than the benefits it might produce.

“I saw the exemption as allowing the opportunity to test new means of achieving the will of Parliament as expressed in the Children Act and related legislation. The projects would be in the spirit of the legislation and would not override the will of parliament.

“While I understand and respect the motivation of the current government, there is a serious danger in having such wide-reaching powers in statute. Some future Secretary of State might use them in ways that are completely contrary to the current intentions and consequently subvert the will of Parliament.”

Source: Munro withdraws backing for ‘dangerous’ social care exemptions plan

#BigBang to Civilization: 10 Amazing Origin Events!

Big Bang to Civilization: 10 Amazing Origin Events ~ Roger Briggs, LIVE SCIENCE.

One of the unheralded achievements of modern science is that it can now provide a coherent origin story for humanity, something that was not possible just a few decades ago. With new discoveries in astrophysics, evolutionary biology, molecular genetics, geology and paleoanthropology, a continuous story has emerged starting from the Big Bang. This is both a new cosmology that humanity is embedded in, and a grand tour of science. Here is one science-lover’s top 10 list of the coolest science underlying the human origin story, in chronological order.

There is so much I left out — for more, see “Journey to Civilization: The Science of How We Got Here” (Collins Foundation Press, 2013) and “As Myth Marries Science, the Origin Story Matters” (Op-Ed).

 

The Big Flash: Origin of the Cosmic Background Radiation

When the universe was about 380,000 years old it had cooled to about 3000 K, cool enough for electrons to attach to nuclei and form atomic matter in highly excited states. This produced a massive flux of photons near the visible range (typical of excited atoms) that filled the early universe. As the universe and space itself expanded, the wavelength of this light was stretched into the microwave range to become the Cosmic Microwave Background (CMB) that Penzias and Wilson inadvertently discovered in 1964.

George Gamow had predicted that a Big Bang should produce just such a background radiation, and the CMB became one of the first pieces of evidence supporting the Big Bang Theory. Since then, the study of the CMB with space-based instruments like COBE, WMAP, and now the Planck spacecraft continues to be a rich source of information about the early universe and its deepest structure.

 

End of the Dark Age: The First Stars are Born

After about 400 million years of expansion following the Big Bang, the universe was cool enough for gravity to begin coalescing clouds of hydrogen into stars, igniting nuclear fusion for the first time. The prodigious outpouring of radiation from the first stars marked the end of the Dark Age, and ionized nearby clouds of hydrogen. This re-ionization is the fingerprint of the first stars and can be seen in the spectral signatures of quasars, in the polarization of the CMB, and in the 21-centimeter emission line of hydrogen.

The birth of the first stars marked a turning point in the life of the universe: from here on the universe took on the features we see today, with galaxies full of stars surrounded by planetary systems. Stars perform some of the most important work in the cosmos: they manufacture the elements heavier than hydrogen, they create planets as part of their own formation, and they provide energy for those planets, as our own Sun does for us. We love stars!

 

The Solar System Forms: Unusual or Not?

Yellow, G-class stars like the sun are a dime a dozen throughout the universe, but only a fraction of them exist as single stars and, like our sun, contain all 92 naturally occurring elements. Astronomers now have strong evidence from exoplanet research that virtually all stars form planetary systems as a natural part of their own formation, and this agrees with current theories of star formation. But most of the planetary systems observed so far seem weird and inhospitable for life — for example, with planets the size of Jupiter orbiting much closer than Mercury orbits the sun, or five planets packed into a space smaller than Mercury’s orbit. Astronomers have yet to see a solar system that is neatly ordered like our own with a nice rocky planet located in the sweet spot for liquid water and life.

Just how special is Earth’s situation?

The media was recently abuzz when researchers estimated (PNAS, Nov. 26, 2013) that there could be 8 billion or 9 billion stars in our galaxy with Earth-like planets — about 5 percent of stars — making the odds very high for intelligent life elsewhere. Yet no life, or evidence of it, has ever been found beyond Earth, so the jury is still very much out on the questions of how rare or common the Earth is, and how unique humanity may or may not be.
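The headline figure can be cross-checked with back-of-envelope arithmetic. A hedged sketch, where the Milky Way star count of roughly 175 billion is an illustrative assumption (published estimates range from about 100 to 400 billion):

```python
# Sketch: sanity check of the "8 to 9 billion Earth-like planets" estimate.
# Assumed value (illustrative, not from the article):
#   total_stars = 175e9 -- rough Milky Way star count; estimates vary widely

total_stars = 175e9                # rough number of stars in the Milky Way
fraction_with_earthlike = 0.05     # "about 5 percent of stars," per the article

earthlike_count = total_stars * fraction_with_earthlike
print(f"Estimated Earth-like planets: {earthlike_count / 1e9:.2f} billion")
# ~8.75 billion -- consistent with the 8-9 billion figure quoted above.
```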

 

Life Begins

An enrichment of carbon-12, the lighter isotope that living organisms preferentially take up, in ancient rocks suggests that life began on Earth about 3.8 billion years ago. This means that DNA or some precursor molecule had assembled and could begin the relentless self-replication that drove the evolution of life. But how did such a fragile and complex molecule assemble?
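The carbon-12 evidence works because biology fractionates carbon isotopes: organic carbon ends up measurably depleted in the heavier carbon-13. Geochemists express this with the standard δ13C notation. A minimal sketch, where the sample ratio is a hypothetical value typical of biogenic carbon, not a measurement from the article:

```python
# Sketch: delta-13C, the standard measure of carbon isotope fractionation.
# delta13C = (R_sample / R_standard - 1) * 1000, in parts per thousand (permil).
# Assumed values (illustrative):
#   R_standard -- 13C/12C ratio of the VPDB reference standard
#   R_sample   -- hypothetical biogenic carbon, ~2.5% lighter than the standard

R_standard = 0.0112372                 # 13C/12C of the VPDB standard
R_sample = R_standard * (1 - 0.025)    # carbon depleted in 13C, as life produces

delta13C = (R_sample / R_standard - 1) * 1000
print(f"delta-13C: {delta13C:.1f} permil")  # -25.0, a typical biological signature
```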

Organic molecules have now been seen throughout the universe. They can be found in the spectral signatures of stars and gas clouds, and the Murchison meteorite that fell to Earth in 1969 contained 92 different amino acids, most never seen on Earth. However, it's a giant leap from amino acids to a living organism with a metabolic system that provides energy and a genetic system that stores information, directs the construction of proteins, regulates every function of the organism, and replicates itself, all enclosed within a membrane.

Could life have been seeded from elsewhere? Or maybe it was just the chance sloshing together of molecules that was bound to happen somewhere. Or could there be some fundamental organizing principle in the universe that drives matter toward complexity? We don’t know the answer, and the origin of life remains one of the greatest mysteries in all of science.

 

The Great Oxygen Catastrophe, Snowball Earth, and the Birth of Eukarya

Life on Earth faced its biggest survival crisis about 2.5 billion years ago when the atmosphere changed over from being carbon dioxide-dominant to oxygen-rich. Up to this time life was prokaryotic, or bacteria-like, and thrived on CO2. But photosynthesizing bacteria used up CO2 and produced poisonous O2 that eventually filled the atmosphere. To make matters worse, the drop in atmospheric CO2 plunged the Earth into a massive deep freeze that has come to be called the Snowball Earth event. Our planet was covered entirely with thick sheets of ice except near the equator, and life in the dark oceans was nearly driven to extinction. Yet somewhere in the midst of this two-headed crisis a new and more complex form of life emerged: Eukarya.

In 1967, Lynn Margulis was the first to recognize that some prokaryotic organisms were able to merge together in a cooperative arrangement she named endosymbiosis, which helped them survive the crisis. We now understand that the mitochondria in animal cells and the chloroplasts in plant cells were once individual organisms before they were engulfed to become eukaryotic organelles. They still carry primitive genomes from their days as prokaryotes. The emergence of eukaryotic life opened the door for all higher forms of life that would follow – including us!

 

Lucky for Us: The Cretaceous-Tertiary (KT) Extinction

For nearly 200 million years dinosaurs ruled the Earth. Then quite suddenly, about 65 million years ago, they disappeared. It was Luis and Walter Alvarez, father and son, who first explained in 1980 what happened to the dinosaurs. They found thin layers of iridium in rocks that dated to 65 million years ago, and since iridium is naturally very rare on Earth they proposed that its source was a meteorite that impacted the Earth at this time. Their theory was controversial at first and had many doubters, but other researchers began to find similar iridium layers in rocks from other locations on Earth, suggesting a global event, and finally a meteor crater was found near the Yucatan Peninsula that dated to about the same age.

The Chicxulub meteor must have been 10 to 15 kilometers across and traveling at tens of thousands of miles per hour when it impacted, triggering a global winter that was devastating for nearly all land plants and animals. But the small furry mammals that burrowed underground survived. They had been living in the shadows of the dinosaurs all along, but with the dinosaurs gone they could now thrive and grow in size. They became the new rulers of the Earth. Eventually the mammalian lineage evolved into primates, then apes, then hominids, and finally the Homo lineage that produced human beings. If not for the chance encounter with a meteor 65 million years ago, it is doubtful whether we would be here at all.

 

Tool Time: The Emergence of Homo

By about 5 million years ago certain apes in Africa had mastered the art of upright walking – these were the hominids. For the next few million years they roamed Africa as Australopithecus, with at least seven different member species. But sometime around 2.5 million years ago, a new lineage branched off from Australopithecus, now called the genus Homo. There is still considerable debate about how and when this divergence happened, and which hominid species were involved. But the most widespread view is that Homo habilis was the first member of this new lineage, and by about 2.0 million years ago an ancestor that we now call Homo erectus was thriving in East Africa and would soon begin to populate Europe and Asia.

Stone tools also appear in the fossil record for the first time about 2.5 million years ago near the fossil remains of early members of Homo. No other life form had ever manufactured and used stone tools, and passed along the art for many generations. This was the birth of true culture and the advent of technology. Paralleling the development of tools was a rapid expansion in brain volume, eventually tripling in Homo neanderthalensis and Homo sapiens by about 200,000 years ago. But the Neanderthals seemed to stall out in their development of technology, while Homo sapiens continued to evolve technology, culture, and consciousness, turning us into a near-geologic force on the Earth. In the end, every hominid species went extinct – except one. We are the last and only surviving hominid.

 

Friend or Foe: Who were the Neanderthals?

By perhaps 1 million years ago, the hominid living in Africa called Homo ergaster (also called Homo erectus) began evolving into a new species. This was Homo heidelbergensis, probably humanity’s most immediate ancestor. Some members of this new species stayed in Africa, while others exited Africa and made their way into Europe, where they show up in the fossil record by about 800,000 years ago. The European heidelbergensis population, adapting to the much colder conditions, evolved into Homo neanderthalensis by about 400,000 years ago.

The Neanderthals, who were large-brained and very intelligent, spread all over Central Asia and Europe, while their cousins who had stayed in Africa evolved into modern humans, Homo sapiens. By about 100,000 years ago, humans began to leave Africa for the first time, putting them on a collision course with the Neanderthals, first in Central Asia and later in Europe. The two species interacted in ways that are not yet fully understood.

In the end, by about 30,000 years ago, the Neanderthals retreated to Spain and Portugal, where they finally went extinct. There is no real evidence that humans killed the Neanderthals, and the similarity in their tools suggests some overlapping of cultures. But recently published genomic studies by Svante Pääbo of the Max Planck Institute and his colleagues reveal that some people of European descent today carry as much as 4 percent Neanderthal DNA, leaving no doubt that the two populations interbred somewhere along the way.

 

The Great Leap Forward

Jared Diamond popularized this descriptor in his book The Third Chimpanzee (1992), and Richard Klein called the phenomenon the "big bang of human culture" in The Dawn of Human Culture (2002). Archeologists have strong evidence from tool cultures pointing to an astounding leap in human intelligence between about 100,000 and 60,000 years ago.

Curiously, geneticists have also found that at about this time the total human population on Earth plummeted to perhaps just a few thousand individuals. It is not known exactly what caused this population bottleneck, but it would be impossible to ignore the eruption of Mount Toba 73,000 years ago on the island of Sumatra. This was the largest volcanic event in the last 30 million years, depositing 30-foot-thick layers of ash in India and triggering a global winter that may have lasted for a thousand years.

It seems that the humans living in East Africa who survived the Toba event were a new and better version of Homo sapiens, perhaps capable of spoken language and the powers of collaborative culture that it bestowed. These new humans, sometimes called behaviorally modern humans, were soon able to leave Africa and spread to every habitable continent on the Earth in a relatively short time, pushing the Neanderthals and all other remaining hominid species to extinction. No other living thing has had a fair chance to compete with us ever since the Great Leap.

 

The Advent of Civilization

For more than 2 million years our ancestors were nomadic hunter-gatherers. This changed for the first time about 11,500 years ago as Earth’s climate became warmer and milder.

People in the Middle East began experimenting with edible plants, selecting seeds from the best plants and planting them in protected areas. This type of gardening, called horticulture, required that people remain in one place to tend their crops, and gradually the nomadic lifestyle was replaced by more sedentary, permanent camps. Animals, too, were domesticated as companions, servants, or food sources. By about 10,000 years ago, large permanent settlements like Jericho and Catalhoyuk appear in the archeological record. These "proto-cities" were not yet true cities but more like disorganized collections of villages, with few signs of warfare, social stratification, concentrated wealth, rulers, or other markers of status. But another shift in human development on par with the Great Leap was in store.

By about 5,200 years ago, the first city-states appear in several locations throughout the Middle East. For the first time the archeological record shows clear evidence of social stratification and a ruling elite holding almost all the wealth and power. This was the advent of civilization.

With the invention of writing, human knowledge could be recorded permanently and controlled. Most of the characteristics of today's world now appeared, including centralized government and power, military forces and warfare, institutionalized religion, patriarchy, monetary systems, poverty, large-scale agriculture, trade networks, and empire. Civilization soon appeared independently at many other locations throughout the world, including China, India, Egypt, Peru, Crete, and Mexico. Not much of this has changed in the last 5,000 years except the names and places. But is this model still serving us well, or is humanity ready for something new, the next Great Leap?

The views expressed are those of the author and do not necessarily reflect the views of the publisher. This version of the article was originally published on LiveScience.

Roger Briggs is the author of “Journey to Civilization: The Science of How We Got Here” (Collins Foundation Press, 2013). In his book, he presents a new creation story of the universe, the Earth, life and humanity based on the evidence and skepticism of science. Briggs contributed this article to LiveScience’s Expert Voices: Op-Ed & Insights.