Uncommon Sense

politics and society are, unfortunately, much the same thing

Many of America’s seemingly benevolent programs succeed only in making people dependent

original article: Searching for Self-Reliance
May 30, 2017 by Edwin J. Feulner

When conservatives call for Congress to cut federal spending and shrink the size of government, they’re often portrayed as heartless.

On the contrary: We remember our heritage. We know there’s actually nothing “progressive” at all about the nanny state. Indeed, it’s regressive. It’s a betrayal of our history as a nation built on self-reliance.

We owe our republic, after all, to the energy and exertions of rugged individuals — pilgrims who crossed the perilous sea in frail ships to brave a wilderness, pioneers who slogged thousands of miles through hostile territory and prevailed against all odds.

They had no subsidies, no guarantees, no government help save for raw public land they painfully developed by hard labor. They shared what they had, helped one another, and took turns standing guard to protect against danger. They wanted to be free, and they built the freest country in history.

Self-reliance, Alexis de Tocqueville observed in his landmark work “Democracy in America,” was the organizing principle of American life, culture, and politics in the 19th century. Today, however, our nation seems to have reversed Tocqueville’s admiring formulation and become a nanny state in which more and more individuals depend on government to do not only what they can’t do for themselves, but far too much else.

Sure, there are plenty of hard-working Americans still around. But unlike our predecessors, many other present-day Americans show little or no interest in relying on their own mind and muscle to surmount obstacles. Since the 1930s, generations have grown up accustomed to depending on government as their first line of defense against not only serious trouble, but also the common vicissitudes of ordinary life.

Think of the chores we expect our public servants to perform with all the panache of brave first responders tackling a terrorist attack. If you lock your keys inside your car, can’t coax your cat down from a tree, or feel insulted by a surly cabdriver, what do you do? Many milquetoasts in 21st century America call 911 and demand action by some hapless fire company or overworked police department.

The nanny state has conditioned vast numbers of us to view nearly any setback as a federal case. If you can’t pay your debts, taxes or tuition; if you can’t afford health insurance, rebuild your beach house after a hurricane, or save your business from your own follies, never fear — some federal program will surely bail you out.

And you don’t have to be poor, friendless, handicapped or underprivileged to get that help. The bigger your business and the more egregious your errors, the more you can expect the feds to save you.

Americans have been sliding into dependency ever since the New Deal began federalizing everyone’s problems, and particularly since Lyndon Johnson launched his so-called “Great Society.” What fell by the wayside was the previous American way of dealing with adversity, the era when people in need turned to the civil society around them — the safety net of families, friends, churches, local doctors, and politicians.

All that changed with the proliferation of federal programs doling out benefits on an industrial scale. Federal involvement in everything from retirement (Social Security) and health care (Medicare and Medicaid) to education grew by leaps and bounds, making more and more Americans dependent on faceless bureaucrats they never meet.

It all adds up to a profound loss of the self-reliance that built this country and made it great. Many of our seemingly benevolent programs succeed only in weakening people and condemning them to endless dependency.

This is why conservatives want to cut government down to size. As President Reagan said in his first Inaugural Address, “It is not my intention to do away with government. It is, rather, to make it work — work with us, not over us; to stand by our side, not ride our back.”

Critics call that heartless. But to allow our present trajectory to continue unchecked is senseless. It’s time to change course — before it’s too late.

american, conservative, culture, freedom, government, history, ideology, right wing, unintended consequences


Yes, Democrats are still responsible for slavery, Jim Crow, and the KKK

I was listening to a guy talk about the unsavory history of American Democrats. An academic in the audience, also a Democrat, spoke up during the Q&A and castigated the speaker for suggesting modern Democrats are responsible for their party’s past. Elsewhere, on a forum unrelated to politics, I saw a post asking (while actually suggesting) whether Republicans were the party of racism.

On many occasions I’ve heard people argue in no uncertain terms that today’s Democrat party is not the same as it used to be. Democrats are a very different group of people today, the argument goes, so the modern party cannot honestly be held accountable for the evils of their predecessors.

And yet Republicans today are frequently blamed for slavery, Jim Crow, and the KKK – all of which were either defended or (in the case of Jim Crow and the KKK) invented by Democrats. If Democrats cannot be honestly held accountable for the sins of the past because the modern party is composed of different people, how can modern Republicans honestly be held accountable for the Democrats’ sins of the past?

The myth that the parties “switched sides” is constantly losing credibility, as it should. Certain arguments keep cropping up which rightly challenge that myth. As one example, consider the fact that the “not a person” argument was one of the chief defenses of slavery Democrats used in the past, and it is one of the chief defenses Democrats use today to defend abortion. Democrats never stopped playing semantic games with other people’s personhood. That game switched to a different target, but the victimizing continues. Deciding who is and who is not a person, and therefore who does and who does not have rights, is one of the fundamental tools of oppression Democrats have always used.

Contrast this with the conservative position on who does and who does not have rights. If you move to the United States legally and follow our rules, and join in the social compact we all have amongst ourselves as citizens, you can enjoy the rights and benefits of citizenship. If you move to our country and choose to break our laws and intentionally avoid becoming a citizen you don’t get to enjoy the rights and benefits of citizenship. Voting is not a human right, it is a citizen right. No one is denying an immigrant their status as a person by arguing they have no right to vote (despite Democrat protestations).

Another important detail is the fact that Democrats are not responsible for the vast majority of civil rights legislation passed by the U.S. government. From the war between the states through the 1950s, civil rights legislation was soundly the result of Republican efforts. Remember that incident in the early 20th century when the American military was racially segregated? Yeah, that was Woodrow Wilson, a Democrat, working against civil rights that had already been achieved up to that point in American history. And even for those pieces of legislation for which Democrats do claim credit, we are justified in asking why they should. After all, it was not Republicans who filibustered the 1964 Civil Rights Act; that was Democrats. So why should Democrats get credit for it today?

Besides, if Democrats of today ought not be held accountable for the sins of Democrats of the past, even if you wanted to argue Democrats deserve credit for the 1964 civil rights act, why should today’s Democrats get credit for it? On the other hand, if today’s Democrats do deserve credit for the virtues of Democrats half a century ago, they likewise deserve blame for those past sins.

Another reason Democrats can still be blamed for their past sins is that the notion those slavery-defending Democrats were conservative is a myth. This is why liberals/progressives have no choice but to define conservatism as wanting to maintain the status quo and opposing change. All power seeks to preserve itself. Communists, socialists, fascists, Democrats, Republicans, and all political groups who have power want to keep it. To admit this plain fact would endanger the modern narrative. Liberals/progressives have no problem blaming modern Republicans for the sins of the past, but they lose their minds if someone suggests Democrats should be held responsible for the sins of their own political party.

“Change” has always been a battle cry of tyrants, so conservatives are naturally skeptical of politicians promising change, or making promises of any kind. The liberal/progressive description of conservatism sees the political right as a group of people who want to maintain the status quo, to keep power structures as they are. But the conservative description of conservatism is quite different. In the American experiment, conservatism has always been leery of the abuse of power. That’s why, in order to “preserve” liberty, conservatives prefer to “conserve” power, to limit its concentration and avoid its overuse. Conservatives are glad to test new ideas, but not to blindly jump on board just because enthusiastic (or even violent and hateful) protesters demand change. Environmental activists work in a similar fashion: seeking to preserve the environment by conserving energy, avoiding its overuse or waste (but resorting to liberal/progressive tactics in seeking to control other people in the process).

It was not conservatives of the past who defended slavery. The abolitionists were the conservatives of the day. They viewed the abuse of power in legally robbing one group of people of their humanity as a threat to everyone, one that naturally put us all in danger of the same abuse. Looking at the long-term effects of the situation, conservatives realized that if our government can dehumanize one group, it can dehumanize another. They viewed this type of power, in a free society claiming to be founded on the notion of liberty, as abuse. And the abolitionists were right. Today, prenatal people are denied all rights because they are explicitly robbed of their very humanity. (And don’t forget that other incident when progressive Germans decided to play semantic games with personhood.)

In our modern era all manner of common ideas are construed as discrimination and oppression to help reinforce the idea of blaming Republicans for the past sins of slavery. It is said foreigners who are not citizens are denied their humanity because they are not allowed to vote (which could become their right if only they would become citizens). It is said gays are denied their humanity because they are not allowed to live together, to love who they want to love, or to have a ceremony. Actually, even before government usurped the religious institution of marriage (a violation of the separation between church and state, by the way) gays were already doing all these things in the United States. No one was stopping them. There are some people who want to deny the right of gays to do any of those things, or even to live, but if I told you who they were I’d probably be accused of Islamophobia.

When conservatives want reasonable justification for redefining the right to vote or the institution of marriage, we are accused of all sorts of evil things, and a lot of people believe those accusations because somehow conservatives are supposed to be responsible for slavery, so why wouldn’t Republicans do these other evil things, too? When conservatives ask why liberals, after telling us they wanted government out of the bedroom, now demand government enter the bedroom, we are supposed to simply cower and remain silent at the allegation of bigotry.

So there is political gain to be had in blaming Republicans for the past sins of Democrats. Democrats have a long track record, continuing even today, of dehumanizing others. But it is only Republicans who bear the blame for dehumanization. Misconstruing today’s issues as hate is the new norm. Anything progressives disagree with is labeled “fascism”. Then progressives act like fascists to “protest”. Silencing others, resorting to violence to do so, dehumanizing detractors, and bullying anyone who fails to toe the line is fascism. It is also the history and contemporary practice of Democrats.

Yes, both political parties are quite different today as compared to what they were 150, 100, or even 50 years ago. No, the parties didn’t “switch sides”. Since Democrats continue to play the same political games they have played all along, they deserve the blame for their own sins, especially since they fraudulently claim credit for any virtues of the past.

abuse, american, bigotry, civil rights, corruption, culture, Democrats, fraud, government, history, hypocrisy, ideology, indoctrination, left wing, liberalism, oppression, pandering, political correctness, progressive, propaganda, public policy, relativism, victimization


How Intellectuals Cover for Evil

original article: How Intellectuals Cover for Evil: Deconstruction
March 18, 2017 by Thomas McArdle

Alongside its unprecedented mass violence, the 20th century saw the rise and reign of the secular intellectual as false prophet and would-be führer. For such men, as historian Paul Johnson wrote:

The collective wisdom of the past, the legacy of tradition, the prescriptive codes of ancestral experience existed to be selectively followed or wholly rejected as his own good sense might decide.

Enter the villain of Stream columnist Jonathan Leaf’s powerful new play, Deconstruction, running through March 25 at the Theatre at Grand Hall (St. Mary’s Parish), 440 Grand Street, New York, N.Y., produced by Storm Theatre.

The Antwerp-born Paul de Man came to America after the Second World War and Blitzkrieged the study of literature by pioneering the postmodern theory of deconstruction — which, among other things, put morally-relativistic modern man in the place of a murdered God.

Pretending to be a Hero of the Anti-Nazi Resistance

De Man ultimately reached the zenith of academic prestige at Yale, becoming the single most influential literary critic in America — whose theories still deeply influence English classes at colleges today. But at the outset of Deconstruction, it’s summer 1949. He holds a menial job at a Grand Central bookshop, and finds himself the pitied guest of Catholic-turned-Marxist novelist and critic Mary McCarthy in her Rhode Island beach cottage.

Leaf’s drama speculates about the two married academics’ rumored affair.  McCarthy would secure de Man his first academic post at New York’s Bard College, an hour’s drive north of Vassar, where she was teaching. De Man doesn’t quite seduce McCarthy; it’s mutual. As she later admits, “anyone who strokes my ego after a few drinks too often can stroke other places.” He compliments her literary talent. She praises his conversational cleverness, and his brave service in the Belgian Resistance – except that, as we discover, the latter was a lie. Quite the contrary.


Jed Peterson as de Man is a fascinating near-reincarnation of Paul Henreid playing the sly, covert Nazi in Carol Reed’s 1940 thriller Night Train To Munich. De Man apes sincerity quite effectively, as he professes shame for seducing other women, then dwells on his tragic youth. At 17, he found his mother hanged on the anniversary of his brother being struck dead by a train. Yet soon after telling the tale, he does indeed lead McCarthy to bed.

In Leaf’s telling, McCarthy would eventually find herself expecting de Man’s child, leaving her third and current husband to think the child is his. After her miscarriage it would be her husband, not de Man, at her side. De Man would by this time be busy with a 21-year-old Bard student whom he had also impregnated.


Inventing New Forms of Relativism to Explain Away His Crimes

But this is the tip of the iceberg. No, de Man had not fought in the Resistance. In fact, he had served the Nazis. Some four years after de Man’s 1983 death, a Belgian scholar would discover more than 100 pro-Nazi articles de Man had published under his own byline in occupied Belgium during the war in the country’s leading newspaper, Le Soir. In one, he recommended a forced exodus of the Jews, remarking that Europe “would lose, in all, a few personalities of mediocre value” then continue in greatness.

De Man’s legion of deconstruction disciples would proclaim the revelations overblown. Literary scholar James Atlas noted in the New York Times in 1988, while the truth about de Man was still hitting the fan, that de Man’s Yale colleague Geoffrey Hartman minimized de Man’s offenses because they “didn’t begin to compare with the ‘vulgar anti-Semitic writing’ in other newspapers of the day.”

De Man would quit the pro-Nazi paper, but not necessarily for the right reasons. Two months after de Man’s departure Le Soir’s other literary critic was assassinated by the Resistance for being a Nazi collaborator.

Interrogated by Hannah Arendt

The play twists the knife when Leaf’s last character arrives — McCarthy’s friend, political theorist Hannah Arendt. A German Jew who grew up in Koenigsberg, she’d escaped death in the Holocaust thanks to falsified papers from a U.S. diplomat. To an audience, Karoline Fischer’s stern, straight-talking Arendt may be the least enchanting of the three characters, but that suits her harsh message of truth.

“That I managed to get out of Germany, then out of a detention camp — it’s because I’m not cowed. By anyone.” So she informs de Man in an unwelcome visit to his Bard office. “I want to know: who are you?”

But this far-and-away more honest intellectual already knows, having “made some inquiries in Belgium.”

“Tell me, did you deliver bombs for the Resistance? Is that true or a lie?” Arendt demands of de Man.


His blood-curdling response: “As a student of Heidegger, you of all people should know that the notion of objective truth is a philosophical concept. An abstraction. Neither more, nor less.”

“What Is Truth?”

De Man was taunting Arendt, aware that she’d once been both Heidegger’s student and his lover. (Heidegger’s blatant, public support for the Nazis even after the war has since dimmed his intellectual star a little.)

If there is no real truth, then why be good? Or, as de Man earlier asked McCarthy, “If we cannot prove God’s existence or the moral laws taken from antiquity, then what place is there for traditional morality? You do see the logic at least?”

The logic she sees – indeed keenly feels – is the soul-destroying vacuum of love and beauty that de Man leaves in his wake. As Mary McCarthy, Fleur Alys Dobbins, in the performance of the night, shifts jarringly from a feathery hedonism to ravaged victimhood.

“You know, Paul, I spent hours thinking of baby names, painting the child’s room different colors in my mind. Wondering: a girl or a boy, which would you like?” she cries in her pain. When de Man claims, “I’m ashamed,” Arendt counters, “You have no shame,” then reveals, “one of the inquiries I made told me something that didn’t entirely surprise me: you wrote for a Nazi newspaper.”

The real difference between de Man and McCarthy?  She admits, “I know I’m a fraud,” but “I want to be good.”

The year he died, de Man would write, “I am not given to retrospective self-examination, and mercifully forget what I have written with the same alacrity I forget bad movies … although, as with bad movies, certain scenes or phrases return at times to embarrass and haunt me like a guilty conscience.” Atlas noted that, writing on Rousseau, de Man had even claimed we can never distinguish between “fictional discourse and empirical event,” which “makes it possible to excuse the bleakest of crimes.”

Leaf’s deconstruction of the de Man myth ends with McCarthy (“some Marxist, I am!”) repeating aloud a prayer to the Virgin Mary. In the words of Whittaker Chambers, the Communist spy who turned Christian, Deconstruction’s audience discovers that “man without mysticism is a monster.”

abuse, anti-religion, atheism, culture, education, elitism, ethics, history, ideology, left wing, liberalism, marxism, philosophy, progressive, relativism


White guilt gave us a mock politics based on the pretense of moral authority

original article: The Exhaustion of American Liberalism
March 5, 2017 by SHELBY STEELE

The recent flurry of marches, demonstrations and even riots, along with the Democratic Party’s spiteful reaction to the Trump presidency, exposes what modern liberalism has become: a politics shrouded in pathos. Unlike the civil-rights movement of the 1950s and ’60s, when protesters wore their Sunday best and carried themselves with heroic dignity, today’s liberal marches are marked by incoherence and downright lunacy—hats designed to evoke sexual organs, poems that scream in anger yet have no point to make, and a hysterical anti-Americanism.

All this suggests lostness, the end of something rather than the beginning. What is ending?

America, since the ’60s, has lived through what might be called an age of white guilt. We may still be in this age, but the Trump election suggests an exhaustion with the idea of white guilt, and with the drama of culpability, innocence and correctness in which it mires us.

White guilt is not actual guilt. Surely most whites are not assailed in the night by feelings of responsibility for America’s historical mistreatment of minorities. Moreover, all the actual guilt in the world would never be enough to support the hegemonic power that the mere pretense of guilt has exercised in American life for the last half-century.

White guilt is not angst over injustices suffered by others; it is the terror of being stigmatized with America’s old bigotries—racism, sexism, homophobia and xenophobia. To be stigmatized as a fellow traveler with any of these bigotries is to be utterly stripped of moral authority and made into a pariah. The terror of this, of having “no name in the street” as the Bible puts it, pressures whites to act guiltily even when they feel no actual guilt. White guilt is a mock guilt, a pretense of real guilt, a shallow etiquette of empathy, pity and regret.

It is also the heart and soul of contemporary liberalism. This liberalism is the politics given to us by white guilt, and it shares white guilt’s central corruption. It is not real liberalism, in the classic sense. It is a mock liberalism. Freedom is not its raison d’être; moral authority is.

When America became stigmatized in the ’60s as racist, sexist and militaristic, it wanted moral authority above all else. Subsequently the American left reconstituted itself as the keeper of America’s moral legitimacy. (Conservatism, focused on freedom and wealth, had little moral clout.) From that followed today’s markers of white guilt—political correctness, identity politics, environmental orthodoxy, the diversity cult and so on.

This was the circumstance in which innocence of America’s bigotries and dissociation from the American past became a currency of hardcore political power. Barack Obama and Hillary Clinton, good liberals both, pursued power by offering their candidacies as opportunities for Americans to document their innocence of the nation’s past. “I had to vote for Obama,” a rock-ribbed Republican said to me. “I couldn’t tell my grandson that I didn’t vote for the first black president.”

For this man liberalism was a moral vaccine that immunized him against stigmatization. For Mr. Obama it was raw political power in the real world, enough to lift him—unknown and untested—into the presidency. But for Mrs. Clinton, liberalism was not enough. The white guilt that lifted Mr. Obama did not carry her into office—even though her opponent was soundly stigmatized as an iconic racist and sexist.

Perhaps the Obama presidency was the culmination of the age of white guilt, so that this guiltiness has entered its denouement. There are so many public moments now in which liberalism’s old weapon of stigmatization shoots blanks—Elizabeth Warren in the Senate reading a 30-year-old letter by Coretta Scott King, hoping to stop Jeff Sessions’s appointment as attorney general. There it was with deadly predictability: a white liberal stealing moral authority from a black heroine in order to stigmatize a white male as racist. When Ms. Warren was finally told to sit, there was real mortification behind her glaring eyes.

This liberalism evolved within a society shamed by its past. But that shame has weakened now. Our new conservative president rolls his eyes when he is called a racist, and we all—liberal and conservative alike—know that he isn’t one. The jig is up. Bigotry exists, but it is far down on the list of problems that minorities now face. I grew up black in segregated America, where it was hard to find an open door. It’s harder now for young blacks to find a closed one.

This is the reality that made Ms. Warren’s attack on Mr. Sessions so tiresome. And it is what caused so many Democrats at President Trump’s address to Congress to look a little mortified, defiantly proud but dark with doubt. The sight of them was a profound moment in American political history.

Today’s liberalism is an anachronism. It has no understanding, really, of what poverty is and how it has to be overcome. It has no grip whatever on what American exceptionalism is and what it means at home and especially abroad. Instead it remains defined by an America of 1965—an America newly opening itself to its sins, an America of genuine goodwill, yet lacking in self-knowledge.

This liberalism came into being not as an ideology but as an identity. It offered Americans moral esteem against the specter of American shame. This made for a liberalism devoted to the idea of American shamefulness. Without an ugly America to loathe, there is no automatic esteem to receive. Thus liberalism’s unrelenting current of anti-Americanism.

Let’s stipulate that, given our history, this liberalism is understandable. But American liberalism never acknowledged that it was about white esteem rather than minority accomplishment. Four thousand shootings in Chicago last year, and the mayor announces that his will be a sanctuary city. This is moral esteem over reality; the self-congratulation of idealism. Liberalism is exhausted because it has become a corruption.

american, bias, bigotry, corruption, culture, Democrats, discrimination, diversity, extremism, government, history, ideology, indoctrination, left wing, liberalism, oppression, pandering, philosophy, political correctness, politics, progressive, propaganda, racism, relativism, unintended consequences


If Black Genocide were shown on BET, Black Lives Matter would be attacking abortion clinics

original article: One of Margaret Sanger’s Pals Ran a Concentration Camp That Killed Black People
October 14, 2016 by JASON JONES & JOHN ZMIRAK

It’s a pro-life commonplace that The American Birth Control League, founded by Margaret Sanger 100 years ago and later rechristened Planned Parenthood, had ties to eugenicists and racists. This is not quite right. It’s like saying that the NBA has ties to professional sports. The birth control movement and the eugenics movement were the same movement — to the point where Margaret Sanger twice tried to merge her organization with major eugenics groups.

One eugenics expert, Eugen Fischer, whom Sanger featured as a speaker at a population conference she organized, had already run a concentration camp — in German-ruled Southwest Africa, before World War I, where he murdered, starved and experimented on helpless native Africans. It was Fischer’s book on eugenics, which Hitler had read in prison, that convinced Hitler of its central importance. Another longtime official of Planned Parenthood, Garrett Hardin, had a decades-long track record of serving in eugenics organizations, and as late as the 1980s was calling for mass forced sterilization of Americans as a necessary solution to the “population problem.”

The same people served on the boards of the American Eugenics Society and Sanger’s organizations for decades, and they worked closely together on countless projects — ranging from researching the birth control pill as a means of diminishing the African-American birth rate (they tested the early, hazardous versions of the Pill on impoverished rural women in Puerto Rico), to passing forced sterilization or castration laws in more than a dozen states that targeted blacks and other poor people accused of “feeble mindedness” or “shiftlessness” and diagnosed as “unfit” parents. Today, Planned Parenthood sets up its centers in America’s poorest neighborhoods, and continues to target the same populations via abortion.

Maafa 21: Black Genocide

That’s the appalling truth uncovered in a neglected 2014 documentary which we feature here at The Stream as part of our #100forLife campaign. Maafa 21: Black Genocide gets its odd title from the Swahili word for slavery, and it is this film’s contention that the eugenics movement in America began in the panic which white racists felt at the end of slavery over what should be done to solve what some called the “Negro problem.” It’s a long, harrowing film, which you should watch in small doses — treating it as a miniseries. And keep a box of Kleenex handy, because you will weep.

Produced by the pro-life apostolate Life Dynamics with a mostly black cast of narrators and commentators, this film claims that Planned Parenthood and other organizations and government programs that target the poor and try to block their reproduction are the 21st century’s answer to the Ku Klux Klan — which was founded by white Southern elites to keep down the “unruly” ranks of freed black slaves.

It’s a shocking assertion, but one that the filmmakers prove beyond the shadow of a reasonable doubt, citing name after name, giving racist quote after racist quote, showing that Sanger personally approved the publication of outrageous and cruel claims of the genetic inferiority of millions of Americans, especially blacks, and calling for their forced sterilization, and the cut-off of welfare benefits and even private charity, to stop the “unfit” from reproducing themselves. Then she took part in promoting policies that turned this evil, utopian program of social engineering into binding American laws. One of the leading advocates for the legalization of abortion in the 1960s and 70s was Planned Parenthood, run by her appointees and later by her grandson, Alexander Sanger.

Margaret Sanger Worked with White Supremacists for Decades

The board of Margaret Sanger’s organization and others where she served as an officer, the authors she published in The Birth Control Review, the conferences she sponsored, and the people to whom Planned Parenthood gave awards well into the 1960s and 70s, are a Who’s Who of the ugliest, most paranoid, misanthropic elitists and white racists of the 20th century — apart from those who were thankfully hanged at Nuremberg. After those trials, when “eugenics” had acquired a well-deserved taint, these same American elitists used the exaggerated threat of “overpopulation” to peddle the desperate need to control other people’s fertility, if need be by forced sterilization — a policy which Sanger had advocated since 1934.

The eugenicists, self-appointed experts on human quality of life, had peddled their theories not just in Britain and America but in Germany, where they helped to directly inspire Nazi sterilization and extermination programs aimed at the handicapped, Jews, and the small population of black or mixed race Germans — children of French colonial troops whom Hitler considered a grave menace to “Aryan” racial “hygiene.” One of Sanger’s regular authors in The Birth Control Review wrote in a U.S. newspaper in the 1930s defending the forced sterilization of such mixed-race children, for the sake of Germany’s “health.”

Hitler’s Bible, by Sanger’s Friend

Friends and associates of Sanger (such as Harry Laughlin) accepted awards from Nazi-controlled universities, visited with Hitler and Himmler, and boasted that the forced sterilization programs which they had instituted in America were used as models by the Germans. One author who served on Sanger’s board and published regularly in The Birth Control Review was Lothrop Stoddard, a high official of the Massachusetts Ku Klux Klan, whose book The Rising Tide of Color Against White World-Supremacy Adolf Hitler cited in Mein Kampf as “my bible.”


Nor were the eugenicists isolated cranks. Their ranks included Harvard professors, mainline Protestant clergymen, prominent conservationists for whom entire animal species are named, and Gilded Age plutocrats. Much of the funding for eugenics organizations came from the Carnegie Corporation and the Rockefeller Foundation.

Supreme Court justice Oliver Wendell Holmes, writing his opinion that the forced sterilization of a supposedly “feeble-minded” woman in Virginia was constitutional, infamously declared that “three generations of imbeciles are enough.” His views were echoed by President Teddy Roosevelt, as the film proves with quotations. It also recounts how a Sanger ally, Madison Grant, a prominent Darwin apostle and eugenicist, helped to exhibit Ota Benga, an African pygmy, in a cage with an orangutan for ten days at New York City’s Bronx Zoo, to “illustrate evolution.” Mr. Benga took his own life ten years later.

The eugenicists’ arrogant certainty that, because they had inherited money and power, they were genetically superior to the rest of the human race, found in Charles Darwin’s theories an ideal pretext and a program: to take the survival of the fittest and make it happen faster, by stopping the “unfit” from breeding. The goal, in Margaret Sanger’s own words, was “More Children from the Fit, Fewer from the Unfit.” Instead of seeing the poor as victims of injustice or targets for Christian charity, the materialism these elitists took from Darwin assured them that the poor were themselves the problem — that they were inferior, deficient and dangerous down to the marrow of their bones.

“Feeble-Minded” and “Shiftless” Blacks

The targets of this campaign in America were the poor, the unemployed, and non-English-speaking immigrants, but most of all African-Americans. This vulnerable population, composed largely of ex-slaves and their children, was identified in the 1880s as a “threat” to the “racial health” and progress of the United States by followers of Francis Galton — first cousin of Charles Darwin, heir to a slave-trading fortune, and inventor of the “science” of eugenics. These people had been exploited for centuries as free labor, denied education for fear of fomenting rebellion, and excluded from most of the economy. Now the eugenicists blamed the victims, black Americans, for their desperate social conditions, claiming that those conditions were the natural result of blacks’ “defective germ plasm,” which posed a threat to America akin to a deadly virus.

The forced sterilization laws which Sanger and her allies passed were used to sterilize at least 60,000 Americans, but perhaps as many as 200,000, on the pretext that young women who became pregnant out of wedlock were “feeble-minded,” “immoral” or “socially useless” parasites — all rhetoric that Sanger personally used in her books, articles, and at least one speech before a Ku Klux Klan rally, as she recounts in her memoir.


Maafa 21 interviews Elaine Riddick, who was raped at age 13 and became pregnant. As she lay in the hospital waiting to deliver the baby, welfare officials from the state of North Carolina warned her illiterate grandparents that if they didn’t sign the consent form to have her irreversibly sterilized, the state would cut off their welfare benefits. They scrawled an “X” on the government form, and Elaine was sterilized without her knowledge. She only learned what had been done to her five years later, when welfare officials explained that she was too “feeble-minded” to care for a child or, as she recounts, “even tie my own shoes.” Elaine was sterilized in 1968. The last such “eugenic” forced sterilization in the U.S. took place in 1983.

While Elaine never went to high school, she went on to finish college, and the one child which the United States government had permitted her to have — Tony Riddick, a child of rape — now runs his own successful company. Harry Laughlin, the eugenicist who helped pass the law that sterilized Elaine, died without any children.

Filed under: abortion, abuse, bullies, elitism, ethics, eugenics, extremism, feminism, government, hate crime, history, ideology, left wing, nanny state, oppression, progressive, public policy, racism, racist, scandal, tragedy, victimization, video

Where culture, politics, and religion meet in America

original article: Tocqueville and Democracy’s Fall in America
January 19, 2017 by Samuel Gregg

For Alexis de Tocqueville, American democracy’s passion for equality was a potentially fatal flaw—one that religion could help address. But what happens when religion also becomes preoccupied with equality?

Over the past year, lots of people, I suspect, have been reading Alexis de Tocqueville’s Democracy in America (1835/1840) as they ask themselves how the United States could have found itself having to choose in 2016 between two of the most unpopular candidates ever to face off for the office of president.

Historical factors contributed to America reaching this political point. These range from profound inner divisions characterizing American conservatism to deep frustration with the political class, as well as preexisting philosophical, cultural, and economic problems that have become more acute.

Tocqueville, however, recognized that such problems are often symptoms of subterranean currents that, once in place, are hard to reverse. A champion of liberty, Tocqueville was no determinist. He nevertheless understood that once particular habits become widespread in elite and popular culture, the consequences are difficult to avoid. In the case of democracy—perhaps especially American democracy—Tocqueville wondered whether its emphasis on equality might not eventually make the whole thing come undone.

The Passion for Equality

When Democracy in America’s second volume appeared in 1840, many reviewers noted that it was more critical of democracy than the first volume. In more recent times, Tocqueville’s warnings about democracy’s capacity to generate its own forms of despotism have been portrayed as prefiguring a political dynamic associated with the welfare state: i.e., people voting for politicians who promise to give them more things in return for which voters voluntarily surrender more and more of their freedom.

This very real problem, however, has distracted attention from Tocqueville’s interest in the deeper dynamic at work. This concerns how democracy encourages a focus on an equality of conditions. For Tocqueville, democratic societies’ dominant feature is the craving for equality—not liberty. Throughout Democracy in America, equality of conditions is described as “generative.” By this, Tocqueville meant that a concern for equalization becomes the driving force shaping everything: politics, economics, family life . . . even religion.

Democracy’s emphasis on equality helps to break down many unjust forms of discrimination and inequality. Women gradually cease, for instance, to be regarded as inherently inferior. Likewise, the fundamental injustice of slavery becomes harder and harder to rationalize.

At the same time, as Tocqueville scholar Pierre Manent has observed, democracies gravitate toward a fascination with producing total equality. Democracy requires everyone to relate to each other through the medium of democratic equality. We consequently start seeing and disliking any disparity contradicting this equality of conditions. Equality turns out to be very antagonistic to difference per se, even when differences are genetic (such as between men and women) or merited (some are wealthier because they freely assume more risks). But it’s also ambivalent about something that any society needs to inculcate among its members: virtue.

The idea of virtue implies that there are choices whose object is always good and others that are wrong in themselves. Courage is always better than recklessness and cowardice. But language such as “better than,” or “superior to” is intolerable to egalitarianism of the leveling kind. That’s one reason why many people in democratic societies prefer to speak of “values.” Such language implies that (1) all values are basically equal, and (2) there’s something impolite if not downright wrong with suggesting that some purportedly ethical commitments are irrational and wrong.

But in such a world, who am I to judge that some of the values espoused by, say, Bernie Sanders, Donald Trump, Nancy Pelosi, Hillary Clinton, or any other political figure for that matter, might reflect seriously defective evaluations of right and wrong? All that would matter is that “they have values.” The truth, however, is that democracies don’t need “people with values.” They require virtuous people: individuals and communities whose habits of the heart shape what Tocqueville called the “whole mental and intellectual state” of a people as they associate together, pursue their economic self-interest, make laws, and vote.

The Religion of Egalitarian Sentimentalism

At the best of times, living a virtuous life is difficult. This is especially true when a fixation with equality makes many people reluctant to distinguish between baseness and honor, beauty and ugliness, rationality and feelings-talk, truth and falsehood. Much of Democracy in America consequently seeks to show how democratic societies could contain their equalizing inclinations.

Some of Tocqueville’s recommendations focus on constitutional restraints on government power. He understood that the political regime’s nature matters. But Tocqueville also believed that the main forces that promoted virtue, and that limited the leveling egalitarianism that relativizes moral choices, lay beyond politics. In America’s case, he observed, religion played an important role in moderating fixations with equality-as-sameness.

Tocqueville didn’t have just any religion in mind. He was specifically concerned with Christianity. For all the important doctrinal differences marking the Christian confessions scattered across America in Tocqueville’s time, few held to relativistic accounts of morality. Words like “virtue,” “vice,” “good,” and “evil” were used consistently and had concrete meaning.

Christianity did underscore a commitment to equality insofar as everyone was made in the imago Dei and was thus owed equality before the law. This conviction helped to secure slavery’s eventual abolition. Nevertheless, Christianity in America also emphasized another quintessentially Christian theme: freedom — political, economic, and religious. In the United States, the word “liberty” wasn’t associated with the anti-Christian violence instinctively linked by European Christians with the French Revolution.

Religions, however, aren’t immune to the cultures in which they exist. So what happens if a religion starts succumbing to the hunger for equalization that Tocqueville associated with democratic ways? Most often, such religions begin abandoning their distinctiveness, as self-evidently false propositions such as “all religions are the same” take hold. Truth claims and reasoned debate about religious and moral truth are relegated to the periphery. Why? Because trying to resolve them would mean affirming that certain religious and moral claims are false and thus unequal to those that are true.

When Christians go down this path, the inevitable theological void is filled by a sentimentalism that arises naturally from egalitarianism. God is condensed to the Great Non-Judge in the Sky: a nice, harmless deity who’s just like us. Likewise, such Christians increasingly take their moral cues from democratic culture. The consequent emphasis on equality-as-sameness doesn’t just mean that liturgy and doctrine are reduced to inoffensive banalities. The horizons of Christian conceptions of justice also shrink to the abolition of difference. The truth that many forms of inequality are just, including in the economic realm, is thus rendered incomprehensible. In the end, Christian confessions that embrace such positions collapse into pale facsimiles of secular egalitarianism and social justice activism.

A Fatal Combination?

These religions are incapable of performing the role that Tocqueville thought was played by many religious communities in the America he surveyed in the early 1830s. Of course, the object of religion isn’t to provide social lubrication. Religion is concerned with the truth about the divine, and living our lives in accordance with the truth about such matters. However, if religion ceases to be about truth, its capacity to resist (let alone correct) errors and half-truths such as “values-talk,” or justice’s reduction to equality-as-sameness, is diminished.

There’s no shortage of evidence of just how far large segments of American religious opinion have drifted in this direction. We have political operatives demanding, for example, “a little democracy and respect for gender equality in the Catholic Church” — as if the dogmatic and doctrinal truths proclaimed by a 2000-year-old universal church should be subordinated to a twenty-first-century progressive American conception of equality. Plenty of older Protestant, Catholic, and Eastern Orthodox clergy offer political commentaries that owe more to John Rawls’s A Theory of Justice than to C.S. Lewis, Aquinas, the Church Fathers, or Christ. For many American Jews, Jewish faith and identity is the pursuit of progressive politics. Such religions cannot speak seriously about virtue (or much else) in the face of the relentless drive for equalization in democracy that so worried Tocqueville.

Politics is clearly shaped by culture. Yet at any culture’s heart is the dominant cultus. America’s ability to resist democratic equalization’s deadening effects on freedom requires religions that are not consumed by the obsession with equality that Tocqueville thought might be democracy’s fatal flaw. For Tocqueville, part of America’s genius was that religion and liberty went hand in hand. In the next few years, America is going to discover whether that’s still true.

Filed under: culture, freedom, government, history, politics, religion, unintended consequences

Flawed anthropology leads to flawed economics

original article: We’re all Dead: How J.M. Keynes – And His Critics – Went Wrong
June 29, 2016 by Liz Crandell

“Critics of John Maynard Keynes were so determined his economics were wrong that they allowed Keynes to dictate the terms of the debate,” says Victor Claar, professor of economics at Henderson State University, in his Acton University lecture. He goes on to describe Keynes’ flawed anthropology, contrasting it with that of the classical economists and their reading of the Great Depression. The classical economists’ key observations of human nature include the principles of work, property, exchange, and the division of labor: we can survive and prosper, take ownership of our work, support and rely on one another through exchange, and specialize at an opportunity cost. Furthermore, these observations are linked to moral imperatives.

Work allows us to combat sloth, practice good stewardship, serve other people, and provide richer options for all. Keynes, who was focused on how consumption worked rather than on what human life should look like, did not understand these things. Maynard, like his father Neville, was a great proponent of the Cambridge method and of the distinction between positive and normative economics laid out by John Stuart Mill. The great legacy and wide scope of this method still exists today, as most economists continue to steer clear of normative statements and try to stick to positive, descriptive claims. However, the problems we face — poverty, unemployment, and development — are by their very nature normative issues.

Supporters of Keynes’ theories use the Great Depression and post-World War eras as evidence of their effectiveness. Claar offers insight into the attractiveness of such policies: the Depression created pessimism about the ability of market forces to self-correct, and since government management had worked “reasonably well” after World War I, state management became tempting again. There is fault in this, since Keynes “focuses on the inherent instability of the market and the need for active policy intervention to achieve full employment of resources and sustained growth.” Keynes maintains that recessions and high unemployment occur because firms and consumers in the private sector, insecure and nervous about the future, do not spend enough on new capital, equipment, goods, and services. As such, the remedy lies in the public sector, with the government spending using deficit financing if necessary. Ideally, after people get back to work, revenues will increase and the budget will balance once more. The obvious downside to this thought is that reducing pain in the short run — putting a band-aid on the problem — leads to inflation and slower rates of long-term growth. Claar draws students’ attention to a revealing quote from Keynes that creates a moral dilemma: “In the long run, we’re all dead.” Keynes is perfectly happy to allow future generations to pay off the debt that his policies create.

Claar concludes there are three keys to understanding Keynes: the classical model’s predicted equilibria are mere special cases and are rarely satisfied in practice; hubris, or the belief that the State is more capable of managing the economy than we ourselves are; and the notion that consumption is the purpose of all economic activity. This flawed anthropology “leads to flawed economics,” and it “caught hold in the same period that men and women of science began to believe that systematic management of human beings was both possible and useful in all areas of society.” Keynes himself declared eugenics to be “the most important, significant and, I would add, genuine branch of sociology which exists.” Claar leaves students with a hopeful message that we can combat this dangerous line of thinking with well-functioning markets that let prices send strong signals to all of us regarding where our services may be needed most by others; clearly defined and enforced property rights that lead to good stewardship; and influential institutions, such as churches and families, to share wisdom.

Filed under: bias, economics, elitism, eugenics, government, history, ideology, nanny state, philosophy, progressive

2 fatal mistakes made by Roe v. Wade

original article: 2 fatal mistakes made by Roe v. Wade
January 18, 2016 by KRISTI BURTON BROWN

Roe v. Wade has been the most fatal judicial decision in U.S. history. In the aftermath of Roe, 58 million babies have been aborted, while countless women have been irreparably damaged and families have been harmed and torn apart.

Roe was based on multiple mistakes, direct lies, and a rejection of accurate science, research, and the real Constitution. (Even Ruth Bader Ginsburg agrees that Roe was “heavy-handed judicial intervention [which] was difficult to justify.”) However, there are two particularly damaging mistakes – one made by the justices and one by the attorney who argued the case.

FATAL MISTAKE #1: The justices completely missed the intent of the 14th Amendment.

The justices behind Roe wrote that there was no constitutional basis for protecting preborn life. They rejected the 14th Amendment as a basis for protecting the preborn, even though it recognizes the right to life and equal protection for all persons.

They entirely failed to recognize the specific intent of the Congressional sponsors of the 14th Amendment. The intent – a key part of interpreting law – shows that the sponsors wanted to include future vulnerable and oppressed human beings in constitutional equal protection.

Representative John Bingham, a House sponsor, intended the Amendment to be applied universally – to any and every human being.[1] In a speech to Congress, prior to the passage of the 14th Amendment, he declared that the Constitution is “based upon the equality of the human race. Its primal object must be to protect each human being…”[2]

Senate sponsor Jacob Howard agreed that “the measure would apply to even the ‘humblest, the poorest, the most despised of the human race.’”[3] Representative H.D. Scott stated: “The strength of this Government…is in its willingness as well as ability to do equal and exact justice to every human being…”[4] He condemned justice being “made subservient to interest” and when the strong “can prey upon the weak and unfortunate with impunity.”[5]

These statements applied to Black Americans at the time the 14th Amendment was passed, and they apply to the preborn now, just as they did on January 22, 1973. The Roe Court would have done well to recognize this clear and constitutional truth.

FATAL MISTAKE #2: The lawyer who argued Roe believed women needed abortion to be successful.

There are certainly things to admire about Sarah Weddington, the 26-year-old lawyer who successfully argued Roe v. Wade before the U.S. Supreme Court. At a very young age, she took on the entire nation to advocate for something she believed in. She didn’t let her age, gender, or inexperience stop her.

However, besides the fact that Weddington was on the completely wrong side of a human justice issue, she also has a sad story in her personal history. Before she married Ron Weddington, she became pregnant with their child in her final year of law school. Neither of them wanted children, so the couple traveled over the border to Mexico for an illegal abortion.

Weddington cites her ability to have an abortion as the reason she went on to have a career as a lawyer, and yet countless successful female attorneys have proven Weddington’s assertion wrong. Women are, in fact, able to be successful and to be mothers.

Erika Bachiochi, a feminist and former pro-choice attorney, authored “Embodied Equality: Debunking Equality Arguments for Abortion Rights” for the Harvard Journal of Law & Public Policy. She also wrote about the real truth on abortion and women for CNN:

As a one-time abortion rights supporter, I well know the temptation to see the right to abortion as a representation of women’s equality. …

Abortion betrays women by having us believe that we must become like men — that is, not pregnant — to achieve parity with them, professionally, socially, educationally. …

When we belittle the developing child in the womb, a scientific reality that most pro-choice advocates have come to admit, we belittle and distort that child’s mother. We make her out to be one with property rights over her developing unborn child (much as husbands once had property rights over their wives).

We give her the inhumane (but for 42 years, constitutionally protected) right to decide the fate of another human being, of a vulnerable child — her child — to whom she properly owes an affirmative duty of care. We do all this rather than offering her the myriad familial and social supports she needs, whatever her situation, and cherishing her role in the miracle of human life.

Conclusion

While these two fatal mistakes continue to cost millions of lives, we can each personally work to stop the damage. In our conversations with friends, on social media, on campus, and at our offices, clubs, churches, and groups, we can spread the truths that every human being – at every stage of development – deserves equal protection and that no woman needs to take her child’s life to succeed at life. We can actively and practically help women who make the choice for life.

As Carol Tobias, President of the National Right to Life Committee, says: “As long as abortion is legal, pro-lifers will fight and never give up.”

Sources:
[1] CONG. GLOBE, 34th Cong., 3rd Sess. (1857)
[2] Id.
[3] SENATOR JACOB HOWARD, SPEECH INTRODUCING THE FOURTEENTH AMENDMENT, Speech delivered in the U.S. Senate, May 23, 1866
[4] CONG. GLOBE, 34th Cong., 3rd Sess. (1857)
[5] Id.

Filed under: abortion, culture, feminism, history, ideology, indoctrination, judiciary, law, lies, progressive, propaganda, science

This is what made George Washington ‘greatest man in the world’

original article: This is what made George Washington ‘greatest man in the world’
November 2, 2015 by BILL FEDERER

After the victory over the British at Yorktown, many of the Continental soldiers grew disillusioned with the new American government, as they had not been paid in years. The Continental Congress had no power to tax to raise money to pay them.

A disgruntled group of officers in New York met and formed what became known as the Newburgh Conspiracy. They plotted to march on Congress and force it to give them back pay and pensions. With some British troops still remaining on American soil, a show of disunity could have easily renewed the war.

On March 15, 1783, General George Washington surprised the conspiracy by showing up at their clandestine meeting in New York. Washington gave a short but impassioned speech, urging them to oppose anyone “who wickedly attempts to open the floodgates of civil discord and deluge our rising empire in blood.”

Taking a letter from his pocket, Washington fumbled with a pair of reading glasses, which few men had seen him wear, and said: “Gentlemen, you will permit me to put on my spectacles, for I have not only grown gray but almost blind in the service of my country.”

Washington concluded his Newburgh address, March 15, 1783: “And let me conjure you, in the name of our common Country, as you value your own sacred honor … to express your utmost horror and detestation of the Man who wishes … to overturn the liberties of our Country, and who wickedly attempts to open the flood Gates of Civil discord, and deluge our rising Empire in Blood. By thus determining … you will defeat the insidious designs of our Enemies, who are compelled to resort from open force to secret Artifice. You will give one more distinguished proof of unexampled patriotism and patient virtue. … You will … afford occasion for Posterity to say, when speaking of the glorious example you have exhibited to Mankind, ‘had this day been wanting, the World had never seen the last stage of perfection to which human nature is capable of attaining.’”

Many present were moved to tears as they realized the sacrifice Washington had made for the opportunity of beginning a new nation completely free from the domination of a king. With this one act by George Washington, the conspiracy collapsed.

Major General David Cobb, who served as aide-de-camp to General George Washington, wrote of the Newburgh affair in 1825: “I have ever considered that the United States are indebted for their republican form of government solely to the firm and determined republicanism of George Washington at this time.”

The crisis was resolved when Robert Morris issued $800,000 in personal notes to the soldiers, and the Continental Congress gave each soldier a sum equal to five years’ pay in highly speculative government bonds, which were redeemed by the new Congress in 1790. Six months later the Treaty of Paris was signed, officially ending the war.

George Washington wrote to General Nathanael Greene, Feb. 6, 1783: “It will not be believed that such a force as Great Britain has employed for eight years in this country could be baffled in their plan of subjugating it by numbers infinitely less, composed of men oftentimes half starved; always in rags, without pay, and experiencing, at times, every species of distress which human nature is capable of undergoing.”

On Nov. 2, 1783, from his Rocky Hill headquarters near Princeton, New Jersey, General George Washington issued his farewell orders: “Before the Commander in Chief takes his final leave of those he holds most dear, he wishes to indulge himself a few moments in calling to mind a slight review of the past. … The singular interpositions of Providence in our feeble condition were such, as could scarcely escape the attention of the most unobserving; while the unparalleled perseverance of the Armies of the United States, through almost every possible suffering and discouragement for the space of eight long years, was little short of a standing miracle. …”

Washington continued: “To the Armies he has so long had the honor to Command, he can only again offer in their behalf his recommendations to their grateful country, and his prayers to the God of Armies. May ample justice be done then here, and may the choicest of Heaven’s favours, both here and thereafter, attend those who, under Divine auspices, have secured innumerable blessings for others.”

In New York, Dec. 4, 1783, in Fraunces Tavern’s Long Room, General George Washington bade a tearful farewell to his Continental Army officers: “With a heart full of love and gratitude, I now take leave of you. I most devoutly wish that your latter days may be as prosperous and happy as your former ones have been glorious and honorable.”

On Dec. 23, 1783, Washington resigned his commission, addressing Congress assembled in Annapolis, Maryland: “I resign with satisfaction the appointment I accepted with diffidence; a diffidence in my abilities to accomplish so arduous a task; which however was superseded by a confidence in the rectitude of our cause, the support of the supreme power of the Union, and the patronage of Heaven. … Having now finished the work assigned to me, I retire from the great theatre of action; and bidding an affectionate farewell to this august body, under whose orders I have so long acted, I here offer my commission, and take my leave of all the employments of public life.”

At a time when kings killed to get power and kings killed to keep power, George Washington’s decision to give up power gained worldwide attention.

Earlier in 1783, the American-born painter Benjamin West was in England painting the portrait of King George III. When the King asked what General Washington planned to do now that he had won the war, West replied: “They say he will return to his farm.”

King George exclaimed: “If he does that, he will be the greatest man in the world.”

Filed under: american, conservative, ethics, government, history, people, war

So the sun affects climate on this planet. Who knew?

original article: To The Horror Of Global Warming Alarmists, Global Cooling Is Here
May 26, 2013 by Peter Ferrara

Around 1250 A.D., historical records show, ice packs began showing up farther south in the North Atlantic. Glaciers also began expanding on Greenland, soon to threaten Norse settlements on the island. From 1275 to 1300 A.D., glaciers began expanding more broadly, according to radiocarbon dating of plants killed by the glacier growth. The period known today as the Little Ice Age was just starting to poke through.

Summers began cooling in Northern Europe after 1300 A.D., shortening growing seasons, as reflected in the Great Famine of 1315 to 1317. Expanding glaciers and ice cover spreading across Greenland began driving the Norse settlers out. The last surviving written records of the Norse Greenland settlements, which had persisted for centuries, concern a marriage in 1408 A.D. in the church of Hvalsey, today the best preserved Norse ruin.

Colder winters began regularly freezing rivers and canals in Great Britain, the Netherlands and Northern France, with both the Thames in London and the Seine in Paris frozen solid annually. The first River Thames Frost Fair was held in 1607. In 1607-1608, early European settlers in North America reported ice persisting on Lake Superior until June. In January, 1658, a Swedish army marched across the ice to invade Copenhagen. By the end of the 17th century, famines had spread from northern France, across Norway and Sweden, to Finland and Estonia.

Reflecting its global scope, evidence of the Little Ice Age appears in the Southern Hemisphere as well. Sediment cores from Lake Malawi in southern Africa show colder weather from 1570 to 1820. A 3,000 year temperature reconstruction based on varying rates of stalagmite growth in a cave in South Africa also indicates a colder period from 1500 to 1800. A 1997 study comparing West Antarctic ice cores with the results of the Greenland Ice Sheet Project Two (GISP2) indicate a global Little Ice Age affecting the two ice sheets in tandem.

The Siple Dome, an ice dome roughly 100 km long and 100 km wide, about 100 km east of the Siple Coast of Antarctica, also reflects effects of the Little Ice Age synchronously with the GISP2 record, as do sediment cores from the Bransfield Basin of the Antarctic Peninsula. Oxygen isotope analysis from the Pacific Islands indicates a 1.5 degree Celsius temperature decline between 1270 and 1475 A.D.

The Franz Josef glacier on the west side of the Southern Alps of New Zealand advanced sharply during the period of the Little Ice Age, actually invading a rain forest at its maximum extent in the early 1700s. The Mueller glacier on the east side of New Zealand’s Southern Alps expanded to its maximum extent at roughly the same time.

Ice cores from the Andes mountains in South America show a colder period from 1600 to 1800. Tree ring data from Patagonia in South America show cold periods from 1270 to 1380 and from 1520 to 1670. Spanish explorers noted the expansion of the San Rafael Glacier in Chile from 1675 to 1766, which continued into the 19th century.

The height of the Little Ice Age is generally dated as 1650 to 1850 A.D. The American Revolutionary Army under General George Washington shivered at Valley Forge in the winter of 1777-78, and New York harbor was frozen in the winter of 1780. Historic snowstorms struck Lisbon, Portugal in 1665, 1744 and 1886. Glaciers in Glacier National Park in Montana advanced until the late 18th or early 19th centuries. The last River Thames Frost Fair was held in 1814. The Little Ice Age phased out during the middle to late 19th century.

The Little Ice Age, following the historically warm temperatures of the Medieval Warm Period, which lasted from about AD 950 to 1250, has been attributed to natural cycles in solar activity, particularly sunspots. A period of sharply lower sunspot activity known as the Wolf Minimum began in 1280 and persisted for 70 years until 1350. That was followed by a period of even lower sunspot activity that lasted 90 years from 1460 to 1550 known as the Spörer Minimum. During the period 1645 to 1715, the low point of the Little Ice Age, the number of sunspots declined to zero for the entire time. This is known as the Maunder Minimum, named after English astronomer E. Walter Maunder. That was followed by the Dalton Minimum from 1790 to 1830, another period of well below normal sunspot activity.

The increase in global temperatures since the late 19th century just reflects the end of the Little Ice Age. The global temperature trends since then have followed not rising CO2 trends but the ocean temperature cycles of the Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal Oscillation (AMO). Every 20 to 30 years, the much colder water near the bottom of the oceans cycles up to the top, where it has a slight cooling effect on global temperatures until the sun warms that water. That warmed water then contributes to slightly warmer global temperatures, until the next churning cycle.

Those ocean temperature cycles, and the continued recovery from the Little Ice Age, are primarily why global temperatures rose from 1915 until 1945, when CO2 emissions were much lower than in recent years. The change to a cold ocean temperature cycle, primarily the PDO, is the main reason that global temperatures declined from 1945 until the late 1970s, despite the soaring CO2 emissions during that time from the postwar industrialization spreading across the globe.

The 20 to 30 year ocean temperature cycles turned back to warm from the late 1970s until the late 1990s, which is the primary reason that global temperatures warmed during this period. But that warming ended 15 years ago, and global temperatures have not increased since then, if anything actually cooling, even though global CO2 emissions have soared over this period. As The Economist magazine reported in March, “The world added roughly 100 billion tonnes of carbon to the atmosphere between 2000 and 2010. That is about a quarter of all the CO2 put there by humanity since 1750.” Yet, still no warming during that time. That is because the CO2 greenhouse effect is weak and marginal compared to natural causes of global temperature changes.

At first the current stall out of global warming was due to the ocean cycles turning back to cold. But something much more ominous has developed over this period. Sunspots run in 11 year short term cycles, with longer cyclical trends of 90 and even 200 years. The number of sunspots declined substantially in the last 11 year cycle, after flattening out over the previous 20 years. But in the current cycle, sunspot activity has collapsed. NASA’s Science News report for January 8, 2013 states,

“Indeed, the sun could be on the threshold of a mini-Maunder event right now. Ongoing Solar Cycle 24 [the current short term 11 year cycle] is the weakest in more than 50 years. Moreover, there is (controversial) evidence of a long-term weakening trend in the magnetic field strength of sunspots. Matt Penn and William Livingston of the National Solar Observatory predict that by the time Solar Cycle 25 arrives, magnetic fields on the sun will be so weak that few if any sunspots will be formed. Independent lines of research involving helioseismology and surface polar fields tend to support their conclusion.”

That is even more significant because NASA’s climate science has been controlled for years by global warming hysteric James Hansen, who recently announced his retirement.

But this same concern is increasingly being echoed worldwide. The Voice of Russia reported on April 22, 2013,

“Global warming which has been the subject of so many discussions in recent years, may give way to global cooling. According to scientists from the Pulkovo Observatory in St.Petersburg, solar activity is waning, so the average yearly temperature will begin to decline as well. Scientists from Britain and the US chime in saying that forecasts for global cooling are far from groundless.”

That report quoted Yuri Nagovitsyn of the Pulkovo Observatory saying, “Evidently, solar activity is on the decrease. The 11-year cycle doesn’t bring about considerable climate change – only 1-2%. The impact of the 200-year cycle is greater – up to 50%. In this respect, we could be in for a cooling period that lasts 200-250 years.” In other words, another Little Ice Age.

The German Herald reported on March 31, 2013,

“German meteorologists say that the start of 2013 is now the coldest in 208 years – and now German media has quoted Russian scientist Dr Habibullo Abdussamatov from the St. Petersburg Pulkovo Astronomical Observatory [saying this] is proof as he said earlier that we are heading for a “Mini Ice Age.” Talking to German media the scientist who first made his prediction in 2005 said that after studying sunspots and their relationship with climate change on Earth, we are now on an ‘unavoidable advance towards a deep temperature drop.’”

Faith in Global Warming is collapsing in formerly staunch Europe following increasingly severe winters which now continue into spring. Christopher Booker explained in The Sunday Telegraph on April 27, 2013,

“Here in Britain, where we had our fifth freezing winter in a row, the Central England Temperature record – according to an expert analysis on the US science blog Watts Up With That – shows that in this century, average winter temperatures have dropped by 1.45C, more than twice as much as their rise between 1850 and 1999, and twice as much as the entire net rise in global temperatures recorded in the 20th century.”

A news report from India (The Hindu, April 22, 2013) stated, “March in Russia saw the harshest frosts in 50 years, with temperatures dropping to –25° Celsius in central parts of the country and –45° in the north. It was the coldest spring month in Moscow in half a century….Weathermen say spring is a full month behind schedule in Russia.” The news report summarized,

“Russia is famous for its biting frosts but this year, abnormally icy weather also hit much of Europe, the United States, China and India. Record snowfalls brought Kiev, capital of Ukraine, to a standstill for several days in late March, closed roads across many parts of Britain, buried thousands of sheep beneath six-metre deep snowdrifts in Northern Ireland, and left more than 1,000,000 homes without electricity in Poland. British authorities said March was the second coldest in its records dating back to 1910. China experienced the severest winter weather in 30 years and New Delhi in January recorded the lowest temperature in 44 years.”

Booker adds, “Last week it was reported that 3,318 places in the USA had recorded their lowest temperatures for this time of year since records began. Similar record cold was experienced by places in every province of Canada. So cold has the Russian winter been that Moscow had its deepest snowfall in 134 years of observations.”

Britain’s Met Office, an international cheerleading headquarters for global warming hysteria, did concede last December that there would be no further warming at least through 2017, which would make 20 years with no global warming. That reflects grudging recognition of the newly developing trends. But that reflects as well growing divergence between the reality of real world temperatures and the projections of the climate models at the foundation of the global warming alarmism of the UN’s Intergovernmental Panel on Climate Change (IPCC). Since those models have never been validated, they are not science at this point, but just made up fantasies. That is why, “In the 12 years to 2011, 11 out of 12 [global temperature] forecasts [of the Met Office] were too high — and… none were colder than [resulted],” as BBC climate correspondent Paul Hudson wrote in January.

Global warming was never going to be the problem that the Lysenkoists who have brought down western science made it out to be. Human emissions of CO2 are only 4 to 5% of total global emissions, counting natural causes. Much was made of the total atmospheric concentration of CO2 exceeding 400 parts per million. But if you asked the daffy NBC correspondent who hysterically reported on that what portion of the atmosphere 400 parts per million is, she transparently wouldn’t be able to tell you. One percent of the atmosphere would be 10,000 parts per million. The atmospheric concentrations of CO2 deep in the geologic past were much, much greater than today, yet life survived, and we have no record of any of the catastrophes the hysterics have claimed. Maybe that is because the temperature impact of increased concentrations of CO2 declines logarithmically. That means there is a natural limit to how much increased CO2 can effectively warm the planet, which would be well before any of the supposed climate catastrophes the warming hysterics have tried to use to shut down capitalist prosperity.
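The logarithmic relationship mentioned above can be made concrete. A minimal sketch, using the standard simplified expression for CO2 radiative forcing, ΔF = 5.35 × ln(C/C0) W/m² (an assumption supplied here for illustration; it does not appear in the article), shows why each successive increment of CO2 warms less than the last — every doubling of concentration adds the same fixed forcing:

```python
import math

def co2_forcing(c_new_ppm, c_ref_ppm):
    """Simplified radiative forcing (W/m^2) from a change in CO2
    concentration, using the logarithmic form dF = 5.35 * ln(C/C0)."""
    return 5.35 * math.log(c_new_ppm / c_ref_ppm)

# Going from 280 ppm to 560 ppm adds the same forcing (~3.7 W/m^2)
# as going from 560 ppm to 1120 ppm -- the effect per added ppm shrinks.
first_doubling = co2_forcing(560, 280)
second_doubling = co2_forcing(1120, 560)
print(round(first_doubling, 2), round(second_doubling, 2))

# The ppm arithmetic in the text: 1% of the atmosphere is 10,000 ppm,
# so 400 ppm is 0.04% of the atmosphere.
print(400 / 10_000)  # fraction of one percent
```

This is only an illustration of the diminishing-returns shape of the curve; it says nothing by itself about where any "natural limit" on warming would fall.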

Yet, just last week, there was Washington Post columnist Eugene Robinson telling us, by way of attempting to tutor Rep. Lamar Smith (R-TX), Chairman of the House Committee on Science, Space and Technology, “For the record, and for the umpteenth time, there is no ‘great amount of uncertainty’ about whether the planet is warming and why.” If you can read, and you have gotten this far in my column, you know why Robinson’s ignorance is just another Washington Post abuse of the First Amendment. Mr. Robinson, let me introduce you to the British Met Office, stalwart of Global Warming “science,” such as it is, which has already publicly confessed that we are already three quarters through 20 years of No Global Warming!

Booker could have been writing about Robinson when he concluded his Sunday Telegraph commentary by writing, “Has there ever in history been such an almighty disconnect between observable reality and the delusions of a political class that is quite impervious to any rational discussion?”

But there is a fundamental problem with the temperature records from this contentious period, when climate science crashed into political science. The land based records, which have been under the control of global warming alarmists at the British Met Office and the Hadley Centre Climate Research Unit, and at NASA’s Goddard Institute for Space Studies and the National Oceanic and Atmospheric Administration (NOAA) in the U.S., show much more warming during this period than the incorruptible satellite atmosphere temperature records. Those satellite records have been further confirmed by atmospheric weather balloons. But the land based records can be subject to tampering and falsification.

Filed under: climate change, environment, global warming, history, ice sheets, science
