My friend Allen McDuffee has been busting his ass on a new blog called Think Tanked that you should all check out. Today, he posted an interesting little nugget about AEI’s Charles Murray, a white supremacist, but the “socially acceptable” kind that gets to write New York Times op-eds.
In a post titled “Arthur Sulzberger Needs YOU!,” Charles Murray airs his displeasure over his payment from the New York Times on the AEI blog.
To all my fellow ink-stained wretches, a heads up. I got my check from the New York Times for an op ed that was published a few weeks ago. It was for $75. Not that anyone has ever paid the mortgage by writing op eds, but $75 for 800 words written for The Greatest Newspaper In the World is… how shall I put this? Weird. Do you suppose the red ink has really gotten that bad?
Yes. It’s true–not good at all. But what’s weird, actually, is posting something for the New York Times complaint department on the AEI blog.
Yeah, the writing world is a real harsh mistress, isn’t she, Charles?
This particular criticism isn’t only odd, as Allen pointed out, but also darkly hilarious. Here we have a white supremacist finally speaking up, not to defend his horrible beliefs, but to complain about his pay from the nation’s supposed shining example of journalistic integrity [insert hysterical laughter here]. At the same time, the media has been trying its damnedest to ignore the frequent and increasing instances of right-wing extremism in this country, a trend that I have reported on at length.
Here is Ross Douthat explaining why a billionaire, anti-choice zealots, and right-wing extremists hijacking U.S. politics is a victory for vaginas everywhere.
When historians set out to date the moment when the women’s movement of the 1970s officially consolidated its gains, they could do worse than settle on last Tuesday’s primaries.
I’ll give him points for a hilariously hyperbolic opening. Make your case, sailor.
It was a day when most of the major races featured female candidates, and all the major female candidates won. They won in South Dakota and Arkansas, California and Nevada. They won as business-friendly moderates (the Golden State’s Meg Whitman); as embattled incumbents (Arkansas’s Blanche Lincoln); as Tea Party insurgents (Sharron Angle in Nevada). South Carolina gubernatorial hopeful Nikki Haley even came in first despite multiple allegations of adultery.
But mostly, they won as Republicans. Conservative Republicans, in fact. Conservative Republicans endorsed by Sarah Palin, in many cases. Which generated a certain amount of angst in the liberal commentariat about What It All Meant For Feminism.
The question of whether conservative women get to be feminists is an interesting and important one. But it has obscured a deeper truth: Whether or not Palin or Fiorina or Haley can legitimately claim the label feminist, their rise is a testament to the overall triumph of the women’s movement.
Yesterday, I wrote about media pundits’ propensity to portray the extremely old and familiar as fresh and exciting. They do this to sell papers, drum up website hits, and appear insightful and necessary. Maybe a handful do it out of boredom, or stupidity, believing what they are seeing really is something revolutionary.
In reality, there is nothing more sexist than assuming any woman’s political victory — regardless of the type of woman — is a progressive step forward for the feminist movement. Women are people, and people are a diverse bunch. It still matters what kind of woman wins the election. And the kind of women that won these races are either preposterously wealthy, staunch anti-feminists, or a healthy combination of both.
What happened on election day is an old story: rich, mostly white, right-wingers won. Oh, and they also happen to be girls. Hooray.
Basically, it will take more than Douthat calling this a victory for feminism to make it so.
Meg Whitman, the billionaire former eBay chief executive, won the Republican nomination for governor after spending a record $71 million of her money on the race. Quite simply, Whitman bought her victory, and this has nothing to do with the bonds of sisterhood or feminine strength. This is corporatism in a skirt.
In fact, Whitman herself seems to hate the notion of feminism. At least, she certainly doesn’t want anyone calling her such an offensive term. When asked if she is a feminist, Whitman replied, “I am a big believer in equal rights for all people … in a level playing field.” But she said, “I’m not a big label person.”
This could be NOW’s new slogan: Taking action for women’s equality since 1966…or whatever…we’re not big label people.
I know when Elizabeth Cady Stanton and Susan B. Anthony were taking on the male-dominated establishment, what sustained them was the thought that one day Blanche Lincoln (D-Walmart) would squeak out a victory despite being a corporate whore.
Apparently, it doesn’t matter that Lincoln is a turncoat Blue Dog Democrat who voted with Republicans to allow warrantless government surveillance and the invasion of Iraq, and who shot down the public option. All that matters is the stuff between her legs, which sort of goes against the whole notion of “feminism,” but never mind. A girl won!
And then there’s Sharron Angle. I’ve written about her support of the right-wing extremist fringe, but Douthat skims over such silly details for the sake of preserving his narrative, i.e., Things Are Super Awesome For Women Right Now. He’s going to jam this premise down your throat even though women earn around 79% of men’s median weekly salaries, and Congress just passed a healthcare bill that dramatically diminishes a woman’s right to choose the fate of her own body.
Angle proposed a bill that “would have required doctors to inform women seeking abortions about a controversial theory linking an increased risk of breast cancer with abortion.” (The abortion-causes-breast-cancer theory is a myth, spread in part to discourage abortions.) But I hear lying to scared, pregnant women for the sake of controlling their bodies is all the rage right now in the neo-feminist movement.
Other than the novelty of having survived not one — but multiple — allegations of adultery, Nikki Haley is extremely typical of the right-wing fringe. She has a 100 percent rating from the anti-abortion S.C. Citizens for Life group, and she calls on her website for the deportation of illegal immigrants. Oh, and if any of her white supremacist base, who may confuse her for a “raghead,” were concerned, don’t worry. She converted to Christianity.
Modern Republicans have grown wise to the fact that they’re never going to defeat feminism. Try as they did to shame, humiliate, and dismiss feminists as a bunch of ugly, barren spinsters who refuse to shave their legs and can’t land a man, the propaganda campaign didn’t stick. Now, they’re left with only one option: hijack the movement.
In the same way President Obama’s victory was a sign that affirmative action is “no longer necessary,” so the victories of a handful of women (be they billionaires, right-wing extremists, turncoats, or militant anti-choicers) herald the dawn of a new feminism: one that is staunchly anti-woman, and represents only a class of wealthy, pro-business, right-wing extremists.
I feel sorry for Matt Bai. It was just three years ago that he sighed over the wasteland of the Clinton era and pondered aloud, what was it all for?
Even without the allusions to the old days, his speech seemed strangely reminiscent of that first campaign, and not necessarily in a good way. Listening to him talk, I found it hard not to wonder why so many of the challenges facing the next president were almost identical to those he vowed to address in 1992. Why, after Clinton’s two terms in office, were we still thinking about tomorrow? In some areas, most notably health care, Clinton tried gamely to leave behind lasting change, and he failed. In many more areas, though, the progress that was made under Clinton — almost 23 million new jobs, reductions in poverty, lower crime and higher wages — had been reversed or wiped away entirely in a remarkably short time. Clinton’s presidency seems now to have been oddly ephemeral, his record etched in chalk and left out in the rain.
Yeah, what’s up with that? Why does America seem to be forever spinning its wheels, and why has politics been reduced to a series of empty promises and arguments about abortion and gay marriage?
Apparently, Matt has been asking this question for three years because he has yet to find an answer.
Barbara Herbert, a course director at Tufts University School of Medicine, made a short but compelling plea in today’s New York Times. Herbert argued that the United States government should convene a truth and reconciliation commission, using the one in South Africa as a model, to investigate possible crimes committed by the Bush administration.
Such a commission would allow a nation to (a) find the truth of what happened from multiple perspectives, (b) develop an understanding of how it happened and (c) heal.
A commission isn’t some kind of partisan booby trap thrown together in a frenzied quest for retribution, as Harry Reid suggested last week. The formation of a nonpartisan commission also wouldn’t act as a nefarious tool to dismantle the foundation of The American Way (corrupting the sweet “mysteries” of life), as Bush apologists like Peggy Noonan claim.
A truth commission would use the law as a compass, and its only goal would be to restore order in America. As Herbert wrote, “We need a chance for secular redemption and healing.”
On Tuesday, Jeremy Scahill reported that Rep. John Conyers, chair of the House Judiciary Committee, and Rep. Jerrold Nadler wrote to Attorney General Eric Holder officially requesting the appointment of an independent Special Prosecutor “to investigate and, where appropriate, prosecute torture committed against detainees during the Bush administration.” In order to restore credibility to the Justice Department, Holder must adhere to the rule of law, not partisan demands. He must investigate possible crimes committed under the Bush administration.
The law is not a fringe issue. Progressives may be the ones demanding an investigative commission, but the issue at stake here is the law itself. That’s not a partisan issue. The law should be sacred to all Americans: Republicans and Democrats. And if Democrats are proven to have been complicit in torture, then they too must be punished according to the law.
Otherwise, Americans will learn only one lesson: the law does not apply to our leaders. What a terrible lesson to teach young Americans.
The mainstream media’s players are incapable of cognitive dissonance.
The editors of our major, failing newspapers seem perfectly comfortable printing foreign policy advice from men who would be arrested in other countries for war crimes.
I expected some kind of disclaimer before former undersecretary of defense Douglas Feith’s New York Times op-ed. Maybe Warning: This man has been accused by Spanish human rights lawyers of providing legal cover to Bush policies under which detainees were tortured. TAKE NOTHING HE SAYS SERIOUSLY.
Or Warning: Douglas Feith created the Counter Terrorism Evaluation Group shortly after 9/11. The group was under investigation by the Senate Select Committee on Intelligence for whether it exaggerated the threat posed by Iraq to justify the war.
Or Warning: Taking advice from men like Douglas Feith got us into two wars, which — in case you haven’t been watching television — aren’t going very well, so maybe you shouldn’t take what he has to say very seriously.
Alas, I reached the end of the article to find the following benign interpretation of Feith’s career: Douglas J. Feith, a former under secretary of defense, is a senior fellow and Justin Polin is a research associate at the Hudson Institute.
This is like describing Augusto Pinochet as a stern fellow with an unpopular vision of Chile’s future.
The media continues to perpetuate the cycle of bad advice by treating men like Douglas Feith as “serious” foreign policy “experts.” We could replicate (or possibly improve upon) Feith’s world class strategy advice by dressing a chimp in a suit and having him hurl his own feces at a world map. Wherever the shit lands, that’s where we send our troops. And we only have to pay undersecretary Chimp in bananas.
Our national conversation could benefit greatly from banning Douglas Feithian contributors. Feith has nothing new to offer the debate, anyway. In the Times, he recycles the old arguments that we must invade Pakistan for, like, the good of the people! Remember, this was partly the excuse Neo-Conservatives concocted for why we had to invade and occupy Iraq. While it is true Iraqis were suffering greatly, firebombing their villages was hardly a solution to the problem.
But then, helping the indigenous people is never the real reason we send our army overseas. And men like Douglas Feith know this. Though he writes about spreading the message of moderate religion via radio in Pakistan, his true interests have nothing to do with his love of Pakistani culture. He (and his cronies) are only interested in political and military leverage.
The Times is the only player still harboring the debunked notion that the Neo-Conservatives have something of value to offer the planet.
In 1947, President Truman signed the National Security Act, which formed the National Military Establishment, a department with the unfortunate acronym “NME” (pronounced “enemy”). Wise men realized a name change was in order, so they rebranded NME as the “Department of Defense.” In its new role, the DoD would oversee the duties formerly handled by the Department of War and the Department of the Navy.
Department of War and “enemy” are more suitable nomenclatures for our modern wartime Chimera, the Department of Defense.
As Thom Shanker details with the cool, detached demeanor of a serial killer, the “protracted wars in Iraq and Afghanistan are forcing the Obama administration to rethink what for more than two decades has been a central premise of American strategy: that the nation need only prepare to fight two major wars at a time.”
Of course, “only two wars at a time, boys” isn’t written anywhere in our Constitution. That may be because our forefathers were sort of wary about that whole imperial conquest thing. They’d just escaped being ruled over by a tyrannical king and were in no rush to impose their own authoritarian regime upon anyone else, though that didn’t stop them from wiping out the Native Americans and pesky Mexicans.
A senior Defense Department official involved in a strategy review now under way said the Pentagon was absorbing the lesson that the kinds of counterinsurgency campaigns likely to be part of some future wars would require more staying power than in past conflicts, like the first Iraq war in 1991 or the invasions of Grenada and Panama.
I know what you’re thinking: Surely, the only lesson to be taken out of the Iraq and Afghanistan quagmires is to NOT invade countries that pose no threat to the United States. Well, that’s why you’re not in charge of leading young men and women to their deaths. The problem isn’t ideological. It’s strategical.
Among the refinements to the two-wars strategy the Pentagon has incorporated in recent years is one known as “win-hold-win” — an assumption that if two wars broke out simultaneously, the more threatening conflict would get the bulk of American forces while the military would have to defend along a second front until reinforcements could arrive to finish the job.
Another formulation envisioned the United States defending its territory, deterring hostility in four critical areas of the world and then defeating two adversaries in major combat operations, but not at exactly the same time.
For any of you weak, pathetic peace-lovers out there who thought maybe (just maybe) the conflicts in Iraq and Afghanistan (and sometimes Pakistan) were winding down, stick this Pentagon memo in your pipe and smoke it. This is the long vision, people. This is perpetual war.
An inconvenient truth is that Americans get worked up at the thought of an extended, massive ground invasion of foreign lands. That’s why the future of war is small, scattered, air-oriented, and covert. Whether it’s Dick Cheney’s implementation of a secret assassination ring, or Pakistan-stationed US drones killing civilians, war no longer has to receive the blessing of Congress, or – pause for laughter – the American people.
War is treated as an inevitability, so a public debate about whether war should happen at all is never an option. It’s not a matter of should we be planning for multiple, simultaneous, small invasions, but a debate over technicalities and strategies for when it happens. And the media usually walks hand-in-hand with the Pentagon, somehow managing to keep a straight face on the matter when generals and bureaucrats start spouting rhetoric about preserving freedom and democracy via cluster bombs.
The war debate (if it can be called a debate) is completely off-kilter. Even in the “liberal” New York Times, the article isn’t balanced with a pro-war participant and a serious anti-war participant. Yet again, we get a photocopied Pentagon memo crammed within a major newspaper’s margins, without analysis or journalistic insight into the consequences of perpetual war. Including an anti-war voice isn’t partisan. It’s actually doing real journalistic work, which is representing all sides of a story, and not just the loudest opinions resonating from the state.
The closest the Times comes to representing an anti-war voice is in the confusing interjection from Michael E. O’Hanlon, a senior fellow at the Brookings Institution, a think tank that the Times tells me is center-left, though I wouldn’t have guessed that from O’Hanlon’s comment:
“We have Gates and others saying that other parts of the government are underresourced and that the DoD should not be called on to do everything. That’s a good starting point for this — to ask and at least begin answering where it might be better to have other parts of the government get stronger and do a bigger share, rather than the Department of Defense.”
This sounds like O’Hanlon wants to outsource killing to other departments. Maybe we can arm teachers and parachute them into Pakistan.
Yet again, the debate over our larger war policies goes unexamined by the mainstream media. The media remains complicit in the imperial conquests of our government, and then acts dumbfounded when popular support for their institution wanes, and they find themselves antiquated and bankrupt.
SOMEDAY we’ll learn the whole story of why George W. Bush brushed off that intelligence briefing of Aug. 6, 2001, “Bin Laden Determined to Strike in U.S.” But surely a big distraction was the major speech he was readying for delivery on Aug. 9, his first prime-time address to the nation. The subject — which Bush hyped as “one of the most profound of our time” — was stem cells. For a presidency in thrall to a thriving religious right (and a presidency incapable of multi-tasking), nothing, not even terrorism, could be more urgent.
When Barack Obama ended the Bush stem-cell policy last week, there were no such overheated theatrics. No oversold prime-time address. No hysteria from politicians, the news media or the public. The family-values dinosaurs that once stalked the earth — Falwell, Robertson, Dobson and Reed — are now either dead, retired or disgraced. Their less-famous successors pumped out their pro forma e-mail blasts, but to little avail. The Republican National Committee said nothing whatsoever about Obama’s reversal of Bush stem-cell policy. That’s quite a contrast to 2006, when the party’s wild and crazy (and perhaps transitory) new chairman, Michael Steele, likened embryonic stem-cell research to Nazi medical experiments during his failed Senate campaign.
What has happened between 2001 and 2009 to so radically change the cultural climate? Here, at last, is one piece of good news in our global economic meltdown: Americans have less and less patience for the intrusive and divisive moral scolds who thrived in the bubbles of the Clinton and Bush years. Culture wars are a luxury the country — the G.O.P. included — can no longer afford.
Not only was Obama’s stem-cell decree an anticlimactic blip in the news, but so was his earlier reversal of Bush restrictions on the use of federal money by organizations offering abortions overseas. When the administration tardily ends “don’t ask, don’t tell,” you can bet that this action, too, will be greeted by more yawns than howls.
Once again, both the president and the country are following New Deal-era precedent. In the 1920s boom, the reigning moral crusade was Prohibition, and it packed so much political muscle that F.D.R. didn’t oppose it. The Anti-Saloon League was the Moral Majority of its day, the vanguard of a powerful fundamentalist movement that pushed anti-evolution legislation as vehemently as it did its war on booze. (The Scopes “monkey trial” was in 1925.) But the political standing of this crowd crashed along with the stock market. Roosevelt shrewdly came down on the side of “the wets” in his presidential campaign, leaving Hoover to drown with “the dries.”
Much as Obama repealed the Bush restrictions on abortion and stem-cell research shortly after pushing through his stimulus package, so F.D.R. jump-started the repeal of Prohibition by asking Congress to legalize beer and wine just days after his March 1933 inauguration and declaration of a bank holiday. As Michael A. Lerner writes in his fascinating 2007 book “Dry Manhattan,” Roosevelt’s stance reassured many Americans that they would have a president “who not only cared about their economic well-being” but who also understood their desire to be liberated from “the intrusion of the state into their private lives.” Having lost plenty in the Depression, the public did not want to surrender any more freedoms to the noisy minority that had shut down the nation’s saloons.
In our own hard times, the former moral “majority” has been downsized to more of a minority than ever. Polling shows that nearly 60 percent of Americans agree with ending Bush restrictions on stem-cell research (a Washington Post/ABC News survey in January); that 55 percent endorse either gay civil unions or same-sex marriage (Newsweek, December 2008); and that 75 percent believe openly gay Americans should serve in the military (Post/ABC, July 2008). Even the old indecency wars have subsided. When a federal court last year struck down the F.C.C. fine against CBS for Janet Jackson’s “wardrobe malfunction” at the 2004 Super Bowl, few Americans either noticed or cared about the latest twist in what had once been a national cause célèbre.
It’s not hard to see why Eric Cantor, the conservative House firebrand who is vehemently opposed to stem-cell research, was disinclined to linger on the subject when asked about it on CNN last Sunday. He instead accused the White House of acting on stem cells as a ploy to distract from the economy. “Let’s take care of business first,” he said. “People are out of jobs.” (On this, he’s joining us late, but better late than never.)
Even were the public still in the mood for fiery invective about family values, the G.O.P. has long since lost any authority to lead the charge. The current Democratic president and his family are exemplars of precisely the Eisenhower-era squareness — albeit refurbished by feminism — that the Republicans often preached but rarely practiced. Obama actually walks the walk. As the former Bush speechwriter David Frum recently wrote, the new president is an “apparently devoted husband and father” whose worst vice is “an occasional cigarette.”
Frum was contrasting Obama to his own party’s star attraction, Rush Limbaugh, whose “history of drug dependency” and “tangled marital history” make him “a walking stereotype of self-indulgence.” Indeed, the two top candidates for leader of the post-Bush G.O.P, Rush and Newt, have six marriages between them. The party that once declared war on unmarried welfare moms, homosexual “recruiters” and Bill Clinton’s private life has been rebranded by Mark Foley, Larry Craig, David Vitter and the irrepressible Palins. Even before the economy tanked, Americans had more faith in medical researchers using discarded embryos to battle Parkinson’s and Alzheimer’s than in Washington politicians making ad hoc medical decisions for Terri Schiavo.
What’s been revealing about watching conservatives debate their fate since their Election Day Waterloo is how, the occasional Frum excepted, so many of them don’t want to confront the obsolescence of culture wars as a political crutch. They’d rather, like Cantor, just change the subject — much as they avoid talking about Bush and avoid reckoning with the doomed demographics of the G.O.P.’s old white male base. To recognize all these failings would be to confront why a once-national party can now be tucked into the Bible Belt.
The religious right is even more in denial than the Republicans. When Obama nominated Kathleen Sebelius, the Roman Catholic Kansas governor who supports abortion rights, as his secretary of health and human services, Tony Perkins, the leader of the Family Research Council, became nearly as apoplectic as the other Tony Perkins playing Norman Bates. “If Republicans won’t take a stand now, when will they?” the godly Perkins thundered online. But Congressional Republicans ignored him, sending out (at most) tepid press releases of complaint, much as they did in response to Obama’s stem-cell order. The two antiabortion Kansas Republicans in the Senate, Sam Brownback and Pat Roberts, both endorsed Sebelius.
Perkins is now praying that economic failure will be a stimulus for his family-values business. “As the economy goes downward,” he has theorized, “I think people are going to be driven to religion.” Wrong again. The latest American Religious Identification Survey, published last week, found that most faiths have lost ground since 1990 and that the fastest-growing religious choice is “None,” up from 8 percent to 15 percent (which makes it larger than all denominations except Roman Catholics and Baptists). Another highly regarded poll, the General Social Survey, had an even more startling finding in its preliminary 2008 data released this month: Twice as many Americans have a “great deal” of confidence in the scientific community as do in organized religion. How the almighty has fallen: organized religion is in a dead heat with banks and financial institutions on the confidence scale.
This, too, is a replay of the Great Depression. “One might have expected that in such a crisis great numbers of these people would have turned to the consolations of and inspirations of religion,” wrote Frederick Lewis Allen in “Since Yesterday,” his history of the 1930s published in 1940. But that did not happen: “The long slow retreat of the churches into less and less significance in the life of the country, and even in the lives of the majority of their members, continued almost unabated.”
The new American faith, Allen wrote, was the “secular religion of social consciousness.” It took the form of campaigns for economic and social justice — as exemplified by the New Deal and those movements that challenged it from both the left and the right. It’s too early in our crisis and too early in the new administration to know whether this decade will so closely replicate the 1930s, but so far Obama has far more moral authority than any religious leader in America with the possible exception of his sometime ally, the Rev. Rick Warren.
History is cyclical, and it would be foolhardy to assume that the culture wars will never return. But after the humiliations of the Scopes trial and the repeal of Prohibition, it did take a good four decades for the religious right to begin its comeback in the 1970s. In our tough times, when any happy news can be counted as a miracle, a 40-year exodus for these ayatollahs can pass for an answer to America’s prayers.
The late Tom Anderson, the family doctor in this little farm town in northwestern Indiana, at first was puzzled, then frightened.
He began seeing strange rashes on his patients, starting more than a year ago. They began as innocuous bumps — “pimples from hell,” he called them — and quickly became lesions as big as saucers, fiery red and agonizing to touch.
They could be anywhere, but were most common on the face, armpits, knees and buttocks. Dr. Anderson took cultures and sent them off to a lab, which reported that they were MRSA, or staph infections that are resistant to antibiotics.
MRSA (methicillin-resistant Staphylococcus aureus) sometimes arouses terrifying headlines as a “superbug” or “flesh-eating bacteria.” The best-known strain is found in hospitals, where it has been seen regularly since the 1990s, but more recently different strains also have been passed among high school and college athletes. The federal Centers for Disease Control and Prevention reported that by 2005, MRSA was killing more than 18,000 Americans a year, more than AIDS.
Dr. Anderson at first couldn’t figure out why he was seeing patient after patient with MRSA in a small Indiana town. And then he began to wonder about all the hog farms outside of town. Could the pigs be incubating and spreading the disease?
“Tom was very concerned with what he was seeing,” recalls his widow, Cindi Anderson. “Tom said he felt the MRSA was at phenomenal levels.”
By last fall, Dr. Anderson was ready to be a whistle-blower, and he agreed to welcome me on a reporting visit and go on the record with his suspicions. That was a bold move, for any insinuation that the hog industry harms public health was sure to outrage many neighbors.
So I made plans to come here and visit Dr. Anderson in his practice. And then, very abruptly, Dr. Anderson died at the age of 54.
There was no autopsy, but a blood test suggested a heart attack or aneurysm. Dr. Anderson had himself suffered at least three bouts of MRSA, and a Dutch journal has linked swine-carried MRSA to dangerous human heart inflammation.
The larger question is whether we as a nation have moved to a model of agriculture that produces cheap bacon but risks the health of all of us. And the evidence, while far from conclusive, is growing that the answer is yes.
A few caveats: The uncertainties are huge, partly because our surveillance system is wretched (the cases here in Camden were never reported to the health authorities). The vast majority of pork is safe, and there is no proven case of transmission of MRSA from eating pork. I’ll still offer my kids B.L.T.’s — but I’ll scrub my hands carefully after handling raw pork.
Let me also be very clear that I’m not against hog farmers. I grew up on a farm outside Yamhill, Ore., and was a state officer of the Future Farmers of America; we raised pigs for a time, including a sow named Brunhilda with such a strong personality that I remember her better than some of my high school dates.
One of the first clues that pigs could infect people with MRSA came in the Netherlands in 2004, when a young woman tested positive for a new strain of MRSA, called ST398. The family lived on a farm, so public health authorities swept in — and found that three family members, three co-workers and 8 of 10 pigs tested all carried MRSA.
Since then, that strain of MRSA has spread rapidly through the Netherlands — especially in swine-producing areas. A small Dutch study found pig farmers there were 760 times more likely than the general population to carry MRSA (without necessarily showing symptoms), and Scientific American reports that this strain of MRSA has turned up in 12 percent of Dutch retail pork samples.
Now this same strain of MRSA has also been found in the United States. A new study by Tara Smith, a University of Iowa epidemiologist, found that 45 percent of pig farmers she sampled carried MRSA, as did 49 percent of the hogs tested.
The study was small, and much more investigation is necessary. Yet it might shed light on the surge in rashes in the now vacant doctor’s office here in Camden. Linda Barnard, who was Dr. Anderson’s assistant, thinks that perhaps 50 people came in to be treated for MRSA, in a town with a population of a bit more than 500. Indeed, during my visit, Dr. Anderson’s 13-year-old daughter, Lily, showed me an MRSA rash inflaming her knee.
“I’ve had it many times,” she said.
So what’s going on here, and where do these antibiotic-resistant infections come from? Probably from the routine use — make that the insane overuse — of antibiotics in livestock feed. This is a system that may help breed virulent “superbugs” that pose a public health threat to us all. That’ll be the focus of my next column, on Sunday.
Working families were in deep trouble long before this megarecession hit. But too many of the public officials who should have been looking out for the middle class and the poor were part of the reckless and shockingly shortsighted alliance of conservatives and corporate leaders that rigged the economy in favor of the rich and ultimately brought it down completely.
As Jared Bernstein, now the chief economic adviser to Vice President Joseph Biden, wrote in the preface to his book, “Crunch: Why Do I Feel So Squeezed? (And Other Unsolved Economic Mysteries)”:
“Economics has been hijacked by the rich and powerful, and it has been forged into a tool that is being used against the rest of us.”
Working people were not just abandoned by big business and their ideological henchmen in government, they were exploited and humiliated. They were denied the productivity gains that should have rightfully accrued to them. They were treated ruthlessly whenever they tried to organize. They were never reasonably protected against the savage dislocations caused by revolutions in technology and global trade.
Working people were told that all of this was good for them, and whether out of ignorance or fear or prejudice or, as my grandfather might have said, damned foolishness, many bought into it. They signed onto tax policies that worked like a three-card monte game. And they were sold a snake oil concoction called “trickle down” that so addled their brains that they thought it was a wonderful idea to hand over their share of the nation’s wealth to those who were already fabulously rich.
America used to be better than this.
The seeds of today’s disaster were sown some 30 years ago. Looking at income patterns during that period, my former colleague at The Times, David Cay Johnston, noted that from 1980 (the year Ronald Reagan was elected) to 2005, the national economy, adjusted for inflation, more than doubled. (Because of population growth, the actual increase per capita was about 66 percent.)
But the average income for the vast majority of Americans actually declined during those years. The standard of living for the average family improved not because incomes grew but because women entered the workplace in droves.
As hard as it may be to believe, the peak income year for the bottom 90 percent of Americans was way back in 1973, when the average income per taxpayer, adjusted for inflation, was $33,000. That was nearly $4,000 higher, Mr. Johnston pointed out, than in 2005.
Men have done particularly poorly. Men who are now in their 30s — the prime age for raising families — earn less money than members of their fathers’ generation did at the same age.
It may seem like ancient history, but in the first few decades following World War II, the United States, despite many serious flaws, established the model of a highly productive society that shared its prosperity widely and made investments that were geared toward a more prosperous, more fulfilling future.
The American dream was alive and well and seemingly unassailable. But somehow, following the oil shocks, the hyperinflation and other traumas of the 1970s, Americans allowed the right-wingers to get a toehold — and they began the serious work of smothering the dream.
Ronald Reagan saw Medicare as a giant step on the road to socialism. Newt Gingrich, apparently referring to the original fee-for-service version of Medicare, which was cherished by the elderly, cracked, “We don’t get rid of it in Round One because we don’t think it’s politically smart.”
The right-wingers were crafty: You smother the dream by crippling the programs that support it, by starving the government of money to pay for them, by funneling the government’s revenues to the rich through tax cuts and other benefits, by looting the government the way gangsters loot legitimate businesses and then pleading poverty when it comes time to fund the services required by the people.
The anti-tax fanatic Grover Norquist summed the matter up nicely when he famously said, “Our goal is to shrink the government to the size where you can drown it in a bathtub.” Only they didn’t shrink the government, they enlarged it and turned its bounty over to the rich.
Now, with the economy in free fall and likely to get worse, Americans — despite their suffering — have an opportunity to reshape the society, and then to move it in a fairer, smarter and ultimately more productive direction. That is the only way to revive the dream, but it will take a long time and require great courage and sacrifice.
The right-wingers do not want that to happen, which is why they are rooting so hard for President Obama’s initiatives to fail. They like the direction that the country took over the past 30 years. They’d love to do it all again.