The Washington Post reports: The United States has a higher tolerance for torture than any other country on the U.N. Security Council, and Americans are more comfortable with torture than citizens of war-ravaged countries such as Afghanistan, Iraq and Ukraine.
Those are two key findings reported by the International Committee of the Red Cross on Monday, in a new report highlighting global perspectives on war.
The American willingness to use torture was part of a worrying trend identified by the ICRC — a growing belief globally that enemy combatants can be tortured for information. When researchers asked that question in 1999, just 28 percent of respondents said enemy combatants could be tortured. This year, 36 percent said it was justified.
That finding has raised concern, ICRC researchers said, about the role of international law in the world’s numerous armed conflicts. The report said the rules of armed conflict, like the Geneva Conventions, “are being questioned perhaps more than at any time in recent history.”
But there’s also a shocking lack of awareness that those rules exist — 39 percent of the Americans who supported torture told the ICRC they “didn’t realize my country had agreed to ban torture” as a signatory to the Geneva Conventions. [Continue reading…]
Toni Morrison writes: This is a serious project. All immigrants to the United States know (and knew) that if they want to become real, authentic Americans they must reduce their fealty to their native country and regard it as secondary, subordinate, in order to emphasize their whiteness. Unlike any nation in Europe, the United States holds whiteness as the unifying force. Here, for many people, the definition of “Americanness” is color.
Under slave laws, the necessity for color rankings was obvious, but in America today, post-civil-rights legislation, white people’s conviction of their natural superiority is being lost. Rapidly lost. There are “people of color” everywhere, threatening to erase this long-understood definition of America. And what then? Another black President? A predominantly black Senate? Three black Supreme Court Justices? The threat is frightening.
In order to limit the possibility of this untenable change, and restore whiteness to its former status as a marker of national identity, a number of white Americans are sacrificing themselves. They have begun to do things they clearly don’t really want to be doing, and, to do so, they are (1) abandoning their sense of human dignity and (2) risking the appearance of cowardice. Much as they may hate their behavior, and know full well how craven it is, they are willing to kill small children attending Sunday school and slaughter churchgoers who invite a white boy to pray. Embarrassing as the obvious display of cowardice must be, they are willing to set fire to churches, and to start firing in them while the members are at prayer. And, shameful as such demonstrations of weakness are, they are willing to shoot black children in the street.
To keep alive the perception of white superiority, these white Americans tuck their heads under cone-shaped hats and American flags and deny themselves the dignity of face-to-face confrontation, training their guns on the unarmed, the innocent, the scared, on subjects who are running away, exposing their unthreatening backs to bullets. Surely, shooting a fleeing man in the back hurts the presumption of white strength? The sad plight of grown white men, crouching beneath their (better) selves, to slaughter the innocent during traffic stops, to push black women’s faces into the dirt, to handcuff black children. Only the frightened would do that. Right?
These sacrifices, made by supposedly tough white men, who are prepared to abandon their humanity out of fear of black men and women, suggest the true horror of lost status.
It may be hard to feel pity for the men who are making these bizarre sacrifices in the name of white power and supremacy. Personal debasement is not easy for white people (especially for white men), but to retain the conviction of their superiority to others—especially to black people—they are willing to risk contempt, and to be reviled by the mature, the sophisticated, and the strong. If it weren’t so ignorant and pitiful, one could mourn this collapse of dignity in service to an evil cause. [Continue reading…]
My education came under the tutelage of my father, a man who taught me his love for driving through the South. There’s a beauty in the neat tobacco rows on Highway 64 and the tall, quiet sentinel trees on 87. With mouths full of sunflower seeds, my daddy would quiz me on each plant, animal, and landmark we passed, and I picked up both his habits of driving and cataloguing the things that made us Southern, black, and whole.
But things ain’t always beautiful, and I learned those too. One hot summer afternoon, taking the 74 east from Charlotte, North Carolina, to Elizabethtown in my daddy’s black Toyota truck, a man ran us off the road. We skidded on the dirt shoulder as the man sped on past, his Confederate battle flag license plate a final insult to our situation. The bile rose in my throat, and the hot anger and shame at the symbol made my skin prickle. Here was a man who could just be a jerk having a bad day, but whose choice of a single symbol suddenly made that bad day personal. My dad just cussed a little bit, put another handful of sunflower seeds in his mouth, and continued on our way down that road.
At a gas station just outside of Rockingham, serendipity found us. As we pulled up to the pump, just there in front of our car was Mr. Confederate Plate, leaning like all villains do against the side of his car. I’m not sure who recognized whom first, but I remember the shouting match, and Mr. Confederate Flag calling my father the one name he would never answer to, looking at me and saying the same, and then pantomiming that he had a gun in the car. I remember looking around at similar flags on another truck and inside the gas station, and knowing instinctively that we were not in friendly territory. I also remember my father shaking with rage and that same hot shame as my own when he climbed back in the truck.
After another cussing fit, Vann Newkirk, Sr. looked at me and said the thing that’s always stuck with me since. “This is who we are,” he told me. “Don’t forget.” And we went back down the road.
This is who we are. Those words often come to me when I see the ugly things in life now. When the first details about Tamir Rice’s death at the hands of police officers came to me on Twitter, they were a scream in the dark. When people questioned with straight faces if our president was even born in America, they echoed about my ears. When the Department of Justice report revealed that Ferguson, Missouri was a racial kleptocracy, they were a whisper in the wind.
When a man who was accused of multiple sexual assaults, was endorsed by the Ku Klux Klan, characterized Mexican immigrants as “rapists,” and promoted stop and frisk as a national campaign of “law and order” was elected president, they boomed like thunder. [Continue reading…]
The 2016 American presidential campaign has renewed concerns about the specter of violence in American electoral politics. The campaign has been marked by tense – and occasionally violent – altercations between supporters and critics of Republican nominee Donald Trump.
Trump encouraged his supporters to “knock the crap” out of protesters, and even suggested he would pay the legal fees of followers who assaulted his critics.
By refusing to commit to accepting the results of the election, he has confirmed his supporters’ doubts about the integrity of American elections, thereby increasing the risk of violent resistance by hard-core Trumpists.
It would be comforting to conclude that the menace of violence surrounding the 2016 presidential election is unique. But my research on the history of voting rights in the United States suggests that this is far from the case. Indeed, the threat and execution of violence around elections has a long, sad history in American politics.
Somewhat like the 2016 election – which has revolved around issues of race and immigration – throughout American history efforts by disadvantaged (and often nonwhite) citizens to secure greater political influence have been met with violent repression by those already enjoying power (usually more affluent whites).
Alyssa Rosenberg writes: Movies, television and novels have trained audiences to excuse almost any police shooting, including the deaths of children — until now, when the emergence and near-ubiquity of real-life videos have made the gap between fiction and reality undeniable.
Whether a shooting is legal is determined in part by an officer’s fear. But when the Los Angeles Police Department cleared scripts for television series such as “Dragnet” or “Adam-12,” “any shooting that was done on the shows was squeaky clean,” explained former detective sergeant Joseph Wambaugh, who worked briefly in the LAPD’s public information office, where the scripts were reviewed. “Any officer would have to be in total control.”
If this standard had nothing to do with how officers actually reacted after shooting someone, it was intended to bolster the audience’s confidence in police officers.
In fact, officers on early cop shows such as “Dragnet” and “Naked City” were often presented as so decent that they questioned their own decisions to shoot and had to be convinced that they’d done the right thing. Often, the person doing the convincing was a parent or relative of the dead person.
The first time Joe Friday (Jack Webb), the archetypal stoic police officer, killed a person in the “Dragnet” episode “The Big Thief,” he was so distressed that his partner had to help him fill out his incident report. “I kind of wonder if there was another way,” Friday declared glumly, unconvinced that he was right to shoot even though the other man had a gun. Friday was ultimately reassured by the law itself, when the shooting was ruled a justifiable homicide.
Friday’s question hangs in the air, but it both casts and dispels doubt in a single sentence. If someone who cares as much as Joe Friday does couldn’t find a better solution when confronted with a dangerous criminal, then maybe one doesn’t exist. Friday’s concerns are themselves the proof that he would never do the wrong thing. [Continue reading…]
Although there is no universally accepted definition of “youth,” the report’s authors primarily use the most commonly applied age bracket of 15-29, in line with other international organizations.
The YDI is a composite index of 18 indicators that collectively measure progress on youth development in 183 countries, including 49 of the 53 Commonwealth countries. It has five domains, measuring levels of education, health and wellbeing, employment and opportunity, political participation and civic participation among young people.
In its rankings within these five domains, the number for American youth that jumps out is health and well-being: the U.S. comes in at 106 — below, for instance, Iraq (103) and Bangladesh (102).
There’s no mystery as to why the U.S. ranks so poorly in this regard. The primary reason: obesity. And the primary causes of obesity are diets loaded with empty calories combined with sedentary lifestyles.
The American way of life has become a system of factory farming in which a large proportion of citizens get fattened up and fed into a life-long disease management system. The primary beneficiaries of this system are the pharmaceutical industry, the manufacturers of sodas and junk food, and the entertainment industry.
Suppose a terrorist plot was uncovered revealing a plan to poison most Americans. This discovery probably wouldn’t generate a huge amount of alarm for the simple reason that however evil its ambitions might be, no terrorist organization could actually carry out a plot on this scale.
On the other hand, even though there has never been a corporate conspiracy designed to accomplish this goal, a largely unquestioned obedience to the principle of profit has brought America to this juncture. This is a chronic condition of commercial exploitation and social decay that has been decades in the making.
The sedentary lifestyle of the US population was already a concern in the 1950s, when President Eisenhower created the Council on Fitness and Health to promote physical activity in the population. While secular data to assess trends are limited, in 2000 the Centers for Disease Control and Prevention estimated that less than 30 percent of the US population has an adequate level of physical activity, another 30 percent is active but not sufficiently, and the remainder is sedentary. A longitudinal study of girls aged 9–18 years documented the dramatic decline in physical activity during adolescence, particularly among Black girls. A number of factors may result in limited physical activity at schools, such as budget constraints and pressure to meet academic performance targets. Out of school, physical activity is also frequently limited. The Centers for Disease Control and Prevention reported a dramatic decline in the proportion of children who walk or bike to school, from close to 42 percent in 1969 to 16 percent in 2001. At home, the average US teenager spends over 30 hours per week watching television. This activity is not only sedentary but also associated with reduced consumption of fresh fruits and vegetables, possibly related to consumption of snack foods while watching television and to the influence of food commercials, most of which advertise low-nutrient-density foods.
In the 1950s, the sugar industry sought to halve the amount of fat in the American diet and replace it with sugar, a shift that would result in a 30% increase in sugar consumption and “a tremendous improvement in general health,” according to the president of the Sugar Research Foundation, Harry Hass. The industry turned out to be tremendously successful in boosting sugar consumption, but instead of improving health it has poisoned America, setting multiple generations on a path towards chronic disease and premature death.
The 2014 documentary Fed Up can be rented or bought online, or viewed on Netflix.
It’s rare to hear an author say, “Researching and writing this book has made me want to scream.” But perhaps it’s not surprising, given the topic of Gary Younge’s Another Day in the Death of America: A Chronicle of Ten Short Lives — the daily, weekly, monthly, yearly death-by-gun of startling numbers of kids in this country — and the time he spent tracking down the stories of the young Americans who died on a single day in November 2013 in separate incidents nationwide.
After all, these days, the U.S. is a haven and a heaven for guns. It’s hard to find another nation on the planet — except in places like Syria or Afghanistan where whole populations have been thrown into desperate internecine conflicts — in which guns are so readily available. Between 1968 and 2015, the number of guns in the U.S. essentially doubled to 300 million. Between 2010 and 2013 alone, American arms manufacturers doubled their production of weapons to almost 11 million a year. And those guns have gotten more deadly as well. Military-style assault rifles and semi-automatic handguns are now the weapons of choice for mass killers and “lone wolf” terrorists in this country. In almost all cases those killers got their guns and ammo (often high-capacity magazines capable of holding 15 to 100 rounds) in perfectly legal fashion. And it’s getting easier to carry concealed weapons all the time. Missouri, for instance, recently passed a law that allows the carrying of such a weapon without either a permit or training of any sort.
Under the circumstances, no one should be surprised that kids die in remarkable numbers from guns for all kinds of reasons. Believe me, though, that makes it no less shocking when you read Younge’s unsettling and moving book. Long a journalist, columnist, and editor for the British Guardian, stationed here in the U.S., today he offers us a look at the death toll from guns among our young and the way we Americans generally like to explain that toll to ourselves (or rather how we like to explain it away). Tom Engelhardt
An all-American slaughter
The youthful carnage of America’s gun culture
By Gary Younge
Every day, on average, seven kids and teens are shot dead in America. Election 2016 will undoubtedly prove consequential in many ways, but lowering that death count won’t be one of them. To grapple with fatalities on that scale — 2,500 dead children annually — a candidate would need a thoroughgoing plan for dealing with America’s gun culture that goes well beyond background checks. In addition, he or she would need to engage with the inequality, segregation, poverty, and lack of mental health resources that add up to the environment in which this level of violence becomes possible. Think of it as the huge pile of dry tinder for which the easy availability of firearms is the combustible spark. In America in 2016, to advocate for anything like the kind of policies that might engage with such issues would instantly render a candidacy implausible, if not inconceivable — not least with the wealthy folks who now fund elections.
So the kids keep dying and, in the absence of any serious political or legislative attempt to tackle the causes of their deaths, the media and the political class move on to excuses. From claims of bad parenting to lack of personal responsibility, they regularly shift the blame from the societal to the individual level. Only one organized group at present takes the blame for such deaths. The problem, it is suggested, isn’t American culture, but gang culture.
Researching my new book, Another Day in the Death of America, about all the children and teens shot dead on a single random Saturday in 2013, I saw how often the presence of gangs in neighborhoods where so many of these kids die is used as a way to dismiss serious thinking about why this is happening. If a shooting can be described as “gang related,” then it can also be discounted as part of the “pathology” of urban life, particularly for people of color. In reality, the main cause, pathologically speaking, is a legislative system that refuses to control the distribution of firearms, making America the only country in the world in which such a book would have been possible.
Ivan Krastev writes: In our increasingly Anglophone world, Americans have become nakedly transparent to English speakers everywhere, yet the world remains bafflingly and often frighteningly opaque to monolingual Americans. While the world devours America’s movies and follows its politics closely, Americans know precious little about how non-Americans think and live. Americans have never heard of other countries’ movie stars and have only the vaguest notions of what their political conflicts are about.
This gross epistemic asymmetry is a real weakness. When WikiLeaks revealed the secret cables of the American State Department or leaked the emails of the Clinton campaign, it became a global news sensation and a major embarrassment for American diplomacy. Leaking Chinese diplomatic cables or Russian officials’ emails could never become a worldwide human-interest story, simply because only a relative handful of non-Chinese or non-Russians could read them, let alone make sense of them. [Continue reading…]
Although I’m pessimistic about the prospects of the meek inheriting the earth, the bilingual are in a very promising position. And Anglo-Americans should never forget that this is after all a country with a Spanish name. As for where I stand personally, I’m with the bilingual camp in spirit even if my own claim to be bilingual is a bit tenuous — an English speaker who understands American English but speaks British English; does that count?
Dahlia Lithwick writes: This past Tuesday marked the 25th anniversary of Anita Hill’s devastating Senate testimony accusing then–Supreme Court nominee Clarence Thomas of workplace sexual harassment. In light of the most recent accusations against Donald Trump, it’s hard to miss the almost perfect synchronicity between these two October explosions of gender awareness. In a deeply personal and visceral way, America is having another Anita Hill moment.
In one sense it’s depressing: It’s been 25 years, and yet here we are, still talking about whether a man who allegedly treats women like lifelong party favors should perhaps be disqualified from our highest governmental positions. But to despair that it’s gender Groundhog Day in America is to fundamentally miss the point: A lot has changed since October 1991, and American women are reaping the benefits of having gone through this looking glass once before. The nearly universal and instantaneous outrage at Trump’s comments and behavior — from the press, from GOP leaders, from really everyone outside of the Breitbart bubble? We have Anita Hill to thank for that.
It’s almost impossible for women like me, who came of age during the Thomas Senate battle, to miss the parallels between the two episodes. In both cases, powerful men allegedly mistreated and shamed women with less power than they had. In both cases these victims came forth reluctantly, and sometimes years later. In both instances, supporters of the man accused of misconduct argued that it was “just words,” or that it was all “years ago,” or that he was merely joking, or that it never happened at all. They argue that if the subordinate was soooo offended, why did she wait to complain? [Continue reading…]
The Washington Post reports: Three Kansas men were accused of plotting a bomb attack targeting an apartment complex home to a mosque and many Muslim immigrants from Somalia, authorities said Friday.
Curtis Allen, Gavin Wright and Patrick Eugene Stein face federal charges of conspiring to use a weapon of mass destruction, the Department of Justice announced Friday.
“These charges are based on eight months of investigation by the FBI that is alleged to have taken the investigators deep into a hidden culture of hatred and violence,” Acting U.S. Attorney Tom Beall said in a statement. “Many Kansans may find it as startling as I do that such things could happen here.”
According to the complaint, the investigation was prompted by a paid confidential informant who had attended meetings with a group of individuals calling themselves “the Crusaders,” and who heard the group discuss plans to attack Muslims, whom they called “cockroaches.” [Continue reading…]
Clint Smith writes: Recently, protesters and police clashed in the streets of Charlotte, North Carolina, following the killing of Keith Lamont Scott, a forty-three-year-old father of seven, who had recently moved to the city with his wife and family. Scott was shot by officers who were searching for a man with an outstanding warrant. Scott was not that man. Officer accounts claim that Scott had a handgun and refused to comply when he got out of his car. Other witnesses say that Scott was actually holding a book, as he often read while waiting for the bus to return his son from elementary school.
The footage from Charlotte reflected a scene that has become all too familiar over the past several years: police cocooned in riot gear, their bodies encased in bulletproof vests and military-style helmets; protesters rendered opaque by the tear gas that surrounds them, scarves covering their mouths and noses to keep from inhaling the smoke.
These protests happened because of Keith Lamont Scott, but they also happened because Charlotte is a city that has long had deep racial tensions, and frustration has been building for some time. There are many places one might look to find the catalyst of this resentment, nationally and locally. But one of the first places to look is Charlotte’s public-school system.
In 1954, the Supreme Court ruled in Brown v. Board of Education that “separate educational facilities are inherently unequal” and thus unconstitutional. The decision mandated that schools across the country be integrated, though, in reality, little actual school desegregation took place following the ruling. It took years for momentum from the civil-rights movement to create enough political pressure for truly meaningful integration to take place in classrooms across the country.
To understand what happened next, it helps to turn to a book published last year and edited by Roslyn Arlin Mickelson, Stephen Samuel Smith, and Amy Hawn Nelson, “Yesterday, Today, and Tomorrow: School Desegregation and Resegregation in Charlotte.” It uses essays by sociologists, political scientists, economists, and attorneys to illuminate how the city became the focal point of the national school-desegregation debate, with decisions that set a precedent for the rest of the country. [Continue reading…]
Maya Jasanoff writes: One hundred and fifty years after the Thirteenth Amendment abolished slavery in the United States, the nation’s first black president paid tribute to “a century and a half of freedom—not simply for former slaves, but for all of us.” It sounds innocuous enough till you start listening to the very different kinds of political rhetoric around us. All of us are not free, insists the Black Lives Matter movement, when “the afterlife of slavery” endures in police brutality and mass incarceration. All of us are not free, says the Occupy movement, when student loans impose “debt slavery” on the middle and working classes. All of us are not free, protests the Tea Party, when “slavery” lurks within big government. Social Security? “A form of modern, twenty-first-century slavery,” says Florida congressman Allen West. The national debt? “It’s going to be like slavery when that note is due,” says Sarah Palin. Obamacare? “Worse than slavery,” says Ben Carson. Black, white, left, right—all of us, it seems, can be enslaved now.
Americans learn about slavery as an “original sin” that tempted the better angels of our nation’s egalitarian nature. But “the thing about American slavery,” writes Greg Grandin in his 2014 book The Empire of Necessity, about an uprising on a slave ship off the coast of Chile and the successful effort to end it, is that “it never was just about slavery.” It was about an idea of freedom that depended on owning and protecting personal property. As more and more settlers arrived in the English colonies, the property they owned increasingly took the human form of African slaves. Edmund Morgan captured the paradox in the title of his classic American Slavery, American Freedom: “Freedom for some required the enslavement of others.” When the patriots protested British taxation as a form of “slavery,” they weren’t being hypocrites. They were defending what they believed to be the essence of freedom: the right to preserve their property.
The Empire of Necessity explores “the fullness of the paradox of freedom and slavery” in the America of the early 1800s. Yet to understand the chokehold of slavery on American ideas of freedom, it helps to go back to the beginning. At the time of the Revolution, slavery had been a fixture of the thirteen colonies for as long as the US today has been without it. “Slavery was in England’s American colonies, even its New England colonies, from the very beginning,” explains Princeton historian Wendy Warren in her deeply thoughtful, elegantly written New England Bound, an exploration of captivity in seventeenth-century New England. The Puritan ideal of a “city on a hill,” long held up as a model of America at its communitarian best, actually rested on the backs of “numerous enslaved and colonized people.” [Continue reading…]
The Atlantic reports: Black Americans are less likely to dial 911 immediately following, and for more than a year after, a highly publicized assault or death of a black person at the hands of police. That’s the conclusion of “Police Violence and Citizen Crime Reporting in the Black Community,” a study to be published in October’s American Sociological Review, the official publication of the American Sociological Association.
Three sociologists — Matthew Desmond at Harvard, Andrew Papachristos at Yale, and David Kirk at Oxford — screened and analyzed over 1.1 million 911 calls made to Milwaukee’s emergency dispatch between March 1, 2004 and December 31, 2010. They isolated and further analyzed some 883,000 calls in which a crime was reported within city limits in black, Latino, and white neighborhoods where at least 65 percent of residents fit the race category, per 2000 Census data. They chose those dates in order to study what, if any, impact the brutal beating of Frank Jude by several police officers might have had on residents dialing 911 for help. The effect they found was significant.
“Police misconduct can powerfully suppress one of the most basic forms of civic engagement: calling 911 for matters of personal and public safety,” the authors wrote in the study. The authors’ conclusions may also shed some light on the controversial “Ferguson effect,” that is, the idea that a rise in crime follows a high-profile incident of police brutality. [Continue reading…]
Charles M. Blow writes: Another set of black men killed by the police — one in Tulsa, Okla., another in Charlotte, N.C.
Another set of protests, and even some rioting.
Another television cycle in which the pornography of black death, pain and anguish is exploited for visual sensation and ratings gold.
And yes, another moment of mistakenly focusing on individual cases and individual motives and individual protests instead of recognizing that what we are witnessing in a wave of actions rippling across the country is an exhaling — a primal scream, I would venture — of cumulative cultural injury and a frantic attempt to stanch the bleeding from multiplying wounds.
We can no longer afford to buy into the delusion that this moment of turmoil is about discrete cases or their specific disposition under the law. The system of justice itself is under interrogation. The cultural mechanisms that produced that system are under interrogation. America as a whole is under interrogation. [Continue reading…]
The figures boggle the mind. Approximately 11 million Americans cycle through our jails and prisons each year (including a vast “pre-trial population” of those arrested and not convicted and those who simply can’t make bail). At any moment, according to the Prison Policy Initiative, there are more than 2.3 million people in our “1,719 state prisons, 102 federal prisons, 942 juvenile correctional facilities, 3,283 local jails, and 79 Indian Country jails as well as in military prisons, immigration detention facilities, civil commitment centers, and prisons in the U.S. territories.” In some parts of the country, there are more people in jail than at college.
If you want a partial explanation for this, keep in mind that there are cities in this country that register more arrests for minor infractions each year than they have inhabitants. Take Ferguson, Missouri, now mainly known as the home of Michael Brown, the unarmed black teenager shot and killed in 2014 by a town policeman. The Harvard Law Review reported that, in 2013, Ferguson had a population of 22,000. That same year “its municipal court issued 32,975 arrest warrants for nonviolent offenses,” or almost one-and-a-half arrests per inhabitant.
And then there are the conditions in which all those record-breaking numbers of people live in our jails and prisons. At any given time, 80,000 to 100,000 inmates in state and federal prisons are held in “restrictive housing” (aka solitary confinement). And those numbers don’t even include county jails, deportation centers, and juvenile justice institutions. Rikers Island, New York City’s infamous jail complex in its East River, has 990 solitary cells. And keep in mind that solitary confinement — being stuck in a six-by-nine or eight-by-ten-foot cell for 23 or 24 hours a day — is widely recognized as a form of psychosis-inducing torture.
And that, of course, is just to begin to explore America’s vast and ever-expanding prison universe. The fact is that it’s hard to fathom even the basics of the American urge to lock people away in vast numbers, which is why today TomDispatch regular Rebecca Gordon focuses instead on what it might mean for justice in this country if we started to consider alternatives to prison. Tom Engelhardt
There oughta be a law…
Should prison really be the American way?
By Rebecca Gordon
You’ve heard of distracted driving? It causes quite a few auto accidents and it’s illegal in a majority of states.
Well, this year, a brave New Jersey state senator, a Democrat, took on the pernicious problem of distracted walking. Faced with the fact that some people can’t tear themselves away from their smartphones long enough to get across a street in safety, Pamela Lampitt of Camden, New Jersey, proposed a law making it a crime to cross a street while texting. Violators would face a fine, and repeat violators up to 15 days in jail. Similar measures, says the Washington Post, have been proposed (though not passed) in Arkansas, Nevada, and New York. This May, a bill on the subject made it out of committee in Hawaii.
That’s right. In several states around the country, one response to people being struck by cars in intersections is to consider preemptively sending some of those prospective accident victims to jail. This would be funny, if it weren’t emblematic of something larger. We are living in a country where the solution to just about any social problem is to create a law against it, and then punish those who break it.
The Rev. Dr. William J. Barber II writes: Since a police officer shot and killed Keith Lamont Scott in Charlotte, N.C., on Tuesday afternoon, the ensuing protests have dominated national news. Provocateurs who attacked police officers and looted stores made headlines. Gov. Pat McCrory declared a state of emergency, and the National Guard joined police officers in riot gear, making the Queen City look like a war zone.
Speaking on the campaign trail in Pittsburgh on Thursday, Donald J. Trump offered a grave assessment: “Our country looks bad to the world, especially when we are supposed to be the world’s leader. How can we lead when we can’t even control our own cities?” Mr. Trump seems to want Americans to believe, as Representative Robert Pittenger, a Republican whose district includes areas in Charlotte, told the BBC, that black protesters in the city “hate white people because white people are successful and they’re not.”
But Charlotte’s protests are not black people versus white people. They are not black people versus the police. The protesters are black, white and brown people, crying out against police brutality and systemic violence. If we can see them through the tear gas, they show us a way forward to peace with justice. [Continue reading…]
Talbot Brewer writes: I don’t know how careers are seen in other countries, but in the United States we are exhorted to view them as the primary locus of self-realization. The question before you when you are trying to choose a career is to figure out “What Color is Your Parachute?” (the title of a guide to job searches that has been a perennial best seller for most of my lifetime). The aim, to quote the title of another top-selling guide to career choices, is to “Do What You Are.”
These titles tell us something about what Americans expect to find in a career: themselves, in the unlikely form of a marketable commodity. But why should we expect that the inner self waiting to be born corresponds to some paid job or profession? Are we really all in possession of an inner lawyer, an inner beauty products placement specialist, or an inner advertising executive, just waiting for the right job opening? Mightn’t this script for our biographies serve as easily to promote self-limitation or self-betrayal as to further self-actualization?
We spend a great deal of our youth shaping ourselves into the sort of finished product that potential employers will be willing to pay dearly to use. Beginning at a very early age, schooling practices and parental guidance and approval are adjusted, sometimes only semi-consciously, so as to inculcate the personal capacities and temperament demanded by the corporate world. The effort to sculpt oneself for this destiny takes a more concerted form in high school and college. We choose courses of study, and understand the importance of success in these studies, largely with this end in view.
Even those who rebel against these forces of acculturation are deeply shaped by them. What we call “self-destructive” behavior in high school might perhaps be an understandable result of being dispirited by the career prospects that are recommended to us as sufficient motivation for our studies. As a culture we have a curious double-mindedness about such reactions. It is hard to get through high school in the United States without being asked to read J.D. Salinger’s The Catcher in the Rye — the story of one Holden Caulfield’s angst-ridden flight from high school, fueled by a pervasive sense that the adult world is irredeemably phony. The ideal high school student is supposed to find a soul-mate in Holden and write an insightful paper about his telling cultural insights, submitted on time in twelve-point type with double spacing and proper margins and footnotes, so as to ensure the sort of grade that will keep the student on the express train to the adult world whose irredeemable phoniness he has just skillfully diagnosed. [Continue reading…]
Emma Green writes: In general, Americans do not like atheists. In studies, they say they feel coldly toward nonbelievers; it’s estimated that more than half of the population say they’d be less likely to vote for a presidential candidate who didn’t believe in God.
This kind of deep-seated suspicion is a long-standing tradition in the U.S. In his new book, Village Atheists, the Washington University in St. Louis professor Leigh Eric Schmidt writes about the country’s early “infidels” — one of many fraught terms nonbelievers have used to describe themselves in history — and the conflicts they went through. While the history of atheists is often told as a grand tale of battling ideas, Schmidt set out to tell stories of “mundane materiality,” chronicling the lived experiences of atheists and freethinkers in 19th- and 20th-century America.
His findings both confirm and challenge stereotypes around atheists today. While it’s true that the number of nonbelievers in the United States is growing, it’s still small — roughly 3 percent of U.S. adults self-identify as atheists. And while more and more Americans say they’re not part of any particular religion, they’ve historically been in good company: At the end of the 19th century, Schmidt estimated, around a tenth of Americans may have been unaffiliated with any church or religious institution.
As the visibility and number of American atheists has changed over time, the group has gone through its own struggles over identity. Even today, atheists are significantly more likely to be white, male, and highly educated than the rest of the population, a demographic fact perhaps tied to the long legacy of misogyny and marginalization of women within the movement. At times, nonbelievers have advocated on behalf of minority religious rights and defended immigrants. But they’ve also been among the most vocal American nativists, rallying against Mormons, Catholics, and evangelical Protestants alike.
Schmidt and I discussed the history of atheists in the United States, from the suspicion directed toward them to the suspicions they have cast on others. Our conversation has been edited and condensed for clarity. [Continue reading…]