
The Great Online School Scam


Noliwe Rooks | Excerpt from Cutting School: Privatization, Segregation, and the End of Public Education | The New Press | September 2017 | 18 minutes (5,064 words)

* * *

DeVos’s ties to—and support for—the profoundly troubled virtual school industry run deep.

In a 2013 interview with Philanthropy Magazine, DeVos said her ultimate goals in education reform encompassed not just charter schools and voucher programs, but also virtual education. She said these forms were important because they would allow “all parents, regardless of their zip code, to have the opportunity to choose the best educational setting for their children.” Also in 2013, one of the organizations that she founded, the American Federation for Children, put out a sharply critical statement after New Jersey’s school chief, Chris Cerf, declined to authorize two virtual charter schools. The group said the decision “depriv[es] students of vital educational options.” Yet another group DeVos founded and funded, the Michigan-based Great Lakes Education Project, has also advocated for expansion of online schools, and in a 2015 speech available on YouTube DeVos praised “virtual schools [and] online learning” as part of an “open system of choices.” She then said, “We must open up the education industry—and let’s not kid ourselves that it isn’t an industry. We must open it up to entrepreneurs and innovators.” DeVos’s ties to—and support for—the profoundly troubled virtual school industry run deep.

At the time of her nomination, charter schools were likely familiar to most listeners given their rapid growth and ubiquity. However, the press surrounding the DeVos nomination may have been one of the first times most people became aware of a particular offshoot of the charter school movement—virtual or cyber schools. Despite flying somewhat under the mainstream radar, online charter schools have drawn a wave of negative press and posted poor results in research studies. One large-scale study from 2015 found that the “academic benefits from online charter schools are currently the exception rather than the rule.” By June of 2016, even a group that supports, runs, and owns charter schools published a report calling for more stringent oversight and regulation of online charter schools, saying, “The well-documented, disturbingly low performance by too many full-time virtual charter public schools should serve as a call to action for state leaders and authorizers across the country.” The jointly authored research was sponsored by the National Alliance for Public Charter Schools, the National Association of Charter School Authorizers, and 50Can, all groups that lobby state and federal agencies to loosen regulations to allow more robust charter-school growth. As one of the report’s backers said, “I’m not concerned that Betsy DeVos supports virtual schools, because we support them too—we just want them to be a lot better.” Such an upswing in quality seems highly unlikely to happen anytime soon. Virtual schools are yet another trickle in the stream of apartheid forms of public education flowing down from the wealthy and politically well connected to communities that are poor, of color, or both.

In Pennsylvania, Michigan, South Carolina, Ohio, and Florida, poor students from rural areas as well as those in underfunded urban schools that primarily educate students who are Black and Latino today face a new response to the question of how to solve the riddle of race, poverty, and educational underachievement. Increasingly, despite little supporting evidence, a growing number of states and local school districts no longer believe that the solution is merely about infrastructure, class size, funding, or hiring more teachers. In states with high levels of poverty and “hard to educate” Black and Latino students, virtual schools are on the rise. Such schools are not growing nearly as fast in school districts that are white and relatively wealthy, nor are they the educational strategy of choice in most private schools. As much a business strategy as one promoting learning, virtual education allows businesses to profit from racial inequality and poverty. Sadly, this particular cure to what ails our education system more often than not exacerbates the problems.

* * *

The very nature and meaning of education underwent a change.

Though supported by Democrats as well, the expansion of virtual charter schools accelerated as Republicans increased their margin of control in governor’s mansions and state legislatures across the country. At the start of 2016, Republicans occupied thirty-two of the nation’s fifty governorships, ten more than they did in 2009. During that same period, Republican control of state legislatures doubled, which means that by 2016 Republicans controlled more legislative chambers than at any point in their party’s history. The same political winds that blew so many Republicans into office have also pushed virtual education to the forefront of educational policy for a certain segment of our nation’s youth.

In December of 2015, Congress sent the long-awaited overhaul of the federal government’s education bill to the White House for President Obama’s signature. Called the Every Student Succeeds Act, the new bill updated the previous education act, No Child Left Behind, signed into law in 2002 by President George W. Bush. When introducing that earlier act, Bush said that it was a means for our nation’s schools to begin to seriously combat what he termed “the soft bigotry of low expectations” that had so often stood in the way of ensuring the success of America’s children who were poor, of color, special needs, or in any way struggling to achieve academically. President Bush promised the nation that No Child Left Behind would ensure that by 2014, 100 percent of all public school children could perform at grade level as measured by standardized tests in math and reading. However, by 2012, President Obama’s administration and much of the rest of the country realized we as a nation were far from achieving the previous president’s promised outcomes. Obama’s education bill made no grand promises such as those found in the previous law, and in many ways was most notable for the fact that, unlike No Child Left Behind, with its push to give the federal government authority to prescribe and enforce educational standards, curriculum, and consequences from Washington, the Every Student Succeeds Act by and large returned such matters to the control of the states and local governments.


Protestors demonstrate as Education Secretary Betsy DeVos speaks at Harvard University on September 28, 2017. Photo: Getty Images.

Proposals for addressing issues of racial equity and fairness have not tended to benefit the poor and non-white when left to states to decide. In regard to educational equality, according to American Bridge 21st Century, a progressive watchdog organization that tracks Republican officeholders, beginning in 2010 the Republican candidates for governor in Wisconsin, Michigan, Pennsylvania, Kansas, and Ohio rode a cresting wave of Tea Party support to elected office. They all campaigned on promises of massive budget overhauls that would cut taxes for both wealthy individuals and businesses. Once elected, in order to deliver on those promises, one governor after another eyed the funding pots set aside for public education as a way to pay for their budget priorities. By 2014, the legislative overhaul was complete, and the impact of the electoral shifts meant that changes were not just a matter of states’ and local governments’ agreeing about the best ways to fund the education of children who were poor and non-white. No, with this shift in the political landscape, the very nature and meaning of education underwent a change as well. Republican governors hewed closely to their party platform on education, which, in sum, aimed to shrink federal oversight of education; increase parental choice and flexibility; allow federal dollars to follow children to the school of their choice; expand school choice by increasing the number of charter schools; return greater control to parents, teachers, and school boards; and defend and increase options for home schooling. In regard to education, the shift was away from government regulation of public schools even while the proposed alternatives required taxpayer dollars.

As educational historian Diane Ravitch has pointed out, the educational reforms championed by these legislators seek to

eliminate the geographically based system of public education as we have known it for the past 150 years and replace it with a competitive market-based system of school choice—one that includes traditional public schools, privately managed charter schools, religious schools, voucher schools, for-profit schools, virtual schools and for-profit vendors of instruction. Lacking any geographic boundaries, these schools would compete for customers.

Virtual schools and their growth in both number and significance were not part of the public discussion surrounding the repeal of No Child Left Behind. Still, in reality, as it became more and more clear that the new education bill would pass, interest groups, state legislatures, and educational nonprofits interested in the lucrative, easy expansion of virtual education all sprang into action. Many knew that states would now have more power to propose the expansion of such schools, allegedly as a way of confronting the challenges facing states in need of creative ways to address the educational deficiencies of their lowest-achieving students, who were usually poor and often of color.

In April 2015, the Alabama State Legislature passed State Bill 0072, known as the Virtual School Options for Local Boards of Education Bill. It required that, “at a minimum, each local board of education” adopt a policy for providing a “virtual option for eligible students in grades 9–12.” In Maine, two virtual academies opened in 2014 specifically to educate students in the state’s “poorer districts,” and in May of 2015 the state explored the feasibility of opening a state-run virtual academy. In June, state education officials in Illinois announced that they would begin to test limited online learning options during snow days for a three-year period. Following their analysis at the end of the test period, they held open the possibility of more wide-ranging implementation. The Virginia Department of Education piloted a new program during the 2015–2016 school year in which students spent 100 percent of their time in a cyber school, never setting foot inside a school building. This flurry of activity was brought on by the ease and speed with which these new computer-based schools could expand and by the fact that the financial rewards were simply too great for cash-strapped politicians to ignore.

By 2015 virtual schools had gone mainstream, aided in part by the fact that between 2008 and 2014, 175 bills that expanded online schooling options passed in thirty-nine states and territories (including the District of Columbia). As a result, today there are public schools in every state that offer some form of online coursework, and in five states—Alabama, Arkansas, Florida, Michigan, and Virginia—students are actually required to take at least one online or partly digital class if they want to graduate from high school. In thirty states and Washington, D.C., there are fully online schools available.

* * *

Between 2011 and 2014, 100 percent of the children enrolled in Philadelphia-area cyber schools who took state achievement tests failed.

Despite all of the state- and federal-level support for these new education methods, this growth has happened without comparable verification of the efficacy of virtual education. Over the past decade, we have heard more about the failure of Massive Open Online Courses, or MOOCs, to make a memorable impact on the style in which college instruction is delivered than we have about the success or failure of virtual education at the K–12 level. Colleges might get all the attention for going online in part because big brand names like MIT and Harvard now offer virtual courses for free around the world, but K–12 online schooling has, over the past twenty years, become a major player in the educational arena. As a result of MOOCs and other developments overshadowing this conversation, little attention has been paid to where in the country this profitable switch has grown most rapidly: areas with high levels of racial and economic inequality fueled by segregation. In districts that are rural and poor, overwhelmingly in states with Republican governors and legislatures, such as Florida, Alabama, Mississippi, and South Carolina, or in urban districts like Philadelphia, virtual schools are quickly becoming the format of choice despite politicians’ having little grasp on how cyber education impacts achievement for the most vulnerable students.

According to a 2012 Philadelphia Citypaper article, “Who’s Killing Philly Schools,” in a district composed of 80 percent Black and Latino students, the vast majority of whom live below the poverty line, cyber schools accounted for fully educating more than a third of the children in 2014. The goal is for that number to rise to at least three-quarters, if not more, in subsequent years. However, between 2011 and 2014, 100 percent of the children enrolled in Philadelphia-area cyber schools who took state achievement tests failed. The record of Pennsylvania’s fourteen cyber charter schools was so abysmal that the state denied all applications to open new cyber charter schools in 2013 and 2014. Their poor track record has not derailed the long-term plan of increasing the number of students who take classes via virtual education, however.


Cyber education grew during the term of Pennsylvania’s Republican governor Tom Corbett. Elected in 2010 and serving only a single four-year term, Corbett pursued policies that exemplified the organized effort by political and business leaders to spearhead the shift to online learning. His first step was taking funding away from “brick and mortar” schools. In his first full year in office, he cut funding to the Philadelphia School District by $198 million, a 20 percent cut. Then, in 2011, he reduced public school funding by another $900 million, or 10 percent. Those cuts, plus more the next year, meant that by 2013 thousands of teachers had been laid off, almost 70 percent of Pennsylvania’s school districts had increased their class sizes, 40 percent had cut extracurricular activities, and 75 percent had cut instruction. The impact of his educational leadership was devastating for schools and communities throughout the state, and districts targeted by Corbett’s cuts felt them intensely. Dozens of schools closed. Thousands of teachers and school support staff were laid off. Art and music became scarce, along with nurses and guidance counselors. School buildings became so unpleasant that virtual education almost seemed like a respite.

In one particularly telling real-world example from 2016 of how such cuts affected students in the “brick and mortar” district, high school junior Jameria Miller talks about why she starts every morning running through the school to get a good seat near the front of the room in her first-period Spanish class. It’s not because she is just excited about the class. It’s because the school is cold. As she explains, “The cold is definitely a distraction. We race to class to get the best blankets.” What she means is that because the classroom where she begins her day has uninsulated metal walls, Jameria’s teacher hands out blankets to the students on a first come, first served basis. It’s the only way for them to stay warm. Miller’s school in the William Penn District is situated in Philadelphia’s “inner-ring” suburbs and serves a student body that is majority Black and overwhelmingly impoverished. Though concentrating in the cold is hard enough, Miller says the hardest part of her daily ordeal is the knowledge that life isn’t like this for students in other districts. She means students in wealthier districts. “It’s never going to be fair, they’re always going to be a step ahead of us. They’ll have more money than us, and they’ll get better jobs than us, always.” She says she doesn’t believe that either funding or systemic school improvement will ever truly equalize: “What I’m about to say might not be very nice, but rich people aren’t going to want [funding fairness]. They want their kids to have better things so that their kids can get a jump start in life and be ahead of everyone else. And, as long as people feel that way, we all won’t be equal. We won’t receive equal education ever, because education is what gets you success.” Her district is not the only one in Pennsylvania so affected.

* * *

The companies that run online and virtual schools are also consistently accused of financial impropriety.

Since 1998, the schools of Philadelphia—the state’s largest city and school district—have been run by the governor-appointed School Reform Commission. In the summer of 2013, in order to address a $350 million hole in its budget, the commission passed what was termed a “doomsday budget.” Thirty schools were closed that year. In 2014, Thomas Knudsen, chief recovery officer for the School Reform Commission, told a reporter writing a story for Salon that the commission wanted to “close 40 schools and an additional six every year thereafter until 2017.” At that point, he believed the district, which at its height had over 180 schools, would be down to 20 to 30, and those would be placed into “achievement networks” where public and private groups would compete to manage them. This news led the Salon reporter to describe what once was our nation’s tenth-largest school district as being in its “death throes.” The district wasn’t to be saved, or even managed, by the Reform Commission as much as dismantled.

Despite such severe money worries and their negative impact on “brick and mortar” schools, the virtual education sector thrived during those same years. According to a 2011 New York Times article, five Pennsylvania cyber charters received $200 million in tax money in 2010–2011, and Agora Cyber Charter, which is run by the for-profit company K12, took in $31.6 million in 2013 alone from state taxpayers in Philadelphia. By 2015, cyber schools received over $60 million in per-student payments from the chronically starved and often bankrupt school districts. To make matters worse, the companies that run online and virtual schools are also consistently accused of financial impropriety. In 2011, the New York Times conducted a months-long investigation into virtual schools. By way of summing up its overall findings, the article begins by showing how, in the realm of education, what is good for business is not necessarily good for the students those businesses claim to educate:

By almost every educational measure, the Agora Cyber Charter School is failing. Nearly 60 percent of its students are behind grade level in math. Nearly 50 percent trail in reading. A third do not graduate on time. And hundreds of children, from kindergartners to seniors, withdraw within months after they enroll. By Wall Street standards, though, Agora is a remarkable success that has helped enrich K12 Inc., the publicly traded company that manages the school. And the entire enterprise is paid for by taxpayers.

The amount of money involved, as well as the potential profit, is significant.

One of the largest companies providing virtual education, Agora Schools, was on track to earn $72 million in 2011, a number it bested each succeeding year through 2014, when, out of Agora’s $849 million in profit, $117 million came from its virtual schools division. And those profits are for just one company, in just one area of a crowded field of online education providers. In order to help build a market for their services, these companies often target children via huge advertising buys on Nickelodeon and Cartoon Network, as well as on teen sites such as MeetMe.com and VampireFreaks.com.

In addition to its seeming inability to properly educate students and the unsavory targeting of children with its product, it is worth noting that Agora’s parent company, K12 Inc., was founded by a man who had served federal time for financial improprieties. His name is Michael Milken. Milken not only came to symbolize 1980s-era Wall Street greed and excess by serving as the inspiration for the Michael Douglas character Gordon Gekko in the 1987 movie Wall Street; he also spent almost two years in a federal penitentiary for securities fraud. Once released from prison, he joined forces with another junk bond dealer, Ron Packard, who specialized in mergers and acquisitions for Goldman Sachs in the 1980s. Together they invested $10 million in K12 Inc. and formed a company with the goal of profiting from the $600 billion public education “market.” The company has been dogged by allegations of financial impropriety. In 2012, K12 settled a federal lawsuit for $6.8 million; the suit alleged its executives inflated stock prices by misleading investors with false student-performance claims. In the summer of 2016, the company agreed to pay $168.5 million to settle alleged violations of California’s false claims, false advertising, and unfair competition laws, though the company admitted no wrongdoing. No matter; by 2016 Milken had a net worth of around $2.5 billion, according to Forbes—almost all of that money from contracts with public schools.


Protestors demonstrate against President Donald Trump outside the Kluczynski Federal Building in the Loop on January 31, 2017 in Chicago, Illinois. The protestors were unhappy with Trump’s cabinet selections and the new restrictions on immigrants entering the country. They were also calling on Illinois’ senators to vote to block Trump’s appointment of Betsy DeVos for education secretary. Photo: Getty Images.

Pennsylvania has the second-highest cyber charter enrollment (after Ohio), and it accounts for about 17 percent of the national cyber charter school population, which across the country numbers over 220,000 students. In terms of instruction, students usually take lessons at home, so the virtual school operators have no classrooms to maintain, staff to hire, or heating bills to pay. Teachers are paid less, and student-teacher ratios are massive, sometimes as high as fifty students for each teacher. But, despite the widespread belief in their affordability, in Pennsylvania the district pays cyber schools as much per child as it pays to educate students in brick-and-mortar schools. In 2016, most of Pennsylvania’s cyber schools had dismal results. According to the state’s School Performance Profile website, only three—21st Century, PA Cyber, and PA Virtual—had a score above 60. The state considers 60 and below to be substandard. None of the cyber schools scored higher than 70, which is the state’s minimum passing score for all schools, and some cyber schools in the study scored down in the 30s. Such schools are neither inexpensive nor effective, yet they continue to expand.

* * *

 The undereducation of the poor and people of color is a business opportunity.

In addition to questions about how effective cyber schools are in terms of a return on investment for taxpayer dollars, an issue of particular concern is the sector’s emphasis on serving so-called high-risk students who don’t have the parental and other support structures that research shows are necessary to make the most of the model. Poor, rural, and urban districts are prime candidates, since cyber educators have explicitly stated that it is their business strategy to go after kids who—because it is believed that they do not have motivated parents—would demand the least from their educational experience. Students in foster care and Native Americans schooled on their tribal homelands are two categories of students targeted by virtual school providers in Florida.

Targeting the most economically vulnerable students ultimately yields cyber education businesses increased profits resulting from the segrenomics of apartheid schools. The undereducation of the poor and people of color is a business opportunity that generates great profit for businesses but provides little in the way of quality instruction.

With this in mind, it is important to take a few steps back and at least notice that, despite the near-universal enthusiasm for projects that give technology to educationally vulnerable poor children of color, computer-aided instruction, when not deployed in an informed, responsible manner, actually widens the gap between the financially and educationally privileged and everyone else. Nonetheless, over the past ten years, public school districts have invested millions of dollars in various types of online and computer-aided learning and instruction programs, and few are able to show the educational benefit of their expenditures for a majority of students. Those who benefit most are already well organized and highly motivated. Other students struggle and, according to researchers studying students in a variety of digital settings, might even lose academic ground.

Supporters of online learning say that all anyone needs in order to access a great education is a stable Internet connection, but only 35 percent of households earning less than $25,000 have broadband Internet access, compared with 94 percent of households with income in excess of $100,000. In addition, according to the 2010 Pew Report on Mobile Access, only half of Black homes have Internet connections at all, compared with almost 65 percent of white households. In its 2016 report on Internet usage, Pew related that a whopping 94 percent of Latinos used mobile phones to access the Internet, generally a much more expensive and less-than-ideal (if not altogether ineffectual) method for taking part in online education. In short, this type of educational instruction, though on the rise, may leave wanting the very students who most need public education while at the same time offering the businesses providing Internet access an opportunity to reap significant rewards.


Simone Gewirth is dragged away to be arrested after sitting in the lobby of the Kluczynski Federal Building protesting President Donald Trump on January 31, 2017 in Chicago, Illinois. Seven people were arrested inside the building while protesting Trump’s cabinet selections and the new restrictions on immigrants entering the country. They were also calling on Illinois’ senators to vote to block Trump’s appointment of Betsy DeVos for education secretary. Photo: Getty Images.

As but one example of how touting the benefits of cyber education goes hand in hand with profits for businesses, we need look no further than South Carolina. There, the growth of cyber education got a significant boost in 2014 when the governor at the time, Nikki Haley, announced a new education budget and asked the state legislature for tens of millions of dollars to provide WiFi service to rural schoolchildren. It was a precursor to the expansion of virtual education. Once approved, the 2015 education budget provided “$29 million for improving bandwidth to school facilities, bolstering wireless connectivity within school walls, and furthering the push to ensure that every student has a computer or tablet.” These changes were enacted because, according to the press release announcing the allocations, “modernizing technology and improving bandwidth will give students greater access to educational content and will help improve critical computer skills their future employers will demand.” An additional $4 million was also provided for teacher technology training. Schools facing difficulties hiring could also offer courses in a “blended” setting, with students being taught online while sitting at a desk in their traditional school. Schools pay $3,500 for an entire classroom to take a virtual course—far less than the cost of a teacher. The allocation of those funds also set the stage for the aggressive expansion of online learning to a cohort of students who would benefit most from high-quality, in-person instruction.

The same year that Governor Haley released her technology-enhancing budget, the National Education Policy Center issued a 2015 report finding that, “despite the considerable enthusiasm for virtual education, there is little credible research to support virtual schools’ practices or to justify ongoing calls for ever greater expansion.” Though the authors concede that the available data are limited, which may make their findings less than definitive, “there is not a single positive sign from the empirical evidence presented here.” Nonetheless, Governor Haley and others like her insisted that this particular form of free-market public education would help the state’s children who were poor and without high-quality schools. She told the state’s citizens that an increase in cyber education was, in the language of school choice advocates, tantamount to “taking a stand against the idea that where you are born and raised should influence the quality of your education.”

* * *

These types of educational arrangements simply do not take place in districts that are wealthy.

Black and Latino children and their parents and communities have reason for concern about the rapid and unchecked growth in cyber education. It tends to impact them most. When Florida’s severe budget cuts in 2011 made it difficult for schools to meet class-size rules and left them too cash-strapped to hire more teachers, some schools in the Miami-Dade district required seven thousand of their students to take online classes in virtual labs with only noncertified teacher’s aides available to provide assistance. Students did not know of these new arrangements until they showed up for school one day, and parents were neither asked about nor informed of this change. Of the district’s roughly 344,000 students, 324,000 are Black or Latino. These types of educational arrangements simply do not take place in districts that are wealthy and have low numbers of students of color.

It is then surprising that, to a large extent, the success of the shift to digital learning has been aided by rhetoric that positions education as a basic right of citizenship, a civil rights mountain still in need of scaling. Nonetheless, to refer back to Stephanie Mencimer’s Mother Jones piece, “beneath the rhetoric, the online-education push is also part of a larger agenda that closely aligns with the GOP’s national strategy: It siphons money from public institutions into for-profit companies.” She continues to say that the tangible result of such efforts is to undercut “public employees, their unions, and the Democratic base. In the guise of a technocratic policy initiative, it delivers a political trifecta—and a big windfall for Bush’s corporate backers.” What it rarely delivers is a quality education, never mind one that comes close to the sort found in the wealthy, white school districts to which the Bush and DeVos families would send their own children.


People with disabilities, parents, caregivers, teachers, and allies gathered at the clock in Grand Central Station to protest the nomination of Betsy DeVos as secretary of education. February 5, 2017. New York, New York. Photo: Getty Images.

In many virtual school settings, students rarely even hear or see their teachers. At some cyber charter schools, students need only sign in to the school website and/or communicate with a teacher once every three days to prove they’re actually attending. In Wisconsin, a state legislative audit found that 16 percent of the virtual teachers surveyed had contact with individual students as little as three times a month. Other schools in the state outsource duties such as paper grading to contractors in India, making it difficult for the teachers to meaningfully explain to students the basis for the grades they received. While virtual education is a growth industry in Wisconsin, it is important to note that the state has the largest achievement gap between Black and white students in the country and ranks last in reading-comprehension tests among Black fourth graders. Milwaukee, the largest city in the state and home to the highest number of Black students, is the biggest contributor to Wisconsin’s racial achievement gap. Four out of five Black children in Milwaukee live in poverty.

While much of the sector’s growth can be seen as being tied to states with Republican governors and legislatures, it was greatly aided in 2013 by the Obama administration when it launched the ConnectED Initiative, a five-year plan to connect nearly all U.S. students to high-speed wireless systems in their schools and libraries, earmark funds to train teachers to incorporate digital technology and devices into their lesson plans, and “unleash private sector innovation” in order to make it easier for educational technology providers to offer personalized educational software, online education opportunities, and online textbooks to entire school districts. If such policies and practices actually worked to educate students who are undereducated, that might not be cause for concern. However, given all of the information that we have, we must conclude that they do not. It is then hard to understand why there is such a push to expand them. At the very least, it would make sense to also expand policies that would make it possible for schoolchildren who attend schools that lack heating in the winter to sit comfortably in their classrooms without resorting to huddling under a blanket.

* * *

Copyright © 2017 by Noliwe Rooks. This excerpt originally appeared in Cutting School: Privatization, Segregation, and the End of Public Education, published by The New Press, and is used here with permission.

Editor: Dana Snitzky


The Way We Treat Our Pets Is More Paleolithic Than Medieval


John Bradshaw | Excerpt adapted from The Animals Among Us: How Pets Make Us Human | Basic Books | October 2017 | 18 minutes (4,861 words)

 

We have no direct evidence proving that people living prior to 10,000 BCE had pets. Any kept by hunter-gatherers must have included species tamed from the wild, which would leave little archaeological evidence: their remains would be impossible to distinguish from those of animals killed for food or kept for other — perhaps ritualistic — purposes.

Since we don’t have evidence from the prehistoric past, we must look to that gleaned from the past century. A remarkable number of hunter-gatherer and small-scale horticultural societies that persisted into the nineteenth and twentieth centuries in remote parts of the world — Amazonia, New Guinea, the Arctic, and elsewhere — give us insight into the behaviors of earlier Stone Age societies. We can start by asking whether hunter-gatherers already kept pets when they were first documented, before they had time to acquire the habit from the West.

It turns out that many small-scale “Paleolithic” societies kept pets of some kind: sometimes dogs, but mostly tamed wild animals, captured when young and then brought up as part of the human family. Native Americans and the Ainu of northern Japan kept bear cubs; the Inuit, wolf cubs; the Cochimi from Baja California, racoons; indigenous Amazonian societies, tapir, agouti, coati, and many types of New World monkeys; the Muisca of Colombia, ocelots and margays (two local species of wild cat); the Yagua of Peru, sloths; the Dinka of the Sudan, hyenas and Old World monkeys; native Fijians, flying foxes and lizards; the Penan of Borneo, sun bears and gibbons.

To this list of pets we can add a host of bird species, valued as pets from Brazil to Mali to China. Many have particularly bright plumage, such as parrots, parakeets, and hornbills; others, such as the bulbul, sing. Selection of some — such as the cassowaries, large flightless birds, cherished by the original inhabitants of New Britain (part of New Guinea), and the pigeons kept as pets in Samoa — seems to have been more arbitrary. Nowadays, the availability of Western domesticated animals has reduced some of this diversity, but “traditional” societies, from the Toraja in the mountains of Indonesia to the Tiv of West Africa, still widely treasure animal companions.

* * *

While traditional cultures do keep an extraordinarily wide variety of animals, a recent survey of sixty such societies finds that dogs and cats are nonetheless the most ubiquitous. This preference is clearly not traditional in most cases, since dogs and cats arrived in most parts of the world very late. Dogs were almost certainly domesticated (from the Eurasian wolf) by one or possibly several hunter-gatherer societies several thousand years before the dawn of agriculture and then gradually spread throughout much of the globe.

Because both dogs and cats have practical uses besides companionship, their status is not always easy to determine, given the cultural and linguistic barriers that often exist between Western researchers and traditional peoples. The survey found that about one-third of the groups in which dogs occurred treated them as pets; another third did not regard them with affection but simply used them as guards or for some kind of work. As expected, those groups that had cats regarded them as useful for the control of vermin, and two out of three such societies thus expected them to find their own food. In the others, however, certain individuals (“owners”) deliberately fed at least some of the cats and treated them as pets.

Orphaned baby monkeys and other young mammals and birds brought home from hunting expeditions are not merely tamed; they are adopted as members of an extended family with both human and animal members and fed choice fruits.

While widespread in these traditional societies, cats, dogs, and other familiar domestic animals represent only a minority of the vast range of species kept as pets. The survey recorded many kinds of tamed mammal, including primates of various kinds, foxes, bears, prairie dogs, and ground squirrels. Over a quarter of the societies also kept birds, which were even more varied than the mammals, including eagles, ravens, parrots, macaws, hawks, and pigeons. Although evidently valued for their appearance, most of the bird species kept had higher than average intelligence (for birds). Many clearly formed lasting relationships with humans: for example, the Yanomamo of South America taught their parrots to talk. Overall, the birds more obviously served purely as pets than most of the mammals: almost all received most of the food they needed, and many functioned as playthings for children.

Fish are the only class of pets almost entirely missing from traditional societies, presumably because appreciating them requires glass for aquaria. An exception: the Polynesians of Samoa capture and then tame eels, keeping them in holes in the ground and whistling to call them to the surface.

* * *

Hunter-gatherers have a much more complex relationship with animals than we in the West do today. In their societies, animals serve both an essential function (as food) and a symbolic one. For example, caring for an orphaned baby animal may represent atonement for the harm done to its kin through the hunt. The Huaorani, an Amazonian people living in Ecuador, adopt baby monkeys and other jungle animals. When hunting adult monkeys, they use blowguns to shoot poison-tipped darts, then attribute the animal’s death not to the dart but to the plant from which they extracted the poison, as if to distance themselves from the deed. After killing a female monkey, they attempt to capture any young still dependent upon her. Orphaned baby monkeys and other young mammals and birds brought home from hunting expeditions are not merely tamed; they are adopted as members of an extended family with both human and animal members and fed choice fruits. Tame harpy eagles partake of the meat from adult monkeys killed in hunting expeditions. On their deaths, these animals receive ceremonial burials.

Such rituals occurred in the Paleolithic era. In one particularly striking example from 16,000 years ago in Jordan, archaeologists discovered the skull of a red fox buried next to the remains of a woman on top of a layer of red ochre, a pigment of special value and ritual significance. Even more remarkably, this was likely a reburial of both human and fox, since some parts of their skeletons remain in another grave close by. Whoever moved the body of the woman apparently knew of her relationship with the fox and moved the animal’s most obvious remains — notably the skull — with her. It seems implausible that the fox died coincidentally at the same time as the woman; rather, it was almost certainly killed when she died, presumably so it could accompany her to the afterlife. Though we may deem this unnecessarily cruel, it does indicate a very special relationship between the two. We will never know precisely what that relationship was, but as no evidence exists of domestication of foxes there or anywhere else for another 12,000 years, we can reasonably assume that this fox had been obtained from the wild as an unweaned cub, in the style of other Paleolithic pets. Red foxes, tamed or otherwise, have no known use in hunter-gatherer societies except as sources of fur, so this individual was likely no more and no less than a much-loved companion.

Beliefs about animals form an essential part of the spiritual life of such small-scale societies. The role of the animal, and therefore its treatment, can vary widely, depending on a given society’s traditions. For example, in Amazonia, the Aché make pets of coatis (group-living racoon-like carnivores), believing that their wild relatives transport human souls to the land of the dead. By contrast, the neighboring Arawete believe that coatis feed on human corpses. They not only do not keep them as pets but set fires around newly dug graves to drive any nearby coatis away.

* * *

The path from wild wolf to domestic dog, the first domestic species, cannot have been straightforward and was likely not deliberate: no precedent would have existed for the idea that a wild animal could reproduce in captivity. The current prevailing theory holds that dogs domesticated themselves, descending from an unusual type of wolf that no longer exists in the wild. These wolves would have differed from their wary modern-day counterparts in being sufficiently tolerant of humans to spend much of their time scavenging around their camps. Adopting their cubs as pets would then have been easy, beginning the process of genetic selection toward tameness and, eventually, trainability. Initially these early “proto-dogs” might have served as early-warning devices (accounting for why dogs are much more prone than wolves to bark) and possibly as waste disposals. Not until they had become capable of forming social bonds with humans would they have been sufficiently controllable for useful service on hunting expeditions. Although utility would have provided the usual motivation for keeping a dog in the Paleolithic, that usefulness would have stemmed from the ties of affection. The slavering, perpetually chained guard dog aside, dogs’ effectiveness stems from their attentiveness to and desire to please people. The loved puppy that has become well adjusted to the humans with whom it will spend the rest of its life will be the most attentive and easiest to train. Unlike with most of the other domestic animals that came along later, the relationship between dog and master is fundamentally an emotional one.

 In four cases, the dogs were buried alongside people, though most had their own graves, mainly at the edge of the cemetery in an area where the graves of children were also concentrated — as if dogs and children were somehow considered equivalent.

The enormous number of dog burials unearthed from the period between 14,000 and 4,000 years ago indicates the esteem preliterate societies had for dogs. One of the earliest and perhaps best known comes from the upper Jordan Valley, where archaeologists discovered the skeleton of an elderly human (although both skeletons were mostly well preserved, the pelvis of the human was too badly damaged to determine its sex), with its hand resting on the chest of a puppy, which may have been killed for burial. The Natufians, the culture that buried this pair some 12,000 years ago, were on the cusp of the transition from hunter-gatherers to settled agriculturalists. The positioning of the two skeletons strongly suggests a very close and affectionate relationship between the human and the animal — as if the puppy was intended to accompany its owner into the afterlife.

Some dog burials formed part of a sacrificial ritual. The ancient Egyptians, notorious for breeding, killing, and mummifying domestic cats by the millions, did the same to dogs, though to a lesser degree. At roughly the same time, some 2,500 years ago, the Persians living in today’s southern Israel created vast dog cemeteries. Archaeologists have excavated over 1,200 dogs and puppies from one site at Ashkelon, concluding that the majority were not pets but “feral” street dogs, many of which apparently died from natural causes. No written records indicate the spiritual significance of these interments or why the Persians had stronger feelings for strays than for their own dogs. During the millennium before the birth of Christ, conceptions of dogs evidently varied widely: it is thought that the Hittites, who inhabited what is now eastern Turkey, attached special healing powers to puppies, both living and deliberately sacrificed. Dating to 1,000 years earlier, one graveyard in China contained over four hundred dogs, each beneath a human, indicating that these dogs had been killed when their masters died and interred with them. In the oldest dog cemeteries discovered in the United States, in Tennessee’s Green River valley, some dogs were buried alone, while others were buried alongside people.

The details of the interment sometimes enable us to guess at why a particular dog was buried. At one of the earliest European sites, Skateholm in Sweden, archaeologists discovered fourteen dog graves dating to around 6,000 years ago. In four cases, the dogs were buried alongside people, though most had their own graves, mainly at the edge of the cemetery in an area where the graves of children were also concentrated — as if dogs and children were somehow considered equivalent. At least one dog had received an elaborate burial, its grave strewn with ochre. Alongside it lay grave goods, usually only found in human burials, precious items provided to accompany the animal to the afterlife. In this case these included three knives made from flint and an elaborately decorated hammer made from a red deer’s antler.




While dog burials continued throughout recorded history, the practice seems to have diminished as societies became more settled and adopted agriculture, with few recorded in Europe since the end of the first millennium. One theory holds that many of these more recent European burials reflect the special relationship between dog and hunter rather than pet and devoted owner.

The majority of dogs given special burials may have been, rather than pets first and foremost, either favored hunting companions needed by their masters in the afterlife or unowned dogs sacrificed for some spiritual or superstitious purpose. A few examples do point, however, to a primarily affectionate relationship. One of the dogs buried some 7,000 years ago at an ancient cemetery in Anderson, Tennessee, had suffered several injuries during its lifetime, each of which had healed. This dog had grown old enough to suffer from arthritis and, for at least the last few years of its life, would have made a poor hunting partner. This suggests that the dog’s owner took care of it out of pure affection.

Although women generally played a lesser role in hunting than men, dogs were interred with them too. One such burial from just over 4,000 years ago, found in today’s United Arab Emirates, is remarkably reminiscent of the Natufian grave from some 8,000 years previous. Dating from roughly the same era but halfway across the world, in Indian Knoll, Kentucky, one graveyard contained six dogs buried with women, six with men, and eight buried alone. Dogs buried with women would most plausibly have been pets.

Other signs suggest that in some cultures dogs were becoming, if not pets as we think of them today, then at least creatures with personalities of their own. Around 3,000 years ago, the Egyptians buried some dogs in a manner indicating that they were treasured more for their companionship than for their practical uses. The hieroglyphs on their gravestones tell us that the Egyptians gave some of their dogs human, rather than distinctively animal, names, echoing the replacement of “Fido” and “Buster” with “Max” and “Sam” in the West toward the end of the twentieth century.

* * *

What of cats? Cats first became domestic, in the sense that they hunted within human settlements, somewhere in the Fertile Crescent about 10,000 years ago, but the first evidence of fully domesticated pet cats appears only about 3,500 years ago, in Egyptian artworks. The ancient Egyptians kept many kinds of exotic animals as pets, including monkeys, cheetahs, and small deer, but were nearly obsessed with domestic cats. Their more formal art often depicted cats as the companions of aristocratic women — their husbands preferring to pose with their dogs — but we also have evidence that pet cats became a feature of many, perhaps most, households in all strata of society, since they often feature in sketches done by temple artists for their own amusement. The Greek historian Herodotus reported 2,500 years ago that the Egyptians so venerated their pet cats that when one died from natural causes, the whole family shaved their eyebrows as a mark of respect. The ancient Egyptians undoubtedly also valued cats for their skills at controlling vermin — seemingly finding their ability to deter snakes especially impressive — but prized them equally as pets.

Cats subsequently spread from Egypt around the Mediterranean and, thanks to Phoenician traders, had reached England by 2,300 years ago. People valued them foremost as hunters, however, not as pets. Still, it is difficult to imagine that the off-duty cat snoozing by the fireside did not engender affection wherever and whenever it found itself.

* * *

As civilization proceeded and small-scale hunter-gatherer societies gave way to urban elites and subservient rural populations, pet keeping entered a completely new phase. In the generally egalitarian communities of the Paleolithic everyone could keep animals as companions, whereas in the highly stratified societies of the Egyptian, Greek, and Roman empires, right up until the twentieth century, the poorest had little opportunity to acquire pets for their own sake. That’s not to say that they didn’t feel affection for dogs and cats, but those animals had to earn their keep. The surviving evidence generally suggests that from the classical period (fifth and fourth centuries BCE) until the end of the nineteenth century, pets played a part in the lives of the wealthiest members of society. As the less well-off inevitably left fewer traces of their lives, we can only guess at how they interacted with their animals; no doubt they had less time and fewer resources to devote to them. Not until the nineteenth century, with the rise of the middle classes, did the keeping of pets for their own sake become widespread once more.

In the generally egalitarian communities of the Paleolithic everyone could keep animals as companions, whereas in the highly stratified societies of the Egyptian, Greek, and Roman empires, right up until the twentieth century, the poorest had little opportunity to acquire pets for their own sake.

The visual arts of the classical period reveal the elevated status of domestic pets, especially dogs. Greek tombs depict dogs and occasionally cats gazing adoringly at their masters, while children’s tombs sometimes include representations of birds. Greek art of the period includes representations of cats — for example, a kitten sitting on a child’s shoulder — that clearly indicate their occasional status as pets. Ancient Romans who bred toy dogs can only have intended them as companions (the well-known Pompeian mosaic of a chained dog, inscribed cave canem, shows that many other dogs served primarily as guards). A carving on a Roman tomb depicts a fashionable lady with what looks like a lapdog peering out from under her armpit. Dogs also appear on children’s tombs, some quietly curled up, others seeming to invite the child to join in a game. On the opposite side of the world, in China, the aristocracy kept lapdogs that bear a striking resemblance to today’s Pekinese.

The switch from hunting and gathering and nomadism to settled agriculture and animal husbandry seems to have brought about a profound change in people’s treatment of animals. Certainly the major monotheistic religions of the Fertile Crescent — Judaism, Christianity, and Islam — all emphasize dominion over animals: “God blessed [the humans] and said to them, ‘Be fruitful and increase in number; fill the earth and subdue it. Rule over the fish of the sea and the birds of the air and over every living creature that moves on the ground.’” The Christian Church generally looked askance at any display of affection toward animals in general and pets in particular: in the thirteenth century the Franciscans (founded by the animal-loving St. Francis of Assisi) were taken to task by the authorities for their fondness for dogs, cats, and small birds. The reviling of cats as potential agents of Satan apparently stemmed from the “pagan” worship of cat gods and goddesses in rural areas of Europe. While generally regarding dogs as unclean, Islam viewed cats rather more positively, with the earliest cat sanctuary reputedly founded in Cairo in 1280. Only Buddhism consistently emphasized respect for nonhuman animals, embedded in its concept of reincarnation. Whether this institutionalization of monotheistic attitudes brought about a change in attitudes toward animals or merely legitimated a new necessity for productivity in societies that now relied heavily on animals for both food and transport, the result was a great deal of what we today regard as cruelty.

During the early middle ages (fifth through tenth centuries CE) attitudes toward domestic animals were largely utilitarian, at least in western Europe. The ninth-century poem “The Scholar and His Cat, Pangur Bán,” written by an Irish monk, compares the poet’s struggle to find insight with his cat’s mousing:

’Gainst the wall he sets his eye, full and fierce and sharp and sly
’Gainst the wall of knowledge I, all my little wisdom try

The poem’s eight stanzas mention no affection the poet might have felt for the cat, only admiration for its prowess as a hunter. Tenth-century Welsh statutes valued a female cat at four pence, but only for her mouse-catching and breeding abilities — not for her readiness to purr on someone’s lap. An untrained dog went for four pence, but the price doubled after its training, implying its value lay in the tasks it could perform.

The period between the eleventh and fourteenth centuries saw little change in attitudes toward animals in general, and pet keeping remained limited to those who could afford it — and could afford to ignore the disapproval of the church. Fashionable ladies continued to keep lapdogs. Any praise of working dogs for their faithfulness may simply have reflected how easy this made them to train. Farmers often gave names to individual animals, sheep for example, but not necessarily out of any particular affection. Cats became identified with their agricultural function, not as household pets, as in this passage from Daniel of Beccles’s book of advice for aspiring noblemen, the Urbanus: “Let not a brute beast be stabled in the hall, let not a pig or a cat be seen in it; the animals which can be seen in it are the charger and the palfrey, hounds entered to hare, mastiff pups, hawks, sparrow-hawks, falcons, and merlins.” Monks especially valued cats for their fur, which, being cheaper than fox fur, did not violate their vow of poverty: fourteenth-century East Anglians fixed the price of 1,000 cat skins at just four pence.

During the sixteenth century, pet keeping continued to thrive among the well-to-do, especially women, and the word “pet,” in the sense of animal companion, first appeared in the English language. John Caius’s 1576 book Of Englishe Dogges divides the species into two kinds: peasant dogs, or “curs,” and “noble” dogs, which included dogs for hunting and retrievers for hawking (both largely the province of men), and lapdogs, which noblewomen continued to favor. Caius’s description leaves little doubt that the latter were pets in the modern sense of the word: “These puppies, the smaller they be, the more pleasure they provoke, as more meet playfellows for mincing mistresses to keep in their bosoms, to keep company withal in their chambers, to succor with sleep in bed, and nourish with meat at board, to lay in their laps and lick their lips as they ride in wagons.”

Cruelty, even to dogs, was widespread during the Renaissance. The term “hangdog” derives from the habit of killing old or injured dogs by hanging. People subjected cats to all kinds of treatment that we would condemn today. By no means the most extreme was a form of entertainment known as “Katzenmusik,” which consisted of tying strings of bells to several cats, cramming them into a sack, and then letting them out in an arena to fight, their growls and howls accompanied by the jangling of the bells. Going to bear- and bull-baiting events that used dogs was a perfectly acceptable alternative to attending the theater for a performance of William Shakespeare’s latest play.

Not until the seventeenth century did pet keeping become widespread. Before that, houses in towns, like those in the countryside, had been full of animals — pigs and poultry as well as dogs and cats — blurring the distinction between companionship and cohabitation. Yet church strictures against feeling affection for animals remained uppermost in some people’s minds: in 1590, even as she lay dying, Katherine Stubbes beat her favorite puppy, believing she and her husband had “offended God grievously in receiving many a time this bitch into our bed.”

The Christian Church generally looked askance at any display of affection toward animals in general and pets in particular: in the thirteenth century the Franciscans (founded by the animal-loving St. Francis of Assisi) were taken to task by the authorities for their fondness for dogs, cats, and small birds.

A gradual change in the perception of animals accompanied progressive urbanization: rather than seeing them as mere machines, as René Descartes and other philosophers suggested in the mid-seventeenth century, the general public widely accepted them as capable of not just receiving but returning affection. Thomas Bedingfield first proposed the benefits of harnessing this sentiment for practical ends: in The Art of Riding, a 1584 translation of an Italian manual, Claudio Corte’s Il Cavallerizzo, he told horse trainers that ensuring their charges’ love for them was more effective for training than the previous harsh methods based on “mastery.” Monkeys were popular pets, valued for their ability to “ape” human behavior. For the first time, cats became popular companions, especially for women, but small dogs were still more prevalent, frequently appearing in portraits. The practice of burying favorite dogs in special cemeteries reemerged. Whatever the species of animal, the concept of mutual affection came to be widely accepted.

Of course, the increasingly tight bond between some humans and some animals did not put an end to animal cruelty (nor has it still). The burning alive of cats enjoyed wide acceptance up to the seventeenth century, and until 1817, the Festival of the Cats, celebrated to this day in the Belgian city of Ypres, featured the throwing of a bag full of live cats from the top of the church tower (nowadays the bag contains soft toys). In the countryside, attitudes toward dogs could be far from sentimental: in 1698 a Dorset farmer recorded his satisfaction at having extracted eleven pounds of grease after killing and then simmering his elderly dog.

* * *

When pet keeping for its own sake began to expand during the eighteenth century, the choice of species was if anything greater than it is today. The majority of animal companions were not domesticated species but tamed wild specimens, an unwitting reflection of the habits of hunter-gatherers in distant and still uncharted parts of the world. Pet tortoises, monkeys, otters, and squirrels were all readily available for those who could afford them, but perhaps most popular in eighteenth-century London were caged songbirds (canaries and chaffinches were particularly affordable) and talking jackdaws, magpies, and parrots. The poet William Cowper (1731–1800) kept three pet hares that he named Puss, Tiney, and Bess (only to discover afterward that they were all male). His devotion to them extended to having a snuff box made with an engraved lid depicting the three and listing their names. They were evidently less devoted to him, since Puss in particular made regular escapes requiring forcible retrieval. Cowper later kept a series of three pet dogs: Mungo, the Marquis, and then Beau; his biography describes the latter as follows: “Whether frisking amid the flags and rushes, or pursuing the swallows when his master walked abroad, or whether licking his hand or nibbling the end of his pen when in his lap at home, Beau ofttimes, like his predecessor, the hare, beguiled Cowper’s heart of thoughts that made it ache, and forced him to a smile.”

Dogs kept exclusively as pets were still probably rare, but some owners evidently regarded their working dogs with affection. By the early eighteenth century, a farmer’s favorite hound might live indoors: “Caress’d and lov’d by every soul, he ranged the house without control.” By the end of the nineteenth century, cats had become popular pets, made fashionable in the United Kingdom by Queen Victoria. Meanwhile, across the Atlantic, Mark Twain wrote, without irony, in an essay published after his death in 1910, “When a man loves cats, I am his friend and comrade, without further introduction.”

* * *

Over the broad sweep of prehistory and then history, pet keeping went through two distinct stages. Even prehistorically, mankind had a far more complex perception of animals than simply that of the predator for his prey, the hunter for the hunted. Though always a valued source of protein, animals at some point — perhaps as sophisticated consciousness first evolved in the hominid brain — took on other significances, if the customs of surviving Paleolithic peoples are anything to go by. Humans chose some animals to share their living spaces, even integrating them intimately into the family. The widespread breast-feeding of young mammals, shocking to modern sensibilities, might superficially suggest a bond very different from that between today’s pet and its owner. However, it probably arose as a straightforward nutritional necessity — as the only way to raise baby mammals captured before weaning. The practice does, nevertheless, point to a powerful and apparently near-universal instinct among hunter-gatherers to extend their most intimate care to young animals and to expend essential resources on them.

This first stage gradually phased out as hunter-gatherer groups gave way to societies stratified into rulers and subjects. In the second stage, pet keeping became the privilege of those with money and influence. A common thread runs through both: a difference between the sexes in their fundamental attitudes toward animals. In small-scale societies, women and children did most of the caring for captured wild animals. In the Middle Ages, while aristocratic men valued hounds and hawks for their utility and the prestige they conferred, their ladies demonstrated affection for specially bred small dogs.

While frowned on for much of recorded history by the (almost entirely male) authorities, pet keeping continued in everyday life, sustained largely by women and primarily, but probably not exclusively, by the well-to-do. Societies occasionally attempted to suppress this: in medieval Germany, thousands of women stood accused of witchcraft based on their affection for their cats. Even in the late seventeenth century, those condemned during the Salem witch trials included two dogs “possessed” by the Devil.

By the eighteenth century, attitudes had begun to change, paving the way both for a much more humane approach to domestic animals in general and for the third stage of pet keeping, its universal acceptance in the West.

* * *

John Bradshaw is the foundation director of the Anthrozoology Institute at the University of Bristol, and author of the New York Times bestsellers Cat Sense and Dog Sense and coauthor of The Trainable Cat. He lives in Southampton, England.

Excerpted from The Animals Among Us: How Pets Make Us Human by John Bradshaw. Copyright © 2017. Available from Basic Books, an imprint of Hachette Book Group, Inc.

Editor: Dana Snitzky

Switch at Birth — But How?

This is an excerpt from The Atavist‘s issue no. 113, “The Lives of Others,” by writer Lindsay Jones. In remote Newfoundland, a search for answers about a series of baby mix-ups leads to a woman known as “Nurse Tiger.”

Lindsay Jones | The Atavist | March 2021 | 5 minutes (1,556 words)

The Atavist is Longreads‘ sister publication. For 10 years, it has been a digital pioneer in long-form narrative journalism, publishing one deeply reported, elegantly designed story each month. Support The Atavist by becoming a magazine member.

Rita Hynes lugged her pregnant body up the rural hospital’s wooden steps. It was the night of December 7, 1962, and her rounded belly tightened with each contraction. At just 20, Rita knew what she was in for. She had given birth two years prior, to a girl. Rita wasn’t married then, so the priest from her Catholic fishing hamlet on the southern coast of Newfoundland had snatched the infant from her arms and slapped Rita across the face. The baby would be raised by an aunt and uncle.

Rita, a slip of a woman, with blond hair and a rollicking laugh, soon became pregnant again by the baby girl’s father, a burly, blue-eyed fisherman named Ches Hynes, who was 11 years her senior. The couple married in the summer of 1961, the same day their son Stephen was born. But their happiness was short-lived: Stephen died as an infant, in his sleep.

Now Rita was pregnant for a third time. At the hospital, she felt the intensifying crests of pain—at first bearable, and then searing as the night wore on. Just after midnight, she heard the cries of her eight-pound baby pierce the air. A boy! She named him Clarence Peter Hynes, after his godfather, who was a close friend of her husband’s, and her brother, who had died in a fishing accident. Clarence was deposited in the hospital’s nursery and tucked into a bassinet, while Rita dozed in the women’s ward. This time, she surely hoped, no one and nothing would take her baby.

Clarence, whom everyone calls Clar, grew up in a fishing town, St. Bernard’s, perched on the edge of Newfoundland’s Fortune Bay. He was the first in a steady stream of infants to arrive at the Hyneses’ home, a small taupe bungalow on a hill overlooking the quay, with its fish sheds painted the bright colors of jelly beans. As a youngster, Clar watched out the kitchen window for boats steaming into the crescent-shaped harbor and then furiously pedaled his bike down to the wharf. He earned $4 an hour unloading and weighing nets teeming with squid and silver cod.

Clar slept in a top bunk in a room he shared with his brothers. They were fairer than he was—Clar had a toasty complexion and a thick head of dark hair. When they wanted to torment him, his brothers called him Freddy Fender, after the Mexican-American musician. He grew to become a local heartthrob, with a chiseled brow and lean, muscular frame. Clar was a natural athlete who excelled at hockey and cross-country. Rita, a typical hockey mom, banged on the glass during his games and leaned over the railings to yell at the referees.

At 16, when Clar left home for Ontario to work on the Canadian Pacific Railway, Rita cried for days. She knelt on a chair at the kitchen window, clutching her rosary beads and praying to God to bring her son back. She kept all the letters he sent her in her closet. When Clar did return, driving his navy blue Chevy Camaro into the village after many months away, the teenage girls of St. Bernard’s swooned. “Oh, Clar is so handsome!” his sister, Dorothy, remembered hearing again and again—her friends were always talking about her big brother.

Clar was 24 when he met a woman named Cheryl at a motel bar in Marystown, farther down the boot-shaped peninsula from where he grew up. Clar had an on-and-off girlfriend at the time, but when he saw Cheryl he was smitten. With pretty, bow-shaped lips and curly blond hair, she was the belle of the bar. She’d recently moved back to Newfoundland from the Toronto area, where she’d worked as a hairstylist. Cheryl noticed Clar looking at her. She didn’t normally date guys from rural fishing communities, or “down over the road.” They were a hard bunch. But as she and Clar talked over beers and glasses of Screech rum and 7Up, Cheryl found him attentive and kind. They danced and chatted the night away. She didn’t want it to end.

They were married two years later in Marystown’s white, steepled Anglican church. The ceremony was packed to the gills with family. Rita wore a royal blue dress with puffed sleeves, and her husband Ches a dark gray suit. They were thrilled to see Clar tie the knot.

Rita was diagnosed with late-stage ovarian cancer a few years later, at 50. Clar nursed her as a mother would a baby. He held her and rocked her in the Hyneses’ old bungalow on the hill, making sure to face a window on the ocean so she could see the waves. Rita stayed with Clar and Cheryl at their home “in town,” as everyone calls Newfoundland’s capital city, St. John’s, during the futile treatment she underwent. Clar spoon-fed his mother bowls of fish and potatoes. He spent day after day with her right up until the end, so she would never be alone.

Five years after that, lung cancer took Ches.

Clar and Cheryl built a life together in St. John’s, raising three children of their own. When the fishery that had sustained generations of islanders collapsed, Newfoundland’s economy reoriented itself around the offshore oil and gas business. By 2014, Clar had a job as a welding foreman at Bull Arm, one of the industry’s major fabrication sites, where employees were building an oil platform that would eventually be towed out to sea.

That December, 52 years to the day after Rita brought him into the world, Clar overheard a woman in the hallway just outside his office sing out to a coworker, “It’s Craig’s birthday!” The woman’s name was Tracey Avery, and she was a cleaner at Bull Arm. She was talking about her husband, who also worked at the site. How funny, Clar thought. “It’s my birthday, too,” he said with a laugh.

“Yes, b’y,” Tracey replied. (B’y is pronounced “bye”—the Newfoundland expression is one of surprise, like “oh really?”) “How old are you?”

When Clar told her his age, Tracey’s next words came tumbling out: “Where were you born?”

“Come By Chance Cottage Hospital,” Clar said.

Tracey stood stock still for a second, her mouth agape. Then she ran, leaving her mop and cart behind. Clar shivered.

In that moment, a secret began to worm its way into the light: Another child had been taken from Rita Hynes—and she wasn’t alone.

On ‘the rock,’ as Newfoundland is affectionately known, your bay and your bloodline still define who you are—they are the first things people ask about when they meet you.

Depending on how you look at it, the stirring of this long-buried truth was sheer coincidence—one of those wild things that just happens—or it was inevitable, born of the quiddity of place. Newfoundland, the island portion of the sprawling Canadian province known as Newfoundland and Labrador, is a massive triangular rock in the Atlantic Ocean, colonized centuries ago for its fishing grounds. It has a rugged coastline, with hundreds of communities nestled into crooks, crannies, and coves. Some towns have blush-inducing names such as Heart’s Desire, Leading Tickles, and Dildo, and each is its own remote kingdom, fortified by rolling bluffs. Extended families are vast and tightly bound. For a long time they had to be. In such an austere place, it was a matter of survival. Today on “the rock,” as Newfoundland is affectionately known, your bay and your bloodline still define who you are—they are the first things people ask about when they meet you.

Getting anywhere along Newfoundland’s 6,000 miles of mountainous coast has always been a challenge. In the early 20th century, people in many of the island’s approximately 1,300 outports—the local term for fishing towns—had limited access to health care. Cottage hospitals, strategically located to serve dozens of outports at once, were intended to eliminate unnecessary death and suffering. They were a place to have your appendix out, get stitched up after an accident, or give birth and recover under the care of qualified doctors and nurses. They heralded a new dawn for Newfoundland. According to Edward Lake, a nurse and health administrator who worked in cottage hospitals and later wrote the definitive account of their history, they were the start of the most advanced rural health care program North America had ever seen, forerunners to Canada’s publicly funded national system.

The first seven cottage hospitals opened in 1936. One was located in the village of Come By Chance, which had been given its curious name by English colonists. As the story goes, in 1612, white explorers came ashore in one bay, only to discover a well-worn path to another bay on another coastline. The path had been cut by the indigenous Beothuk people. (The Beothuk were wiped out in the 19th century by the encroachment of white settlers.) The route led to the mouth of a river flush with salmon. It was a fortuitous find, which perhaps explains why the colonists later christened the settlement they built there Come By Chance. More than three centuries on, the village would prove a prime spot for a cottage hospital, with more than 50 outports close by.

The cottage hospitals were cookie-cutter clapboard buildings designed to be inviting. From the outside they looked like quaint residences. Strangely, in Come By Chance, the hospital was built the wrong way round, with its back to the road. For those inclined to superstition, the error might seem like an omen—a foretelling of bigger mix-ups to come.

 

Read the full story at The Atavist

Sentenced to Life At 16

This is an excerpt from The Atavist‘s issue no. 114, “The Invisible Kid,” by writer Maddy Crowell. The year Adolfo Davis was arrested, he became one of 2,500 adolescents serving mandatory life sentences across the United States.

Maddy Crowell | The Atavist | April 2021 | 5 minutes (1,507 words)

The Atavist is Longreads‘ sister publication. For 10 years, it has been a digital pioneer in long-form narrative journalism, publishing one deeply reported, elegantly designed story each month. Support The Atavist by becoming a magazine member.

Sometime after he had given up hope and then recovered it, Adolfo Davis began writing letters from his prison cell. Around 1999, he bought paper and pens from the commissary and wrote one letter after another, three times a week. He wrote on his bed, a squeaky metal frame with a lumpy loaf of a mattress, under the ugly glare of a fluorescent light bulb. There was nothing much to look at in his cell, just gray walls and a burnt-orange door made of steel, with tiny holes drilled through it. Muffled sounds from the hallway helped him figure out what time of day it was, when it was mealtime, which guards were working.

“My name is Adolfo Davis, and I’m trying to get home and regain my freedom,” he would write. “I didn’t shoot nobody. Please, help me get a second chance at life.” He sent a letter to nearly every law firm in Chicago, and after that, to every firm he could find in the state of Illinois. Most of the time, the letters went unanswered. Occasionally, he received a curt apology: “Sorry, we are at capacity.” Or simply: “We can’t, but good luck.”

Adolfo was in his early twenties when he started writing the letters. He had a boyish smile, a light mustache, and a disarming charisma that could fold into stillness when he felt like being alone. In 1993, at the age of 16, he’d been convicted as an accomplice to a double murder that took place when he was 14. He claimed that he was there when the killings happened, but that he didn’t pull the trigger. For that he was serving a mandatory life sentence, without the possibility of parole.

Prisons in Illinois were teeming with cases like his—Black men who’d been locked up as teenagers. Few would ever be freed. Over the years, Adolfo watched friends become optimistic and then have their hopes dashed by the courts, by politicians, by their own lawyers. He once saw someone make it to the front door of the prison after a ruling was issued in his favor, only to be sent back to his cell when a state’s attorney made a last-minute phone call to a judge.

Sometimes Adolfo felt like he was trapped at the bottom of an hourglass, the sand piling up around him: Every falling grain meant another day of his life lost. Except that he wasn’t sure exactly what he was missing. He’d been free in the world for only 14 years—about as long as it takes some woolly bear caterpillars to become moths. What he remembered best was the small slice of Chicago’s South Side where he grew up. He remembered selling drugs on street corners, and coming home to find no food in the house. He remembered being evicted 11 times in 12 years, and sleeping in apartments crammed with other kids, aunties and uncles, friends. He remembered doing wheelies on his bike, showing off to the other kids in his neighborhood. He remembered getting up early on Sundays to get a Super Transfer—a bus ticket good for an entire day—and riding downtown, where skyscrapers towered above him. He and his friends would spend the day shining shoes or breakdancing for money.

The letters continued into Adolfo’s thirties. At some point, he began to wonder if he’d be writing them for the rest of his life. He would if he had to, because despite the terms of his sentence, the only thing that sustained him was the thought that he might eventually be released. So he kept writing; the months bled together, and the years did, too.

One day in 2009, Adolfo got a letter from the officials at Illinois’s Stateville prison, where he was incarcerated, notifying him that a lawyer would visit him the next day. Her name was Patricia Soung, and she was from the Children and Family Justice Center, a legal clinic run by Northwestern University, in Evanston, just outside Chicago. Adolfo had no idea what her visit was about, but he felt a sudden buoyancy.

When he met Soung, he could tell right away that she was, as he later put it, “an alpha”—professional and direct. Yet she seemed to care about him as a person, too. She and her team were working on juvenile-justice cases in Illinois, she explained, and they’d come across his. She wanted to take it on pro bono. Was he interested?

In more than a decade of writing letters, Adolfo had never sent one to Soung or the Children and Family Justice Center. This offer of possible salvation came entirely out of the blue.

***

At the time when Adolfo met Soung, the United States was the only country in the world that sentenced children convicted of certain crimes to life in prison. In Illinois, as in many other states, adolescents as young as 14 could be transferred to an adult court, allowing prosecutors to circumvent a juvenile-court system that was considered more rehabilitative than punitive. If a child was convicted of a double murder in adult court, the mandatory sentence was life imprisonment without the possibility of parole—judges were barred from taking into account the circumstances surrounding the crime to lower the sentence. The year Adolfo was arrested, 2,500 other adolescents across the country were serving mandatory life sentences.

In more than a decade of writing letters, Adolfo had never sent one to Soung or the Children and Family Justice Center. This offer of possible salvation came entirely out of the blue.

Individuals convicted of certain crimes before they were 18 could also be sentenced to death, until a 2005 Supreme Court decision, Roper v. Simmons, abolished that option on the grounds that it violated the Eighth Amendment’s prohibition against cruel and unusual punishment. The decision was based in part on the idea that adolescents had an “underdeveloped sense of responsibility” and were “more vulnerable or susceptible to negative influences and outside pressures, including peer pressure.”

A coalition of activists and lawyers decided to use Roper to try to bring an end to mandatory life sentences for minors. The group was led in large part by Bryan Stevenson, an Alabama lawyer who saw an opportunity in the ruling: If the Supreme Court agreed that adolescents’ brains were fundamentally different from adults’, he reasoned, then why should a child ever be sentenced as an adult? Stevenson began searching the country for test cases—people serving life sentences who’d been locked up as kids. He had nearly 2,000 to choose from.

Stevenson zeroed in on 35 cases, spread over 20 states. They mostly involved the youngest adolescents condemned to die in prison. Stevenson filed an appeal in each of the cases, and two of them eventually reached the Supreme Court. In the first, Miller v. Alabama, a man named Evan Miller was 14 when he beat his neighbor and then set fire to his trailer, killing him, after a night of drinking and drug use. In the second, Jackson v. Hobbs, Kuntrell Jackson, also 14, robbed an Arkansas video store with two older teenagers, one of whom killed the store’s clerk.

In 2012, the Supreme Court delivered a monumental five to four decision in favor of Miller. It ruled that it was unlawful to hand a child a mandatory life sentence that failed to take “into account the family and home environment … no matter how brutal or dysfunctional.” As Justice Ruth Bader Ginsburg put it during oral arguments, “You’re dealing with a 14-year-old being sentenced to life in prison, so he will die in prison without any hope. I mean, essentially, you’re making a 14-year-old a throwaway person.”

The ruling was groundbreaking in that it compelled judges to consider a child’s background in determining sentencing. But it also left open the question of whether the decision could apply to older cases, ones that had already been litigated. Soung’s team at Northwestern wanted to use Adolfo’s case to set a precedent, cementing that the Miller ruling could be applied retroactively. In 2014, they brought his case before the Illinois Supreme Court, and to Adolfo’s amazement the judges ruled in his favor: Based on Miller, he could appeal his life sentence. The decision didn’t set him free, but it cleared a path for that to happen.

Suddenly, Adolfo’s story garnered national attention. He found himself on the front page of The New York Times—a photo of him in an oversize brown prison uniform appeared above a story about his case. “A Murderer at 14, Then a Lifer, Now a Man Pondering a Future,” the headline read. Journalists from the Chicago Sun-Times, the Chicago Tribune, and WBEZ contacted him, asking him to share his story. “‘I’m just praying for a second chance,’” one headline declared, quoting Adolfo.

By then he was 38. He’d spent nearly a quarter-century—most of his life—behind bars. With every letter he sent and every prayer he whispered, he’d been waiting for this moment. The possibility of release softened the harsh edges of prison, made them tolerable. At the same time, he was wary of what might happen when his case went back to court. The system had always been against him. Why should anything change now?

 

Read the full story at The Atavist

Rules For Departure

Even the Dogs

The Black Cube Chronicles

While investigating allegations of sexual assault against Harvey Weinstein, Ronan Farrow was surveilled by an Israeli private intelligence agency called Black Cube. Agents from Black Cube tried to get close to Farrow and other journalists looking into Weinstein — as well as several women who were planning to come forward with their stories — in an attempt to suppress the allegations. An excerpt from Farrow’s book, Catch and Kill: Lies, Spies, and a Conspiracy to Protect Predators.

The post The Black Cube Chronicles appeared first on Longreads.

Don’t F**K With the Pet Detectives

Phil Hoad | The Atavist | February 2021 | 5 minutes (1,558 words)

This is an excerpt from The Atavist‘s issue no. 112, “Cat and Mouse,” by writer Phil Hoad. With dozens of felines turning up dead around London, a pair of pet detectives set out to prove it was the work of a serial killer.

It was the body on the south London doorstep that got everyone’s attention. On the bright morning of September 23, 2015, a woman walked outside her home to find a cream-and-coffee-colored pelt, like a small furry Pierrot. It had dark forelegs, and its face was a smoky blot. It was a cat, slit throat to belly; its intestines were gone.

The Atavist is Longreads‘ sister publication. For 10 years, it has been a digital pioneer in long-form narrative journalism, publishing one deeply reported, elegantly designed story each month. Support The Atavist by becoming a magazine member.

The woman rang the authorities, who came and disposed of the body. Three days later, she looked at a leaflet that had come through her mail slot, asking whether anyone had seen Ukiyo, a four-year-old ragdoll mix whose coat matched that of the dead cat. The woman broke the bad news to Ukiyo’s owner, Penny Beeson, who lived just down Dalmally Road, a nearly unbroken strip of poky, pebble-dashed row houses in the Addiscombe area of Croydon.

Beeson was inconsolable. “I shook for the whole day,” she later told The Independent.

“R.I.P ukiyo I feel devastated,” her son, Richard, posted on Facebook. “Hacked to death and left on someone’s doorstep. Some people are so sick!”

A few days later, Addiscombe’s letter boxes clacked again as another leaflet was delivered. This one warned that Ukiyo’s demise wasn’t an isolated incident—there had been a troubling spate of cat deaths in the area. The leaflet was printed by a local group called South Norwood Animal Rescue and Liberty, or SNARL.

Tony Jenkins, one of SNARL’s founders, had recently become his own master. At 51, with a reassuring, yeomanly face and a golden tinge at the very tip of his long, gray ponytail, Jenkins was laid off after 25 years working for a nearby government council. He hadn’t gotten along with his boss, so getting sacked came as something of a relief. With a year’s severance in his pocket, “I was enjoying my downtime,” Jenkins said. That included being with his girlfriend, a 44-year-old South African who went by the name Boudicca Rising, after the first-century Celtic warrior queen who fought the Romans to save the Britons. Among other things, Rising and Jenkins shared feelings of guardianship toward animals. Their homes at one point housed 34 cats, a dog, two gerbils, and a cockatoo between them. The couple had formed SNARL together.

Scanning Facebook one day in September 2015, about a week before Ukiyo was found dead, Jenkins stumbled upon a post from the nearby branch of the United Kingdom’s largest veterinarian chain, Vets4Pets, that described four gruesome local incidents in the past few weeks: a cat with its throat cut, one with a severed tail, another decapitated, and a fourth with a slashed stomach. Only the final cat had survived. Jenkins told Rising about the post. “That doesn’t sound right,” she said. “We need to do some digging.”

Digging was her forte. Always impeccably dressed, with an ornate gothic kick, and unfailingly in heels, Rising was a multitasking demon on a laptop. By day she worked for an office management company. By night she was part of the global alliance of animal rights activists. She was one of many people who used small details in online videos of a man torturing felines to identify the culprit, a Canadian man named Luka Magnotta. He was reported to police, who didn’t take the allegations seriously, and Magnotta went on to murder and chop up his lover in 2012—a crime recounted in the Netflix documentary Don’t F**k with Cats.

On the heels of Ukiyo’s death, Rising and Jenkins distributed SNARL’s leaflets throughout Addiscombe, warning of the threat to local felines. While to an uninterested eye some of the attacks might have appeared to be the indiscriminate cruelty of nature—the work of a hungry predator, say—SNARL believed they might be a series of linked and deliberate killings. Whether the crimes were perpetrated by an individual or a group SNARL wasn’t sure. It hoped the leaflets would help turn up more information.

SNARL soon had reports of more incidents in the area, for a total of seven: one cat missing, two with what SNARL subsequently described as “serious injuries,” and four dead. Rising said that vets who saw the deceased cats’ bodies told her the mutilations had been made with a knife. On September 29, SNARL sent out an alert on its Facebook page saying as much. The cats’ wounds, the group insisted, “could only have been inflicted by a human. Their bodies have been displayed in such a way as to cause maximum distress.”

That was SNARL’s official line. On Rising’s personal page she went further, emphasizing her belief that Addiscombe was dealing with a serial killer. “This is a psychopath,” she wrote.

While to an uninterested eye some of the attacks might have appeared to be the indiscriminate cruelty of nature, SNARL believed they might be a series of linked and deliberate killings.

On the afternoon of October 24, 2015, two miles southeast of Addiscombe, 47-year-old Wayne Bryant picked his way over the fallen leaves of Threehalfpenny Wood, named for a 19th-century murder victim found there with that sum of money in his pocket. The dry autumn air kept Bryant alert as his wide-spaced blue eyes scanned left and right and he listened to the wind hissing through the oak canopy. Bryant’s cat, Amber, like many domestic felines, kept regular hours with her comings and goings, but the previous day she hadn’t returned in the mid-afternoon as she usually did. When Amber didn’t show up the following morning, Bryant and his wife, Wendy, formed a search party.

A few years before, Bryant had suffered a serious spinal injury at work, causing a leak of cerebrospinal fluid and, eventually, several hematomas. Animals had always been a big part of his life—he and Wendy had a menagerie of rescue pets, from dogs to guinea pigs to lizards—but as he struggled with memory problems and long-term unemployment, the emotional support they provided became irreplaceable. Bryant had had Amber for eight years, since she was a six-week-old kitten. “A friendly little thing,” he told the website AnimalLogic. “A little curtain-climber.”

As they searched the woods, Bryant’s wife called to him. In a small clearing off a path, sheltered by a cluster of exposed tree roots, the ball of black and orange fur was unmistakable. But Amber was headless and tailless, except for that appendage’s very tip, which had been placed on her belly. The couple were sickened. They shrouded their beloved pet in a towel and took her home. Then Bryant remembered an article in the Croydon Advertiser about a group convinced that several recent cat killings were all connected.

A couple of hours later, Jenkins and Rising were at Bryant’s door. “I remember Wayne’s first words to me: ‘Ain’t no fox did that,’” Jenkins told me. “If I ever write a book about this, that’s what I’d call it.”

It was the first time either Jenkins or Rising had come face-to-face with a suspected cat killing. Neither of them had any forensics training. Unwrapping the towel that held Amber, they noted the clean severing of her head and tail, which seemed to corroborate Bryant’s view that no animal could be responsible. They asked the family to show them the crime scene. There was no blood on the ground, meaning that either her injuries were inflicted after death or Amber was killed elsewhere and moved to the spot in Threehalfpenny Wood where her owners found her. Rising and Jenkins took Amber’s body to a vet for further examination.

Bryant gave a statement to the police, and Rising went to the Royal Society for the Prevention of Cruelty to Animals (RSPCA), the UK’s main animal welfare charity. She later claimed that a representative brushed her off, saying that a fox probably killed Amber. Besides, the RSPCA dealt primarily with instances of cruelty in which the victims were still alive: It received more than 11,000 complaints a year in Greater London alone.

Jenkins was incredulous when he heard about the RSPCA’s response. “Although Croydon’s got a bad reputation, a lot of crime, I don’t think our foxes carry knives. And foxes certainly do not kill cats,” he said. At least, “it’s very, very rare.” He doubted that scavenging creatures would be interested in removing and eating feline heads and tails. Rather, they’d go for the nutritious internal organs, and SNARL hadn’t seen that kind of damage in any killing other than Ukiyo’s.

In October, there was another suspected cat killing in Croydon. Then SNARL began to get reports from farther afield, one in neighboring Mitcham and two in nearby West Norwood. Nick Jerome’s cat, Oscar, was found headless on his street. “None of us went to pieces over it, but it was obviously distressing at the time,” he said. In Coulsdon, on the southern edge of Croydon, David Emmerson discovered his cat, Missy, decapitated and tailless. His 18-year-old daughter, already struggling with the loss of her aunt the previous year, was devastated. Emmerson never told his autistic son the full story of what happened. The truth was too ugly. “I never grew up as a cat person,” he said, “but maybe because we got her as a kitten, she became one of us. Mine was the lap she chose to sit on when she sat down. I’m not sure why. I adored her.”

The RSPCA had its party line and wasn’t getting involved, but that didn’t stop the local press, which knew a good story when it heard one. By mid-November, reporters had made a lurid christening: The Croydon Cat Killer was on the prowl.

The post Don’t F**K With the Pet Detectives appeared first on Longreads.


