

How we construct society significantly determines the ways different groups live—and die. Unfortunately, despite some rhetoric to the contrary, Trump's health secretary seems content to let corporations continue to sicken us.
The Senate Finance Committee hearing with Robert F. Kennedy Jr. was explosive. The Secretary of Health and Human Services was accused of “reckless disregard for science and the truth,” and senators from both parties were openly hostile as they questioned him extensively on his vaccine policies, as well as the firing of scientific advisory board members and agency heads and their replacement with ideologically driven anti-vaccine supporters. During that more than three-hour session, he was called a charlatan and a liar, and he returned the insults.
The distrust of his honesty and integrity was palpable. The public health community already mistrusted his views on vaccines and the role of science. There was, however, some modest hope that he would at least follow through on his views on the environmental causes of chronic disease and the food industry’s disastrous impact on obesity and diabetes, as well as other diseases. Sadly, that’s been anything but the case, and there’s quite a history behind that reality.
In focusing on the environmental causes of disease, Kennedy was building on a public health tradition that saw disease, suffering, and death as, at least in part, a function of the worlds we’ve constructed for ourselves and others over time. Historically, some instances of unnecessary suffering are glaringly obvious. Take, for instance, the exploitation and often premature death of Africans enslaved and transported to the New World under conditions so inhumane that approximately 10% to 20% of them perished during what came to be known as the Middle Passage. And don’t forget the suffering and early deaths of so many who survived and were consigned by whites to forced labor in the American South, where the average life expectancy of a newborn slave child was less than 22 years, or about half that of a white infant of the same era.
Or, to take another example, in her famous 1906-1907 study Work-Accidents and the Law, Crystal Eastman, the feminist cofounder of the American Civil Liberties Union and a social reformer, wrote of 526 men who were killed in accidents in the steel mills of Pittsburgh and another 509 who suffered serious injuries in—yes!—a single year, arguing that many of those accidents would have been preventable had work conditions been different. As she grimly reported:
Seven men lost a leg, sixteen men were hopelessly crippled in one or both legs, one lost a foot, two lost half a foot, five lost an arm, three lost a hand, ten lost two or more fingers, two were left with crippled left arms, three with crippled right arms, and two with two useless arms. Eleven lost an eye, and three others had the sight of both eyes damaged. Two men have crippled backs, two received internal injuries, one is partially paralyzed, one feebleminded, and two are stricken with the weakness of old age while still in their prime.
Some aspects of the inevitable—fatal disease or other devastating genetic and biological conditions—are clearly affected by how societies care for their members. Historically, race, social class, geographic location, gender, age, and immigrant status have all been shown to have a tremendous impact on access to medical care and the quality of that care. The social and economic arrangements Americans created have shaped patterns of disease prevalence, distribution, and recovery over the course of our history.
Most obviously, a system dependent on slavery produced untold suffering and death among those most exploited; a commercial economy involving trade between various regions of the country and the world often lent a significant hand to the transmission of diseases from mosquitoes, rats, and other sources of infection. The development of cities with large immigrant populations gave landlords the opportunity to profit from renting airless tenements without adequate sewerage or pure water, producing epidemics of tuberculosis and cholera, among other diseases of poverty. Similarly, the disfiguring accidents and diseases caused by toxic chemicals were often a reflection of the rampant expansion of a laissez-faire industrial system that put profits above human life. And the Trump administration’s decision to promote the use of coal and ignore the impact of a fossil-fuel-based economy on the climate and on health is perhaps the most glaring example today of the urge to maintain a world that is (all too literally) killing us.
Smallpox in the 18th century, along with typhoid, typhus, yellow fever, and cholera epidemics, and a plague of childhood diseases in the 19th century, were all exacerbated by the squalid conditions in which people lived. The industrial revolution created conditions for the development of epidemics of silicosis, lead poisoning, and asbestosis. In more recent decades, agricultural workers in the vineyards of California and elsewhere were regularly showered with pesticides while harvesting the food that agricultural companies packaged and sold to the nation. In that process, millions of people have suffered diseases and deaths that could have been avoided.
Recently, our collective environmental practices have contributed disproportionately to global warming and so to extreme droughts, ever more severe hurricanes, and rising sea levels that threaten to flood entire nations, and we’re sure you won’t be surprised to learn that such events can, in turn, result in compromised resistance to disease. Endocrine disruptors like bisphenol A, PCBs, and dioxins manufactured in the 20th century turned out to cause a variety of cancers, birth defects, and other developmental disorders. Meanwhile, hundreds of chemicals manufactured in recent decades have undoubtedly led to increased deaths, diseases, and neurological damage globally. And, of course, count on one thing: Issues like these won’t be seriously addressed by Robert Kennedy Jr., despite his occasional claims that he will.
The Covid-19 pandemic provided us with an example of how unequal the effects of disease regularly are. Over the course of the pandemic’s first few years, Covid-19 killed more than 1 out of every 300 Americans. However, the burden of those deaths was distributed anything but evenly through the population. Those in a weakened state and without access to decent healthcare were the most likely to become ill and die. Although “the greatest number of deaths [were] among non-Hispanic white people… the rate of Covid-19 cases, hospitalizations, and deaths [was] higher among people of color.”
According to data from the Centers for Disease Control and Prevention, compared to whites, “American Indians and Alaskan Natives were 3.1 times more likely to be hospitalized, Black or African Americans are 2.5 times more likely to be hospitalized and 1.7 times more likely to die, and Hispanic or Latino persons are 1.5 times more likely to get Covid-19 and 2.3 times more likely to be hospitalized.” In stark graphs, the Poor People’s Campaign documented that “people living in poorer counties died at nearly two times the rate of people who lived in richer counties.” During the Omicron surge, from December 2021 through February 2022, counties with the lowest median income “had a death rate nearly three times higher… compared to those with the highest median incomes,” a difference that can’t simply be explained by disparities in vaccination rates.
And where will our latest secretary of Health and Human Services be if something like that happens on his watch? While he may call on companies to voluntarily remove food colorings, we should expect that, in a crisis, he’ll ultimately tell Americans to change their behavior and not eat cereals with food colorings.
But don’t even count on that since such products are deemed necessary to maintain the profits of a food manufacturing and distribution system largely controlled by a few giant agricultural businesses. Real reform of such a system would undoubtedly benefit the health of Americans. However, in the absence of a strong social movement, the entrenched interests that have promoted such industrial food production will undoubtedly prove to be virtually immune to serious restructuring or change. Indeed, as nutritionist and public health advocate Marion Nestle has written, there is now little resistance to the continuing unchecked growth of the agricultural sector and few challenges to the rights of Campbell’s, McDonald’s, Monsanto, Perdue, Smithfield Foods, and others to conduct their businesses in ways that may indeed threaten the health of tens of millions of Americans.
Of course, there is also real truth to the story of progress toward better health. The average life span of a white boy born in 1900 in a large American city was only 46.3 years, and of a Black boy, only 33 years. By the second decade of the 21st century, however, the average life expectancy for Americans was close to 78 years, although the gap between Black and white Americans remains. Similarly, this country has reduced the number of deaths that used to plague both children and women giving birth, while largely controlling cholera and other water-borne diseases through the introduction of relatively safe water supply and sewerage systems. The last 150 years, writes demographer Richard Easterlin, have seen the “average life span” more than double globally, from between 20 and 40 years at the turn of the last century to between 60 and 80 years today. And yet Secretary of Health Kennedy seems ready to jettison perhaps the single most important technology responsible for rising life spans: vaccines! Rather than mandating that children receive vaccines before entering school, Kennedy has said such decisions should be left to the states and to parents. Despite efforts to backtrack on his long anti-vax history, in interviews on CNN and elsewhere he has insisted that “there are no vaccines that are safe and effective.”
While national and international mortality statistics tell an important story, they often hide wide variations in the health and well-being of those who make up such figures. A closer look at the life spans of industrial workers, women, Native Americans, Blacks, Hispanics, and whites reveals vast differences in disease experience. The persistence of disparities in health and longevity among them may, in truth, be the most enduring health reality of American society. Although new discoveries in medical science, impressive technological interventions, and modest policy initiatives have improved American health, narrowing the gaps described above, those disparities have persisted for more than four centuries.
Who you are, where you live, what you do, and what you earn have always been the key factors determining your lifespan and your health, rather than the technological changes in medical treatment that have become available. The narrative of ever more improvement that’s been the bread and butter of so much of public health’s self-congratulatory history needs to be modified to acknowledge the millions of years of life lost through the (too) early deaths of Blacks, Native Americans, and poor and working-class whites since the colonial era.
In the 19th century, the incidence of classic infectious and communicable diseases, including cholera, smallpox, tuberculosis, and typhoid, was at least in part the product of specific decisions, including the way landlords profited by jamming people into tenements and leaving them with outdoor plumbing and a polluted water supply. In short, suffering wasn’t just the inevitable byproduct of urbanization and industrialization, but of a dominant ideology that reinforced a laissez-faire economic system with profit (for the few) as its main goal.
Why, you might wonder, did so few question the logic of crowding so many together when there was nearly unlimited space in which to live in a still sparsely populated nation? Who determined that some people’s health could be sacrificed for the wealth of others, even though there were often no objective reasons why conditions could not have been better?
In effect, leaders then made social and political decisions about who should live and who should die, as they will again in the Trump era. Unfortunately, it’s all too rare to think of diseases not as an inevitable byproduct of a particular exposure or an inevitable outgrowth of modernization or industrialization, but as the byproduct of decisions made by individuals, groups, and societies. In different eras, different conditions have been created that diseased, maimed, or killed people all too unequally.
Isn’t it time, in the era of Donald Trump and Robert Kennedy Jr., when, for instance, the administration’s devastation of the US Agency for International Development might, according to the medical journal The Lancet, lead to 14 million more deaths globally, to broaden the definition of what causes disease and death in the United States (and elsewhere)? Isn’t it time not just to focus on viruses and events in nature, but on the structure of an American society in which the rich are growing ever richer and income inequality is on the rise, a world in which corporations, government, and institutions make decisions that profoundly affect people’s health? Consciously or not, the decisions the dominant groups in a society make determine who lives and who dies, who flourishes and who suffers.
Some disease-related tragedies are unavoidable, but all too many are not. There was no need for children to die in such large numbers from infections in the crowded slums of the 19th century, nor for workers to suffer so extensively from chronic diseases and disabilities in the factories of the early 20th century. Nor is it necessary in the modern era to pollute the environment with synthetic plastics that lead to epidemics of cancer, heart disease, or stroke. Worse yet, it’s anything but necessary, as Donald Trump is determined to do, to continue to pollute the global environment through the endless overuse of fossil fuels, ensuring that this world will someday become so warm that it may no longer support human life across significant swaths of the globe. How we construct society, in other words, significantly determines the ways different groups live—and die.
An understanding of how Americans have built their past should give us the power to shape the future. Companies do not have to continue to introduce synthetic hormones, pesticides, or other materials into the milk American children drink, the wheat in the cereals millions of Americans eat, or the meat that is a staple of our diet. Even simple regulatory changes could have a positive impact on how we, our children, and our grandchildren will live and die. Many positive changes, though never achieved without a struggle, aren’t particularly revolutionary or even massively disruptive of existing social relationships. Europeans, for example, have decided to require chemical companies simply to test their products for safety before introducing them into the stream of commerce.
We as a people should not have to watch helplessly as the Earth’s ecosystem is devastated through habitat destruction, resource depletion, and global warming. We should be able to learn from the horrible global accidents of the recent past. Chernobyl in Ukraine and Fukushima in Japan are perhaps the most well-known “dead zones” our species has produced through inattention to the risks we humans create—in those cases, of course, with nuclear power. But we can learn from other, less well-known communities where human decisions have resulted in untold health consequences. Take, for instance, the way now-banned polychlorinated biphenyls, or PCBs, polluted the community around the Anniston, Alabama, factory where they were first produced in the 1930s, or how the town of Times Beach, Missouri, had to be literally abandoned after dioxin-contaminated waste oil was spread on its roads. A host of polluted landfills across this country and around the world are now Superfund sites in need of massive investment to detoxify.
Simply put, the message we can learn from the past is that we need not continue to build worlds that kill us but can, collectively, make more life-affirming decisions. In the age of Donald Trump, who is now seeking to end pregnant women’s use of Tylenol, and Robert Kennedy Jr., we have entered a world of medical quackery. As Senator Maria Cantwell (D-Wash.) exclaimed, “Sir, you’re a charlatan. That’s what you are.”
RFK Jr. sold out on pesticides, but we can course correct if as a society we reprioritize health and start making decisions that benefit people over corporate greed.
When Health and Human Services Secretary Robert F. Kennedy, Jr. started talking about pesticides, a lot of people got their hopes up that someone might finally fix the broken food system. But instead he bowed to corporate oligarchy when he listened to Big Ag rather than recommending that we stop exposing ourselves to toxic pesticides. This toxic food system wasn’t always our reality, and it doesn’t have to be our future.
In the United States, it is the Environmental Protection Agency’s (EPA) job to regulate pesticides. Pesticide manufacturers apply to register active ingredients by submitting research (often industry funded) claiming they are safe and effective when used as directed. The EPA bases its registration decisions on a risk assessment and other supporting documents, and a public comment period follows. However, the EPA relies on industry-funded research for those decisions, even though, time and again, we have seen the pesticide industry hide evidence that its products cause harm.
Take the herbicide paraquat, for instance. Paraquat is highly toxic; one teaspoon is enough to kill an adult, and there is no antidote for paraquat poisoning. This herbicide is commonly used in the United States as weeds become increasingly resistant to glyphosate (the active ingredient in Bayer’s industrial formulation of Roundup™). Paraquat is banned for use in 72 countries. Exposure to it has been increasingly associated with Parkinson’s disease and other chronic conditions like cancer, yet Big Ag has successfully pushed back against calls to ban this pesticide in the US for decades.
But this issue is bigger than one chemical; there are hundreds of pesticides in use in this country, and all of them have the potential to cause harm. Be it weeds, bugs, rodents, or fungi, the purpose of these chemicals is to kill what they come in contact with. Our consolidated food system encourages farmers to prioritize quantity over crop diversity—meaning that the largest farms in this country are monoculture operations (farms growing one crop on massive swaths of land). One problem with monoculture is significant pest pressure, which demands high inputs: either a huge amount of labor to pull weeds and hand-pick pests, or ever-larger quantities of synthetic pesticides. Year over year, as farms use more and more pesticides, weeds and pests develop resistance, requiring more frequent applications or stronger, more toxic formulations. It is a vicious cycle that traps farmers on a “pesticide treadmill.”
This monoculture, ultra-processed food system that relies heavily on toxic chemicals is also making us sick, with microplastics being found in our brains (plastic usage in agriculture is also a growing concern and a major contributor to microplastics in soil); PFAS contaminating our water (many pesticide formulations contain or are themselves PFAS); and children being exposed to pesticides in their backyards, at parks and schools, and in utero. At the same time, farmers are being squeezed by a system that makes it harder for small and medium-sized farms to make a living, with no protections in place except for the corporate players.
It wasn’t one thing that set us on the path to this reality, where our food, water, soil, air, and bodies are contaminated with fossil-fuel-derived agrichemicals and microplastics; a series of decisions and policies, made over the course of only a few decades, cornered us here. The good news is that we can course correct if, as a society, we reprioritize health and start making decisions that benefit people over corporate greed.
A food system built on agroecology is one that doesn’t rely on agrichemicals to function and is therefore not captured by corporations. An agroecological food system in America looks like thriving, decentralized community food systems in which the people growing and consuming food control what goes into and comes out of their food system; grow food without reliance on agrichemical inputs or patented seeds; work with the environment rather than against it; and prioritize health, safety, and collective well-being.
Agroecology is an economically and ecologically viable alternative to our current food system’s foundation of extraction. It is actively practiced around the world, and it existed in what we now call the United States of America long before pesticides were introduced. Our job today is to shift our extractive mindsets to ones that prioritize health, in line with Indigenous wisdom.
My situation is emblematic of a broader problem faced by Autistic people: There is so much public misunderstanding of our condition and, in spite of some progress, nowhere near enough ways for us to advocate for ourselves.
Recently there has been highly welcome indignation and pushback against the quackish treatments and stigmatizing attitude that President Donald Trump directed at Autistic people during his infamous September 22 press conference. Some of the most forceful criticisms have come from Autistic individuals and Autistic-led organizations. It has also been satisfying to see a major political figure like Illinois’s Democratic Gov. JB Pritzker offer enlightened rhetoric on the subject. In a May executive order designed to protect the privacy of Illinois’s Autistic residents from Health and Human Services Secretary Robert F. Kennedy Jr.’s proposal to create a nationwide registry of Autistic persons, Pritzker stressed that “autism is a neurological difference–not a disease or an epidemic.”
In recent years, activists and writers like Eric Garcia Jr., Temple Grandin, and the late Steve Silberman have pushed back against the stigmas attached to Autism by Trump and RFK Jr.: the notion that Autistic people represent a diseased, anti-social segment of the population in need of a “cure” for their condition. Silberman’s best-selling 2015 book NeuroTribes was a particularly notable contribution to the public discourse, describing Autism not as a mental illness but as a normal and healthy variation of human neurological development. Writers like Silberman have stressed that Autistic people have the potential to use their unique intellectual and emotional gifts to make valuable contributions to the broader society—if that society is willing and able to offer accommodations that allow Autistic people to thrive.
Unfortunately, while the enlightened approach toward Autism outlined above has made some headway in shaping public understanding, that progress remains limited. The limitation is illustrated perfectly by the Trump administration’s focus on finding a “cure” and other aspects of its harmful, reactionary approach to Autism. The Trump administration’s approach to Autism is part and parcel of its punitive and uncaring approach to underprivileged Americans in general, as demonstrated by its draconian gutting of an already devastated American welfare state.
Some of the most serious problems in Autistic policy in the United States run much deeper than Trump’s cruelty and ignorance or the medical quackery promoted by RFK Jr. One of the most deep-seated problems relates to Autistic adults in the job market. The unemployment rate for Autistic adults in the United States is extremely high—85% according to one estimate.
I have direct experience with the subject of Autistic adult employment. As an adult in my early 30s—in 2012—I received my first official medical diagnosis of Autism Spectrum Disorder: I was diagnosed with Asperger’s Syndrome. This diagnosis was supposed to help me receive disability accommodations in future employment after I received my master’s degree. After all, according to the Americans with Disabilities Act of 1990, employers are supposed to provide “reasonable accommodations” to persons with documented disabilities in order to help them overcome barriers to performing a job.
Over the past 15 years, I have had about seven employers—all in low-wage jobs—and have mostly gone without disability accommodations, not because I don’t need them but because I’ve found it impossible in most cases to obtain them. In most of these jobs, it was a psychologically shattering strain to try to succeed while compensating for my learning disabilities and moderate verbal communication impairment.
As far as I can tell, one of the reasons for my difficulties in obtaining employment accommodations is that, on the surface, I appear “high functioning.” When I suggested McDonald’s as a possible employment route 15 years ago, a job counselor with my state government’s Department of Vocational Rehabilitation (DVR) told me, “You have a master’s degree, you shouldn’t be working at McDonalds.” When I had my first meeting with a supervisor at a job with a medical company in 2021, she remarked—thinking she was giving me a compliment—that I “didn’t look” like I had Asperger’s Syndrome. According to her, I appeared “well put together” and well spoken. Before long, however, previously invisible manifestations of my disabilities became apparent to her; I quit the job after four months as the supervisor made clear she was preparing to write me up for ineptitude.
Although at one point the supervisor suggested she would be willing to give me disability accommodations, the company’s corporate office refused, saying that I would have to go through the costly and lengthy process of getting a new diagnosis of Autism before it would consider granting accommodations. The corporate HR official said that my 2012 Asperger’s diagnosis was obsolete because of the new diagnostic criteria for Autism embodied in the 2013 publication of the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders.
However, perhaps the most important reason for my frequent failure to secure disability accommodations is that employers’ willingness to provide them often comes into conflict with the drive to maximize worker productivity in the interests of profit. Even when accommodations are officially provided, they can easily be reduced to irrelevance as supervisors, feeling the pressure to maximize efficiency and productivity, lash out at employees. I myself have been bullied at a previous job for aspects of my personality related to my Autism—in spite of that job being one of the few instances where I was provided with formal disability accommodations—and have seen other Autistic coworkers treated similarly.
Meanwhile, I can report that I have been employed in a full-time job with the same company for the last four years, currently making approximately $3.49 per hour more than my state’s minimum wage. I work with no disability accommodations at this job and have told only one coworker that I am Autistic. Within the last year, the company has assigned me a more public-facing role involving tasks especially incompatible with my Autism-related disabilities. I’m highly tempted to ask HR for accommodations—at least to minimize my work in the public-facing role—but fear rejection and unduly antagonizing my supervisor, who has long faced a staffing shortage in that role.
I think my situation is emblematic of a broader problem faced by Autistic people: There is so much public misunderstanding of our condition and, in spite of some progress, nowhere near enough ways for us to advocate for ourselves and for the ways society could respect our needs.