From bioethics to medical anthropology to humanities and back: A year in review

I thought I would take this opportunity to reflect on the past year, where I will be in the future, and how the student fellowship has impacted me. I still hope to contribute to the Bill of Health blog going forward, but as my last official post as a Petrie-Flom Student Fellow, I would be remiss if I did not express my sincere gratitude to everyone at the Petrie-Flom Center, the faculty and staff, the other student fellows, and especially my mentors: Professors I. Glenn Cohen, Carmel Shachar, and Intisar A. Rabb.

My own project took a few different turns this year. My original proposal was to explore the ways in which bioethics and biomedical issues will play a significant role in reviving the dialectic between secular scholars and religious authority. Ayman Shabana rightly argues that respect for Islamic religious norms is essential for the legitimacy of bioethical standards in the Muslim context; he attributes the legitimating power of these norms—as well as their religious and spiritual underpinnings—to their moral, legal, and communal dimensions. Building on Shabana’s work, my initial argument held that the relationship between the secular and religious worlds is important because the discourse between the two, although often presumed to be dichotomous, is neither necessarily antithetical nor impassable. This led me back to the arguments of the venerable philosophers Alasdair MacIntyre and Charles Taylor, whose critiques of the concept of secularism itself, along with its historical contexts, furthered my argument and helped me clarify the significant role that religion plays vis-à-vis categorical issues such as fundamental beliefs and metaphysics. I still maintain this, and it is something I continue to work on, although I decided to take my project in another direction.


What can an 11th century Islamic philosopher teach us about 21st century neuroscience?

There is a lot of fascinating, cutting-edge research about the brain coming out of Stanford University. Early last month I reported on findings by Stanford neuroscientists on how mental rehearsal prepares our minds for real-world action. Today, I’ll outline the recent advances made by a team led by Sergiu Pasca, MD, assistant professor of psychiatry and behavioral sciences at Stanford University, and discuss some of the ethical implications of this research.

Pasca’s method enables him to culture cells in order to form brain organoids with robust structures that are not compromised by cells from other parts of the body, thereby allowing him to more accurately replicate distinct brain regions. Doing so provides greater structural organization and also allows him and his team of researchers to better study and understand pathological mechanisms and perhaps one day to examine the molecular, cellular, and circuit levels of a person’s neurons. This is a promising method and a big step toward greater understanding of psychiatric and neurological disease, leading Pasca to declare, “This is our doorway into personalized psychiatry.” At the same time—although these “brain balls” are not brains, nor do they receive sensory inputs from the outside world—it is clear that as scientists progress in both the techniques and complexity of replication, major ethical questions and dilemmas will arise.

Chief among these will undoubtedly be the perennial ethical debate about the ontology of a human being. Is it only physical, material, social—in which case we might think of ourselves as technicians—or is it spiritual, religious, metaphysical—in which case we would more likely consider ourselves custodians? When we speak about attributing rights to animals or consciousness to AI, it is because at bottom we hold some fundamental belief: about dignity, a soul, being, or about what life might mean in a relational or social and emotional sense. This is no different with Pasca’s brain balls; in fact, it is an even more pressing quandary. As Bruce Goldman notes in his article, “One of the most amazing things about their brain balls was that, with not much chemical guidance, they tended to take on a default structure that’s a facsimile of the most evolutionarily advanced part of the brain: the human cerebral cortex, with all six layers you find in a living human brain.” The ethics of growing human organs are one thing, but the ethics of growing brain balls, which might eventually lead to more and more complex synaptic connections followed by even more elaborate renditions of an actual brain, will become especially contentious given the meaning and significance that we associate with the brain—both biologically and existentially.


Orcas, Dolphins, and Whales: non-human persons and animal rights

With few exceptions, most cultures put Homo sapiens at the center or the apex of creation. Humans, it is generally believed, are distinguished from other animals by our self-awareness and our ability to use tools, to think, reason, and construct meaning and representations about life. The Abrahamic religious traditions are most notable in their anthropocentric vision of human purpose in creation; and although the metaphysics and teleology are sometimes challenged by advances in science and technology, the fact remains that human beings remain the paradigmatic case against which other animals, or even artificial intelligence, are measured. As a Muslim and a theist, I avow my belief in the unique status of humans; however, as someone who also believes in science and is keenly attuned to the environment, I have a great love for nature and the animal world, and a great desire to protect them.

It is with this, then, that I want to propose putting ethics before metaphysics in considering the moral status of what legal scholars and ethicists call “non-human persons.” In particular, I want to look at the cetacean intelligence of orcas, dolphins, and whales to understand how we might classify them as non-human persons to be granted certain rights and protections. Doing so, I argue, would enable us to collapse the bifurcations that influence much of Western thought, thereby ushering in a more holistic, ecological, and relational approach to ethics and being.

To begin with, I would like to make a distinction clear: I am not claiming that orcas, for example, are morally equivalent to humans, but I am suggesting that we ought to be more cautious with regard to understanding our place in the animal world as a whole, particularly as it relates to the precariousness of life itself. My argument below follows philosophical and ethical reasoning, though this might also be understood in the context of religious texts. The story of Yunus (known in the Bible as Jonah) and the whale is found in both the Bible and the Qur’an. In short, Yunus felt discouraged that the people of Nineveh did not heed his call to worship God, and so he left in anger. Being cast into the sea, followed by being swallowed by the whale, was ostensibly punishment for his loss of hope and for leaving the city without God’s permission; on another level, exegetical scholars point to his supplication “O Lord! There is no god but you: Glory to you: I was indeed wrong” (Qur’an 21:87) as instructive of submission to God’s will and the significance of humility. Indeed, the Qur’an goes on to say elsewhere: “Had he not been of those who exalt God, he would certainly have remained inside the whale until the Day of Resurrection” (Qur’an 37:143–144). The whale, on this reading, is integral to the Abrahamic worldview insofar as it is the manifestation of God’s power and dominion over creation, as well as his lesson to human beings to remain humble.

Psychoneuroimmunology and the mind’s impact on health

If you are a skier like me, you likely revelled in watching the alpine skiing events during this year’s Olympic Winter Games held in Pyeongchang, South Korea. Having raced myself when I was younger, I recall the feeling of being in the starting gate with all the anticipation and excitement it brings. But my memories are more than mere recollections of “images” in my head, for I also have vivid muscle memory, and when watching and cheering for Lindsey Vonn and Ted Ligety, I can literally feel my leg muscles contract as if I were on the course myself. Because I skied for so much of my life, my experience now as a spectator brings me back to the hardwired responses that I can call up even to this day in a very intuitive way simply by visualizing a course.

Researchers at Stanford have now corroborated what athletes and psychologists have long believed: that visualizing ourselves performing a task, such as skiing down a race course, or engaged in other routines, improves our performance and increases our success rate. The findings, reported by neuroscientists in Neuron, suggest that mental rehearsal prepares our minds for real-world action. Using a tool called a brain-machine interface, the researchers have shown how mental learning translates into physical performance, offering a potentially new way to study and understand the mind.

Could this new tool assist us in replicating cognitive responses to real-world settings in a controlled environment? More studies will need to be carried out in order to further test these findings and better understand the results. One point to take into account is that performing a real action is different from performing the same task mentally via a brain-machine interface, given that one’s muscles, skeletal system, and nervous system are all working in tandem; but a brain-machine interface would indeed seem to have very practical implications for those who use prosthetics or who are paralyzed. As our knowledge of biomechanics and neuroscience advances, as well as our capabilities to interface the two, we may be able to utilize this technology to assist us in creating more life-like prosthetics and perhaps, harnessing the mind’s inborn processes and complex synapses, help others walk again.

Looking toward the future, another interesting avenue of research would be to use a brain-machine interface to study psychoneuroimmunology. We may not have the technology or ability to conduct such a study at the moment, but it seems plausible that in the near future we could develop the tools needed to conduct more rigorous research on the interactions between psychological processes and the nervous and immune systems. If visualizing winning a ski race improves our performance, why not also envision good health outcomes: resilient bodies, strong immune systems, plentiful and efficient white blood cells? Simply willing ourselves to health might not be possible, but, to be sure, having a positive outlook has been shown to impact the outcome of disease, while conversely, increased levels of fear and distress before surgery have been associated with worse outcomes. These are but a few examples of the increasing evidence of the mind’s impact on health. They highlight the importance of a holistic approach that considers the roles of behavior, mood, thought, and psychology in bodily homeostasis.

Culture, Medicine, and Psychiatry

By Yusuf Lenfest

Robert Sapolsky, a professor of biology and neurology at Stanford University, rightly identifies depression as a particularly crippling disease insofar as it affects one’s very response mechanisms and modes of coping, namely, experiences of gratitude, joy, pleasure—at bottom, some of the key emotions of resistance and healing. In discussing depression, he provides an overview of the biological and chemical elements, touching on the role of neurotransmitters (epinephrine, dopamine, serotonin) in depression, and a summary of the psychological elements (and their relation to the biological); as such, his description focuses primarily on physical and biological explanations. However, to examine depression or any psychological illness in purely physical and biological terms misses a crucial element, namely: human culture, lived experience, and the different modes or methods of social thought. Culture plays a primary role in defining many mental disorders such as schizophrenia and psychosis, and even the symptoms, intensities, or typologies of depression, according to Arthur Kleinman in his seminal Writing at the Margin: Discourse Between Anthropology and Medicine.

Despite these findings, Western biomedicine by and large continues to analyze mental health in clinical and biological terms. This is not insignificant given the statistics:

  • Approximately 1 in 5 adults in the U.S. (43.8 million, or 18.5%) experiences mental illness in a given year.
  • Approximately 1 in 5 youth aged 13–18 (21.4%) experience a severe mental disorder at some point during their life. For children aged 8–15, the estimate is 13%.
  • Only 41% of adults in the U.S. with a mental health condition received mental health services in the past year. Among adults with a serious mental illness, 62.9% received mental health services in the past year.
  • Just over half (50.6%) of children aged 8–15 received mental health services in the previous year. (National Alliance on Mental Illness)

Current trends in medicine suggest that the medical community, broadly speaking, is ill-equipped to tackle this rising burden adequately, especially with regard to the treatment of diverse patients from various cultures, religions, and social circumstances. To best address the problem, the medical community, on the level of both policy and practice, ought to take steps to understand and treat mental illness more holistically.


Illness, Disability, and Dignity

By Yusuf Lenfest

Medicine is meant to heal our ailments and treat our illnesses. Our deep knowledge of the body and the numerous mechanisms that contribute to or correlate with good health is considered a triumph of the medical sciences. We can now perform transplants with relative ease, offer prosthetics to those who require them, and even cure some forms of blindness. But so much of modern medicine today is built around quantitative data—family histories, success and morbidity rates, pathologization, statistical analyses—without much conscious consideration of how one understands, copes with, or derives meaning from one’s experience. True, such data is gathered for the purposes of more accurate diagnoses and as the first defense against an illness or medical condition; but physicians are taught to concentrate on the cure, and while few would dispute that that is certainly a good thing, we also ought to keep in mind that excessive focus on a default measure of “normal” does not necessarily allow us to express the diverse ways of being in the world nor adequately account for the ways in which people embrace their conditions.

Some autistic individuals, for example, believe that autism should be accepted as a difference and not as a disorder. That the autism spectrum is precisely that—a spectrum—is important: on the one hand, statistical analysis may reveal that these individuals are a minority relative to the average population, only about 1%; but on the other hand, to take a different perspective, it means merely that the characteristics of these individuals manifest in ways atypical of how the institution and culture of medicine classifies them. Lest we forget, medicine is part of the dynamic structure of society and social norms—in the background and the foreground—of knowledge-making, and it is embedded in place and society, as part of the structures existing in institutions. It is not possible to consider theoretical or epistemological claims apart from practical knowledge and applied sciences.

Islam and the Beginning of Human Life

When does human life begin?

One of the more contentious bioethical and legal issues is the beginning of human life. Nor is it difficult to grasp why, for beyond political rhetoric it is a subject of considerable philosophical and legal debate and raises a number of questions that are profoundly difficult to answer. Biomedicine can roughly differentiate when life becomes viable, that is, at which point a fetus could survive as an infant if a mother gave birth prematurely; it can likewise recognize potential complications either in the development of the fetus or the health of the pregnant woman. Yet other questions are not as easy to answer, precisely because they tend to fall more within the spectrum of philosophy or personal belief: What constitutes a human being? What is a person? Is a potential life accorded the same rights as an actual life? For that matter, are there rights to begin with automatically, or are there criteria that must be met in order to procure rights? In short, questions that strike at the very core of who we are.

A number of these questions were debated by Muslim theologians and legal scholars in the pre-modern world when considering contexts of abortion or issues surrounding paternity. In the modern world, these questions have grown to include in vitro fertilization and surrogacy, amongst others. Muslim scholars continue to grapple with these bioethical questions as the medical sciences grow more advanced and technology allows us to have ever more control over the basic aspects of reproduction, growth, and development. On the question of when human life begins, for example, Mohammed Ghaly, in an important article, “The Beginnings of Human Life: Islamic Bioethical Perspectives,” analyses some of the newer discussions and positions Muslim scholars have taken vis-à-vis contemporary bioethics and independent legal reasoning (ijtihad). Complementing this discussion is a seminal article by Ayman Shabana, “Paternity Between Law and Biology: The Reconstruction of the Islamic Law of Paternity in the Wake of DNA Testing.” Shabana shows how classical rulings pertaining to paternity issues continue to hold higher authority, even despite the advent and availability of modern technology that would ostensibly challenge that authority. This is interesting for a number of reasons, not least of which is the possible change in perspective with regard to how religious authority is derived and its relationship to the medical sciences.

Bioethics in Islam: Principles, Perspectives, Comparisons

An important question in Islam, recurrent across time and space, is whether Islamic political theory recognizes rights claims against the state as distinct from rights claims against other members of the community. This continues to be an important subject today, intersecting the fields of law, religion, and moral philosophy. The classical tradition is divided on the matter, with the legal theory of the Shafi’i school of jurisprudence holding that rights are to be accorded via religious authority, while the Hanafi school emphasized the universality of the notion of human inviolability (dhimma)—and the innate rights that derive from it—as God-given, universal, and applicable to all societies from the beginning of time.

Whereas in Western law there is generally a separation between law and ethics, in the Islamic tradition there is more of a dialectical tension between the two: where religious inwardness is more highly developed, attitude and intention are weighed more heavily, whereas in its absence, formalism and legalism are advanced as the ethical ideal.


What are Our Duties and Moral Responsibilities Toward Humans when Constructing AI?

Much of what we fear about artificial intelligence comes down to our underlying values and perception about life itself, as well as the place of the human in that life. The New Yorker cover last week was a telling example of the kind of dystopic societies we claim we wish to avoid.

I say “claim” not accidentally, for in some respects the nascent stages of such a society already exist; and perhaps they have existed for longer than we realize or care to admit. Regimes of power, what Michel Foucault called biopolitics, are embedded in our social institutions and in the mechanisms, technologies, and strategies by which human life is managed in the modern world. This arrangement could be positive, neutral, or nefarious, for it all depends on whether these institutions are used to subjugate (e.g., racism) or liberate (e.g., rights) the human being; whether they infringe upon the sovereignty of the individual or uphold the sovereignty of the state and the rule of law; in short, biopower is the impact of political power on all domains of human life. This is all the more pronounced today in the extent to which technological advances have enabled biopower to stretch beyond the political to almost all facets of daily life in the modern world.

Religion, Health, and Medicine: the Dialectic of Embedded Social Systems

The philosopher in me understands that there are universal principles in logic, mathematics, and in basic scientific tenets such as the law of gravity. Be that as it may, the historian in me recognizes that we inherit epistemologies and ways of thinking from those before us, and from our own historical and cultural contexts. Certain ideas dominate the world; and, while some are indeed universal, especially those based on science, the fact remains that a number of other concepts are only seemingly universal. The concepts of personhood, divinity, self, and even society as we tend to understand them today are largely inherited from a Western, Christian worldview. As these ideas have wrestled with philosophical inquiry throughout history, they have either been decoupled from their origins in religious thought, or they have been secularized and rationalized à la Kantian categorical imperatives or the like—and then disseminated in universities, institutions, cultures, and literatures.

On one level, to speak of the Western world as “secular” is, as the philosopher Charles Taylor notes, to say that “belief in God, or in the transcendent in any form, is contested; it is an option among many; it is therefore fragile; for some people in some milieus, it is very difficult, even ‘weird’” (Taylor: 2011, 49). But on another and much deeper level, this very possibility was only ever tenable on account of two major factors: “First, there had to develop a culture that marks a clear division between the ‘natural’ and the ‘supernatural,’ and second, it had to come to seem possible to live entirely within the natural” (Taylor, 50). This was only possible because of a unique philosophical climate that actively sought to dislodge the old form of moral order and social “embeddedness” in an attempt to establish a “purely immanent order.” Taylor’s groundbreaking work, A Secular Age, argues that secularism is part of a grand narrative in the West and shows that its historical and cultural foundations are in fact thoroughly Christian and European. He pushes back against Max Weber’s secularization thesis that religion diminishes in the modern world in the wake of increasing developments in science and technology, and instead gives a different account of what secularism might mean: one that has deep implications for morality, politics, and philosophy.
