As the opioid litigation continues in the shadow of one of our nation’s most pressing public health crises, some criticism has been leveled at the private lawyers representing the cities, counties, states, and individuals harmed by the crisis. For example, see the following tweet:
Let’s work out tax and healthcare financing policy county by county, with private lawyers taking a 25% cut every time. Judge Polster seems to like this idea.
The critiques are many, but can be summarized: (1) private lawyers are being enriched; (2) private lawyers are setting opioid policy; (3) private lawyers have misaligned incentives; and (4) private lawyers will not support public health.
Arguably, all of these critiques bear some truth. But do they suggest that the opioid litigation is incorrigibly tainted, and that tort litigation is the wrong avenue for addressing mass torts such as the opioid crisis?
Have you ever clicked ‘I agree’ to share information about yourself on a health app on your smartphone? Wondered whether the results of a new therapy reported on a patient community website were accurate? Considered altering a medical device to better meet your own needs, but had doubts about how the changes might affect its function?
While these kinds of decisions are increasingly routine, there is no clear path for getting information on health-related devices, advice on what data to collect, guidance on how to evaluate medical information found online, or answers to concerns one might have about data sharing on patient platforms.
It’s not only patients who are facing these questions in the age of big data in medicine. Clinicians are also increasingly confronted with diverse forms of molecular, genetic, lifestyle, and digital data, and the quality, meaning, and actionability of these data are often unclear.
The difficulties of interpreting unstructured data, such as symptom logs recorded on personal devices, add another layer of complexity for clinicians trying to decide which course of action would best meet their duty of beneficence and enable the best possible care for patients.
If you are a skier like me, you likely reveled in watching the alpine skiing events during this year’s Olympic Winter Games held in Pyeongchang, South Korea. Having raced myself when I was younger, I recall the feeling of being in the starting gate with all the anticipation and excitement it brings. But my memories are more than mere recollections of “images” in my head, for I also have vivid muscle memory, and when watching and cheering for Lindsey Vonn and Ted Ligety, I can literally feel my leg muscles contract as if I were on the course myself. Because I skied for so much of my life, my experience now as a spectator brings me back to the hardwired responses that I can call up even to this day, in a very intuitive way, simply by visualizing a course.
Researchers at Stanford have now corroborated what athletes and psychologists have long believed: that visualizing ourselves performing a task, such as skiing down a race course, or engaged in other routines, improves our performance and increases our success rate. The findings, reported by neuroscientists in Neuron, suggest that mental rehearsal prepares our minds for real-world action. Using a tool called a brain-machine interface, the researchers have shown how mental learning translates into physical performance, offering a potentially new way to study and understand the mind.
Could this new tool assist us in replicating cognitive responses to real-world settings in a controlled environment? More studies will need to be carried out to test these findings further and better understand the results. One caveat to take into account is that performing a real action is different from performing the same task mentally via a brain-machine interface, given that one’s muscles, skeletal system, and nervous system are all working in tandem in the former case. Still, a brain-machine interface would seem to have very practical implications for those who use prosthetics or who are paralyzed. As our knowledge of biomechanics and neuroscience advances, along with our ability to interface the two, we may be able to use this technology to create more life-like prosthetics and perhaps, by harnessing the mind’s inborn processes and complex synapses, help others walk again.
Looking toward the future, another interesting avenue of research would be to use a brain-machine interface to study psychoneuroimmunology. We may not have the technology or ability to conduct such a study at the moment, but it seems plausible that in the near future we could develop the tools needed for more rigorous research on the interactions between psychological processes and the nervous and immune systems. If visualizing winning a ski race improves our performance, why not also envision good health outcomes: resilient bodies, strong immune systems, plentiful and efficient white blood cells? Simply willing ourselves to health might not be possible, but, to be sure, having a positive outlook has been shown to affect the course of disease, while conversely, increased levels of fear and distress before surgery have been associated with worse outcomes. These are but a few examples of the growing evidence of the mind’s impact on health. This evidence highlights the importance of a holistic approach that considers the roles of behavior, mood, thought, and psychology in bodily homeostasis.
The philosopher in me understands that there are universal principles in logic, mathematics, and basic scientific tenets such as the law of gravity. Be that as it may, the historian in me recognizes that we inherit epistemologies and ways of thinking from those before us, and from our own historical and cultural contexts. Certain ideas dominate the world; and, while some are indeed universal, especially those based on science, the fact remains that a number of other concepts are only seemingly universal. The concepts of personhood, divinity, self, and even society as we tend to understand them today are largely inherited from a Western, Christian worldview. As these ideas have been contested by philosophical inquiry throughout history, they have either been decoupled from their origins in religious thought, or they have been secularized and rationalized, à la Kantian categorical imperatives or the like, and then disseminated in universities, institutions, cultures, and literatures.
On one level, to speak of the Western world as “secular” is, as the philosopher Charles Taylor notes, to say that “belief in God, or in the transcendent in any form, is contested; it is an option among many; it is therefore fragile; for some people in some milieus, it is very difficult, even ‘weird’” (Taylor 2011, 49). But on another and much deeper level, this very possibility was only ever tenable on account of two major factors: “First, there had to develop a culture that marks a clear division between the ‘natural’ and the ‘supernatural,’ and second, it had to come to seem possible to live entirely within the natural” (Taylor 2011, 50). This was only possible because of a unique philosophical climate that actively sought to dislodge the old form of moral order and social “embeddedness” in an attempt to establish a “purely immanent order.” Taylor’s groundbreaking work, A Secular Age, argues that secularism is part of a grand narrative in the West and shows that its historical and cultural foundations are in fact thoroughly Christian and European. He pushes back against Max Weber’s secularization thesis, which holds that religion diminishes in the modern world in the wake of increasing developments in science and technology, and instead gives a different account of what secularism might mean: one that has deep implications for morality, politics, and philosophy.