Fentanyl is a potent opioid analgesic and has been at the center of the opioid and overdose epidemic. As an illicit agent, fentanyl often comes in the form of a powder, which is then either insufflated (the fancy medical term for snorting) or dissolved in water and injected intravenously. It is fifty to one hundred times more potent than heroin, the drug it replaced as the illicit opioid of choice. It can cause significant euphoria and analgesia, which is why it is so widely used. It can also cause respiratory depression or complete respiratory arrest, which is why it can be so deadly. It is readily absorbed when insufflated or injected, and its actions are almost immediate. These are the facts.
The Beazley Institute for Health Law and Policy at Loyola University Chicago School of Law and Annals of Health Law & Life Sciences invite original submissions for presentations at our Thirteenth Annual Health Law Symposium: Addressing the Health Care Needs of Justice-Involved Populations. The Symposium will take place at Loyola University Chicago School of Law on Friday, November 15, 2019, beginning at 9:00 a.m.
The Symposium will explore legal barriers that justice-involved populations face in accessing health care and address how those barriers can be alleviated. “Justice-involved populations” generally refers to individuals who are incarcerated in prisons, jails, immigrant detention centers, or juvenile detention centers; individuals on probation; and individuals who are otherwise involved with the U.S. justice system.
With the suicide rate increasing across the United States, researchers at the UNC Gillings School of Global Public Health approached the issue by considering the financial anxiety caused by low wages. Alex Gertner, Jason Rotter, and Paul Shafer used the LawAtlas minimum wage dataset to explore the associations between state minimum wages and suicide rates in the United States.
Their study was published in the American Journal of Preventive Medicine on March 21, 2019.
The Temple University Center for Public Health Law Research spoke with Mr. Gertner about the study.
The U.S. Preventive Services Task Force (USPSTF) recently published recommendations urging clinicians to refer pregnant and postpartum women to counseling if they are at risk of depression.
The recommendations respond to the prevalence of perinatal depression, which is considered the most common pregnancy complication. Perinatal depression affects up to one in seven women and can develop at any time after a woman becomes pregnant, immediately following the birth of a child, or even up to a year afterward.
Among the many concerning potential consequences of maternal depression are premature birth and low birth weight, as well as neglect and inattentiveness from mothers after the baby is born, which can put infants at risk of additional problems, according to Karina Davidson, a USPSTF member who helped write the recent recommendations.
Much ado has been made about Amazon’s new hit, “Homecoming,” which recently received three Golden Globe nominations, including one for best drama series. The psychological thriller, directed by “Mr. Robot” creator Sam Esmail and starring Julia Roberts, has been characterized as “an irresistible mystery-box drama” and “the good kind of ‘what the hell is going on here?’ TV.” Tim Goodman described the show, which was adapted from Eli Horowitz and Micah Bloomberg’s Gimlet Media “cult hit” podcast of the same moniker, as a “dazzling” play “on memory, the military industrial complex, conspiracy and unchecked government privilege.”
The series revolves around novice caseworker Heidi Bergman’s (Roberts) experiences administering the Tampa, Florida-based Homecoming Transitional Support Center (HTSC). HTSC is a privately run, Department of Defense (DoD) contract facility that purports to help combat-traumatized servicemembers readjust from the battlefield and reintegrate into civilian life. Indeed, Bergman opens the drama’s aptly titled pilot, “Mandatory,” by explaining to her “client,” three-tour combat veteran Walter Cruz (Stephan James), that the treatment facility is “a safe space for you to process your military experience and re-familiarize yourself with civilian life in a monitored environment, which, just means getting you situated now that you’re back home, rear-wise, health-wise, basically, I just work for you.”
As a nurse practitioner in a busy suburban emergency department, pain is my job. Pain is one of the most common reasons people come to an emergency department (ED). It could be abdominal pain, chest pain, back pain, or even emotional pain, including depression or suicidal ideation. Pain is a driver for people seeking medical care. We have made pain a vital sign and made “How would you rate your pain on a scale of 1 to 10?” a mandatory question for any patient who steps through our door.
This whole concept evolved circa 1987, when the Institute of Medicine urged healthcare providers to use a quantified measure for pain. It gained even more traction in 1990, when the then-president of the American Pain Society, Dr. Mitchell Max, called for improved means to assess and treat pain. The term “oligoanalgesia” gained popularity in the published literature, meaning that we weren’t giving enough pain medication to patients in the ED, in clinics, or in any other healthcare setting. Healthcare providers responded. We asked about pain and, we thought, treated it more effectively to address the issue.
I had always considered my field of expertise to be emergency medicine. I worked through the ranks as an emergency medical technician, then onward as a paramedic, which included a nine-year stint on a busy medical helicopter. I worked in disaster medicine, and was the associate director of a Harvard-affiliated disaster medicine fellowship in Boston. My current practice is as a nurse practitioner in a busy suburban emergency department (ED) and I am still active in emergency medical services as a SWAT medic and as an educator.
The emergency part of what I do is the exciting part — the part that stimulates the excitatory neurotransmitters that flood the brain, preparing it to act quickly and decisively.
We are selling ourselves short, however, when we label this role as “emergency” providers. Instead, “public health provider” is a much more appropriate term to use, because emergency departments and those who provide care there are really public health workers.
All of us who practice in emergency medicine know that real emergencies are few and far between. Our day-to-day is much more mundane. We deal with many urgent issues as well as some less urgent, primary care problems. We may even spend time refilling printer paper or bringing a patient their lunch. We may help find someone a homeless shelter, send a family home with warm coats for the kids, or pack up a bag with food and toiletries for a young girl we feel is being trafficked.
In light of all this, the purpose and the policies of the emergency department need to be redefined.
Suicide is a global problem, causing 800,000 deaths per year worldwide. In the United States, suicide rates rose by 25 percent over the past two decades, and suicide now kills 45,000 Americans each year, more than die in auto accidents or homicides.
Traditional methods of predicting suicide, such as questionnaires administered by doctors, are notoriously inaccurate. Hoping to save lives by predicting suicide more accurately, hospitals, governments, and internet companies are developing artificial intelligence (AI) based prediction tools. This essay analyzes the risks these systems pose to safety, privacy, and autonomy, risks that have so far been under-explored.
Two parallel tracks of AI-based suicide prediction have emerged.
The first, which I call “medical suicide prediction,” uses AI to analyze patient records. Medical suicide prediction is not yet widely used, aside from one program at the Department of Veterans Affairs (VA). Because medical suicide prediction occurs within the healthcare context, it is subject to federal laws, such as HIPAA, which protects the privacy and security of patient information, and the Federal Common Rule, which protects human research subjects.
My focus here is on the second track of AI-based suicide prediction, which I call “social suicide prediction.” Though essentially unregulated, social suicide prediction uses behavioral data mined from consumers’ digital interactions. The companies involved, which include large internet platforms such as Facebook and Twitter, are not generally subject to HIPAA’s privacy regulations, principles of medical ethics, or rules governing research on human subjects.
The New Yorker just published an article full of ethical questions about the best health care treatment for dementia patients. It should make you think about which life you would choose. Larissa MacFarquhar’s piece is titled “The Comforting Fictions of Dementia Care.” Its subtitle suggests a sad story, noting “Many facilities are using nostalgic environments as a means of soothing the misery, panic, and rage their residents experience.” The article tells numerous powerful stories of dementia patients’ good and bad experiences.
By Rebecca Dresser
Anyone fortunate enough to live beyond middle age faces a risk of developing dementia. Dementia is a widely feared disability. People often say they wouldn’t want to live if they developed the condition.
Experts in law and ethics praise advance directives, instructions patients prepare in advance about their future care, as a tool giving people control over the life-sustaining medical care they later receive as mentally impaired dementia patients. Some advance directive supporters also want the law to recognize advance requests to withhold ordinary food and water in the late stages of dementia. And some argue that the U.S. should follow the Netherlands in allowing doctors to give lethal drugs to people who made advance directives asking for assisted death if dementia makes them unable to live at home or to recognize their loved ones.