Our antibiotics are becoming useless
By 2050, 10 million people could die each year from diseases that have grown resistant to drugs.
“Common diseases are becoming untreatable.” That’s the blunt warning issued on page one of a major new United Nations report on drug resistance. If we don’t make a radical change now, the report says, drug-resistant diseases could kill 10 million people a year by 2050.
Drug resistance is what happens when we overuse antibiotics in the treatment of humans, animals, and plants. When a new antibiotic is introduced, it can have great, even life-saving results — for a while. But then the bacteria adapt. Gradually, the antibiotic becomes less effective, and we’re left with a disease that we don’t know how to treat.
Already, 700,000 people around the world die of drug-resistant diseases each year, including 230,000 deaths from multidrug-resistant tuberculosis. Common problems like STDs and urinary tract infections are also becoming resistant to treatment. Routine hospital procedures like C-sections could become more dangerous as well, as the risk of infection rises.
Yet doctors, farmers, and others continue to dole out too many antibiotics, driving the resistance. Amy Mathers, who directs the University of Virginia’s Sink Lab, told me that over the past decade there’s been a surge of US patients infected with bacteria for which there’s no effective antibiotic. “I see that once a month,” she said. “Ten years ago, that was a rarity.”
Experts like Mathers are increasingly warning that drug-resistant superbugs pose a huge threat to our health. Now, the UN report adds that drug resistance could also severely mess up our economy. By causing health care expenditures to skyrocket, it could prompt economic damage on a par with the 2008-2009 financial crisis.
The good news is this problem can be solved really cheaply. If each person in high- and middle-income countries invested $2 a year in this cause, we could research new drugs and implement effective measures to reduce the threat of resistance, the report says.
“For the US, the total cost to fix the broken antibiotics model is $1.5-2 billion per year,” Kevin Outterson, a Boston University professor who specializes in antibiotic resistance and who was not directly involved in the UN study, told me. “It’s the equivalent of what we spend on toilet paper every few months.”
What’s more, unlike climate change, this is an issue on which there’s both scientific and political consensus — it’s not as if the right and the left disagree as to whether the problem is real.
Which raises the question: If there’s such a cost-effective way to solve such a high-impact problem, and it’s ideologically uncontroversial, why aren’t we all over it?
Companies don’t have the financial incentives
It takes many years and lots of funding to do the research and development needed to bring a new antibiotic to market. Most new compounds fail. Even when they succeed, the payoff is small: An antibiotic — which is, at least in theory, a drug of last resort — doesn’t sell as well as a drug that needs to be taken daily. So for biotech companies, the financial incentive just isn’t there.
Although drug resistance affects high-income and low-income countries alike, wealthy Western countries may be better equipped to respond to a health crisis, and thus feel less urgency about tackling the problem proactively.
The UN report and a number of outside experts argue that to solve this issue, we need to stop treating antibiotics as if they’re any other product on the free market, where value is determined by the number of units sold. Instead, we should think of antibiotics as public goods that are crucial to a functioning society — like infrastructure or national security. And the government should fund their research and development.
“This is a product where we want to sell as little as possible,” Outterson explained. “The ideal would be an amazing antibiotic that just sits on a shelf for decades, waiting for when we need it. That’s great for public health, but it’s a freaking disaster for a company.”
This mismatch with the pharmaceutical industry’s profit-making imperative is why the government (and ideally also the private sector and civil society) needs to step in, the UN report says. That could include incentives like grant funding and tax credits to support early-stage research. The report also urges wealthy countries to help poorer nations improve their health systems, and recommends the creation of a major new intergovernmental panel — like the one on climate change, but for drug resistance.
Yet for governments to mobilize around this issue, the public may first have to push it as a priority — and it’s not clear that enough Americans see it as such.
“I do not think the political will or even the knowledge base is present in the US to make this a high-enough priority to solve the problem today,” Mathers told me. She believes the first thing we need is more public education to bring this threat into focus for the average American.
Outterson agreed that a report — even a major UN report — won’t do much good on its own. “If I had a dollar for every report on this issue, I’d have a lot of money,” he said. His fear is that the death toll may have to climb very high before a critical mass of people start noticing, caring, and mobilizing. “We will eventually respond,” he said. “The question is how many people will have to die before we start that response.”
Our antibiotics are becoming useless - 10-05-2019 10:02:33am
Is screen time bad for the brain?
A generation ago, parents worried about the effects of television; before that, it was the radio.
Now, the concern is “screen time”, a catchall term for the amount of time that children, especially preteens and teenagers, spend interacting with TVs, computers, smartphones, digital pads and video games. This age group draws particular attention because screen immersion rises sharply during adolescence, and because brain development accelerates then, too, as neural networks are pruned and consolidated in the transition to adulthood.
CBS’ 60 Minutes recently reported on early results from the ABCD Study (for Adolescent Brain Cognitive Development), a US$300 million (RM1.25 billion) project financed by the National Institutes of Health.
The study aims to reveal how brain development is affected by a range of experiences, including substance use, concussions and screen time. As part of a segment on screen time, 60 Minutes reported that heavy screen use was associated with lower scores on some aptitude tests and with accelerated “cortical thinning” — a natural process — in some children. But the data is preliminary, and it is unclear whether the effects are lasting or even meaningful.
Does screen addiction change the brain? Yes, but so does every other activity that children engage in: sleep, homework, playing soccer, arguing, growing up in poverty, reading and vaping in secret. The adolescent brain continually changes, or “rewires” itself, in response to daily experience, and that adaptation continues into the early to mid-20s.
What scientists want to learn is whether screen time causes measurable differences in adolescent brain structure or function, and whether those differences are meaningful. Do they cause attention deficits, mood problems, or delays in reading or problem-solving ability?
Have any such brain differences been found? Not convincingly. More than 100 scientific reports and surveys have studied screen habits and well-being in young people, looking for emotional and behavioural differences, as well as changes in attitude, such as in body image.
In 2014, scientists from Queen’s University Belfast reviewed 43 of the best-designed such studies. The studies found that social networking allows people to broaden their circle of social contacts in ways that could be both good and bad, for instance, by exposing young people to abusive content.
The review’s authors concluded that there was “an absence of robust causal research regarding the impact of social media on the mental well-being of young people”. In short, results have been mixed and sometimes contradictory.
Psychologists have also examined whether playing violent video games is connected to aggressive behaviour. More than 200 such studies have been carried out; some researchers found links, others have not. One challenge in studying this and other aspects of screen time is identifying the direction of causality: Do children who play a lot of violent video games become more aggressive as a result, or were they drawn to such content because they were more aggressive from the start?
Individual variation is the rule in brain development. The size of specific brain regions such as the prefrontal cortex, the rate at which those regions edit and consolidate their networks, and the variations in these parameters from person to person make it very difficult to interpret findings. To address such obstacles, scientists need huge numbers of research subjects and a far better understanding of the brain.
The ongoing ABCD study expects to follow 11,800 children through adolescence, with annual magnetic resonance imaging, to see if changes in the brain are linked to behaviour or health.
The study began in 2013, recruiting 21 academic research centres, and initially focused on the effects of drug and alcohol use on the adolescent brain. Since then, the project has expanded and now includes other targets such as the effects of brain injury, screen time, genetics and an array of “other environmental factors”.
The recently published paper covered by 60 Minutes provided an early glimpse of the anticipated results. A research team, based at the University of California, San Diego, analysed brain scans from more than 4,500 preteens and correlated those with the children’s amount of screen time (as reported by the children themselves in questionnaires) and their scores on language and thinking tests. The findings were a mixed bag.
Some heavy screen users showed cortical thinning at younger ages than expected, but this thinning is part of natural brain maturation, and scientists don’t know what that difference means. Some heavy users scored below the curve on aptitude tests, while others performed well.
But the accuracy of self-reported screen time is hard to ascertain, and the association between small differences in brain structure and how people actually behave is vaguer still. Clear conclusions are extremely hard to come by, complicated by the fact that a brain scan is no more than a snapshot in time: a year from now, some of the observed relationships could be reversed.
But surely, screen addiction is somehow bad for the brain? It’s probably both bad and good, depending on the individual and their viewing habits. Most parents are probably already aware of the biggest downside of screen time: the extent to which it can displace other childhood experiences, including sleep, climbing over fences, designing elaborate practical jokes and getting into trouble.
Indeed, many parents — maybe most — watched hours of TV a day themselves as youngsters. Their experiences may be more similar to their children’s than they know. --NYT
Is screen time bad for the brain?, Dec 23, 2018
nst.com.my