The Office of Management and Budget is revising the social cost of carbon, and the gates are open for public comment. Go on! Contribute if you have an idea about how much climate change might cost, in a social sense.
The figure gets plugged into all kinds of benefit-cost analyses — today’s imprimatur of efficaciousness — in the hunt for optimal policies that weigh climate change mitigation and adaptation against other worthy ways to spend money. Is it economically sensible to build a seawall now and fend off the uncertain rise in ocean levels decades down the road? Is it a good idea to demand power plants capture their carbon and inject it underground (assuming that’s a functional technology)? Should we simply dump all of our money into renewables? Or what about into energy efficiency?
How do we get the most bang for our buck when dealing with climate change?
Another tough question. And maybe also the wrong question.
I wrote about this once before, covering a debate between law professor Doug Kysar and economics professor Matt Kotchen. And a summary published last month in Environmental Research Letters by Jonathan Koomey tackles the same thorny issue with full-throated advocacy for a new approach. “The benefit-cost approach,” writes Koomey, “while it has been useful in many contexts, has serious limitations that call into question its utility for analyzing climate change.”
So what do we do instead? The proposal is to focus on physical limits — 2 degrees Celsius, in this case — and devise the optimal policy for reaching that goal. In this way, we’re determining costs within given, necessary boundaries. It will be interesting to see if, or when, this takes hold.
At least Koomey sings optimism: “I call this method ‘working forward toward a goal’, and it’s one that will see wide application as we face the climate challenge in the years ahead.”
From, of all places, a word-a-day email: “Destroying species is like tearing pages out of an unread book, written in a language humans hardly know how to read, about the place where they live” (Holmes Rolston III, professor of philosophy).
The religion of economics provides an idolatry of rapid change, unaffected by the elementary truism that a change which is not unquestionable improvement is a doubtful blessing. The burden of proof is placed on those who take the “ecological viewpoint”: unless they can produce evidence of marked injury to man, the change will proceed. Common sense, on the contrary, would suggest that the burden of proof should lie on the man who wants to introduce a change; he has to demonstrate that there cannot be any damaging consequences. But this would take too much time, and would therefore be uneconomic. Ecology, indeed, ought to be a compulsory subject for all economists, whether professionals or laymen, as this might serve to restore at least a modicum of balance. For ecology holds that an environmental setting developed over millions of years must be considered to have some merit. Anything so complicated as a planet, inhabited by more than a million and a half species of plants and animals, all of them living together in a more or less balanced equilibrium in which they continuously use and re-use the same molecules of the soil and air, cannot be improved by aimless and uninformed tinkering. All changes in a complex mechanism involve some risk and should be undertaken only after careful study of all the facts available. Changes should be made on a small scale so as to provide a test before they are widely applied. When information is incomplete, changes should stay close to the natural processes which have in their favor the indisputable evidence of having supported life for a very long time.
(From Basic Ecology by Ralph and Mildred Buchsbaum)
In “The DNA Data Deluge,” a recent article in IEEE Spectrum, the authors elaborate at length on the following conundrum (which is a problem, in their opinion):
As [DNA] sequencing machines improve and appear in more laboratories, the total computing burden is growing. It’s a problem that threatens to hold back this revolutionary technology. Computing, not sequencing, is now the slower and more costly aspect of genomics research.
The performance of a DNA sequencer has improved three- to five-fold each year since 2008, an astonishing pace that far outstrips computer processors, which still chug along according to Moore’s Law. (“Computer processors basically doubled in speed every two years over that same period,” say the authors.)
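Assuming those rates hold constant (an illustration, not a figure from the article), a few lines of Python make the gap concrete:

```python
# Illustrative only: cumulative improvement over five years under the
# growth rates quoted above, assumed constant for simplicity.
years = 5

seq_low = 3 ** years      # sequencers at 3x per year -> 243x
seq_high = 5 ** years     # sequencers at 5x per year -> 3125x
cpu = 2 ** (years / 2)    # processors doubling every 2 years -> ~5.7x

print(f"Sequencers: {seq_low}x to {seq_high}x over {years} years")
print(f"Processors: about {cpu:.1f}x over {years} years")
```

Even at the low end, sequencing capacity outgrows processing power by more than a factor of forty over five years, which is precisely the computing gap the article worries about.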
Take two recent studies written up in Chemical & Engineering News: “Nanoparticles In Athletic Apparel May Seep Into Sweat” and “Nanoparticles Could Disrupt Immune Cell Function.” In the first, scientists in Germany tested the toxicological implications of sports clothes interlaced with titanium dioxide and silver nanoparticles. “Little is known about what effects these materials have on human health,” according to the article. (I’m surprised this phrase is still used without irony.) The experiment ultimately presented three conclusions. First, the titanium is not absorbed by sweat at dangerous levels. Second, the silver may be absorbed at dangerous levels. Third, more experimentation is needed.
In the second article, researchers moved beyond the standard toxicological test of cell survival in the presence of nanoparticles and instead focused on cell function. They found that immune cells exposed to iron oxide are less effective at killing bacteria. Robert L. Tanguay of Oregon State University, who was not an author on the study, says these results suggest that nanoparticles may need additional safety testing on top of the standard toxicity studies. “That could open up a huge Pandora’s box,” he says, since it would add to the number of variables researchers must consider when assessing nanomaterial toxicity.
* * *
I suppose if there really is a storm of progress, then it’s blowing hard as ever, pushing those angels along, uncovering one new mess after another.
It is crude but true that cancer makes for profitable business. The FDA approved 19 new drugs in 2012 and three this year, according to CenterWatch: Kadcyla ($9,800 per month), Pomalyst ($10,500 per 28 days), and Stivarga ($9,350 per 28 days). There is no doubt a need for more and better cancer treatments. No doubt.
But two new studies in the past month cast light on the trouble of pushing research harder than perhaps it should be pushed. In an article entitled “Characteristics of Oncology Clinical Trials: Insights From a Systematic Analysis of ClinicalTrials.gov,” published in JAMA Internal Medicine, the authors conclude that the research on which cancer treatment is based often lacks the statistical rigor and robustness of research into other diseases. “An inherent tension arises between the desire to use new, life-saving treatments and the imperative to develop the evidence that patients, clinicians, regulatory agencies, and advocacy groups need to make sound decisions,” said Bradford Hirsch, lead author of the study, in a press release. “Unfortunately, the high prevalence of small studies that lack rigor limits the ability to assess the evidence supporting specific treatments.”
And released just yesterday in PLOS ONE, a new study targets the widespread lack of reproducibility that afflicts much of the oncology literature:
When asked if investigators had ever tried to reproduce a finding from a published paper and not been able to do so, 54.6% (237/434) of all survey respondents said that they had, with 58.5% (154/263) of faculty having experienced the inability to reproduce data, and 48.5% (83/171) of trainees having the same experience. Of note, some of the non-repeatable data were published in well-known and respected journals including several high impact journals…
Upon finding results from a paper that could not be repeated the question arises; “what to do with that contradictory data?” When survey respondents were asked if they contacted the authors of the research paper, 78% (71/91) said that they had, but the ultimate results of that contact were very mixed…Overall, only 33.3% of respondents were ever able to explain or resolve their discrepant findings. This implies that 66.7% of conflicting results between research groups were not resolved through communication with the authors of the original publication.
These complementary problems run deep through the cycle that connects R&D with commercial markets, and solutions on any meaningful scale may require systemic examination and overhaul. But little choice exists. As one of the papers concludes, “the lives of our patients depend on it.”
From “Inhabited Prairie,” which opens May 23 at Yancey Richardson Gallery
Much has been written recently on the subject of bees: their startling and costly population decline, and what may or may not be responsible for it. The problem is known informally as colony collapse disorder, and its coverage unearths great statistics, like this one: “One of every three bites of food eaten worldwide depends on pollinators, especially bees, for a successful harvest,” taken from the lede of Lizzie Grossman’s article at e360.
Kosek, publishing in Cultural Anthropology, is interested in deciphering the symbolic power of bees in American culture but, as above, the course of his writing unearths some unusual statistics. We learn in the abstract that “at present, the largest source of funding for apiary research comes from the U.S. military…” And given bees’ potential as intelligence gatherers, the military is now “deploying bees as efficient and effective homeland security detective devices,” Kosek tells us through an interview with the Stealthy Insect Sensor Project Team at Los Alamos.
So the dying worker is replaced by the mechanized warrior.