Lifetime placement on a sex offender registry is one of many “collateral consequences of criminal conviction.” It is not formally part of criminal sentencing, and it falls outside the purview of our judicial system. Our courts recognize collateral consequences as civil, rather than criminal, penalties. They are considered, somewhat oddly, “non-punitive.”
There is a long list of grievances with regard to collateral consequences; the legal literature on this issue is broad, and grows broader every year. Writing recently in the volume Law as Punishment / Law as Regulation, University of Vermont associate professor Alec Ewald said,
Collateral consequences represent a damaging new manifestation of a virulent exclusionary tradition in American citizenship law. Denying millions of people full civic standing and autonomy is not new in American political life. What is novel is that collateral sanctions withdraw indefinitely core attributes of citizenship, and do so in a formal but shadowy way through the cumulative effects of scores of scattered, quasi-penal policies whose ambiguous legal status has in effect insulated them from necessary public, legislative, and judicial examination.
But an interesting new case has popped up in Washington, D.C., documented here by The Washington Post. Registered sex offender Dennis Sobin is “registering the registerers” at www.IdiotsRegistry.info. “If it’s not punishment to be on a list, we thought we’d put the people who do the registering on a list,” Sobin explained to the Post.
The website is making its way into the courts now: Sobin has been targeted with a lawsuit and is defending himself on First Amendment grounds. Comical as the whole bit seems, and a bit boorish maybe, the kernel of the issue speaks to an important and pervasive question facing our bloated system of incarceration: when, if ever, does a convict earn his redemption?
There are new clocks coming out that keep time using lasers. I wrote about them here, but didn’t have room to tell of one of their more interesting abilities.
Optical atomic clocks, as they’re known, provide unique insight into some of physics’ most fundamental constants. Take the words of Dennis McCarthy, Director of Time at the U.S. Naval Observatory until 2005, who clarified that “the kinds of optical atomic clocks we now see around the world are no longer clocks. They’re gravity meters.”
Gravity is the weakest known physical force. It’s very difficult to measure precisely. It also interacts faintly with the passage of time. New optical clocks are so sensitive as timekeepers that running two of them in parallel offset by a few centimeters in elevation allows scientists to measure gravity. Because one clock is closer to earth than the other, it will run slightly slower.
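The effect described above follows from the standard weak-field time dilation formula, Δf/f ≈ gΔh/c². A quick sketch of the arithmetic (the 33-centimeter offset is an illustrative figure, roughly the height difference used in early optical-clock demonstrations):

```python
# Fractional frequency shift between two clocks separated in height,
# from the weak-field gravitational time dilation formula df/f ~= g*dh/c^2.
g = 9.81         # surface gravity, m/s^2
c = 299_792_458  # speed of light, m/s
dh = 0.33        # height offset between the two clocks, m (illustrative)

shift = g * dh / c**2
print(f"fractional frequency shift: {shift:.2e}")  # ~3.6e-17
```

A shift of a few parts in 10^17 is invisible to ordinary clocks but sits comfortably within the resolution of the best optical atomic clocks, which is why a pair of them doubles as a gravity meter.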
The field of geodesy will likely benefit. We’ll start to move the clocks around. Like migrating birds on magnetic trails, these little machines will tune into gravitational pull. They will scan earth’s density as they travel, they will transcribe the silent imbrication of shifting tectonics.
Will somebody drive atomic clocks around on a truck bed? Will that be a job of the future?
The Office of Management and Budget is revising the social cost of carbon, and the gates are open for public comment. Go on! Contribute if you have an idea about how much climate change might cost, in a social sense.
The figure gets plugged into all kinds of benefit-cost analyses — today’s imprimatur of efficaciousness — in the hunt for optimal policies that weigh climate change mitigation and adaptation against other worthy ways to spend money. Is it economically sensible to build a seawall now and fend off the uncertain rise in ocean levels decades down the road? Is it a good idea to demand power plants capture their carbon and inject it underground (assuming that’s a functional technology)? Should we simply dump all of our money into renewables? Or what about into energy efficiency?
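To make the mechanics concrete, here is a toy version of how a social cost of carbon enters such an analysis. Every number below is an illustrative placeholder I am assuming for the sketch, not a figure from the OMB process:

```python
# Toy benefit-cost comparison using an assumed social cost of carbon (SCC).
# All values are illustrative assumptions, not official estimates.
scc = 40.0            # assumed SCC, $ per tonne of CO2
tonnes_avoided = 1e6  # tonnes of CO2 a hypothetical policy avoids per year
policy_cost = 30e6    # annual cost of the policy, $
discount_rate = 0.03  # assumed discount rate
years = 30            # policy lifetime

# Present value of a constant annual cash flow over the policy's lifetime.
def pv(annual):
    return sum(annual / (1 + discount_rate) ** t for t in range(1, years + 1))

benefits = pv(scc * tonnes_avoided)  # discounted climate damages avoided
costs = pv(policy_cost)              # discounted policy costs
print(f"benefit-cost ratio: {benefits / costs:.2f}")  # > 1 means it "pays"
```

Note how much hinges on the SCC and the discount rate: nudge either and the same policy flips from “efficient” to “wasteful,” which is part of what the critics discussed below object to.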
How do we get the most bang for our buck when dealing with climate change?
Another tough question. And maybe also the wrong question.
I wrote about this once before, covering a debate between law professor Doug Kysar and economics professor Matt Kotchen. And a summary published last month in Environmental Research Letters by Jonathan Koomey tackles the same thorny issue with full-throated advocacy for a new approach. “The benefit-cost approach,” writes Koomey, “while it has been useful in many contexts, has serious limitations that call into question its utility for analyzing climate change.”
So what do we do instead? The proposal is to focus on physical limits — 2 degrees Celsius, in this case — and devise the optimal policy for reaching that goal. In this way, we’re determining costs within given, necessary boundaries. It will be interesting to see if, or when, this takes hold.
At least Koomey sings optimism: “I call this method ‘working forward toward a goal’, and it’s one that will see wide application as we face the climate challenge in the years ahead.”
From, of all places, a word-a-day email: “Destroying species is like tearing pages out of an unread book, written in a language humans hardly know how to read, about the place where they live” (Holmes Rolston III, professor of philosophy).
The religion of economics provides an idolatry of rapid change, unaffected by the elementary truism that a change which is not unquestionable improvement is a doubtful blessing. The burden of proof is placed on those who take the “ecological viewpoint”: unless they can produce evidence of marked injury to man, the change will proceed. Common sense, on the contrary, would suggest that the burden of proof should lie on the man who wants to introduce a change; he has to demonstrate that there cannot be any damaging consequences. But this would take too much time, and would therefore be uneconomic. Ecology, indeed, ought to be a compulsory subject for all economists, whether professionals or laymen, as this might serve to restore at least a modicum of balance. For ecology holds that an environmental setting developed over millions of years must be considered to have some merit. Anything so complicated as a planet, inhabited by more than a million and a half species of plants and animals, all of them living together in a more or less balanced equilibrium in which they continuously use and re-use the same molecules of the soil and air, cannot be improved by aimless and uninformed tinkering. All changes in a complex mechanism involve some risk and should be undertaken only after careful study of all the facts available. Changes should be made on a small scale so as to provide a test before they are widely applied. When information is incomplete, changes should stay close to the natural processes which have in their favor the indisputable evidence of having supported life for a very long time.
(From Basic Ecology by Ralph and Mildred Buchsbaum)
In “The DNA Data Deluge,” a recent article in IEEE Spectrum, the authors elaborate at length on the following conundrum (which is a problem, in their opinion):
As [DNA] sequencing machines improve and appear in more laboratories, the total computing burden is growing. It’s a problem that threatens to hold back this revolutionary technology. Computing, not sequencing, is now the slower and more costly aspect of genomics research.
The performance of a DNA sequencer has improved three- to five-fold each year since 2008 — astonishing, and far faster than the improvement of computer processors, which still chug along according to Moore’s Law. (“Computer processors basically doubled in speed every two years over that same period,” say the authors.)
Take two recent studies written up in Chemical & Engineering News: “Nanoparticles In Athletic Apparel May Seep Into Sweat” and “Nanoparticles Could Disrupt Immune Cell Function.” In the first, scientists in Germany tested the toxicological implications of sports clothes interlaced with titanium dioxide and silver nanoparticles. “Little is known about what effects these materials have on human health,” according to the article. (I’m surprised this phrase is still used without irony.) The experiment ultimately presented three conclusions. First, the titanium is not absorbed by sweat at dangerous levels. Second, the silver may be absorbed at dangerous levels. Third, more experimentation is needed.
In the second article, researchers moved beyond the standard toxicological test of cell survival in the presence of nanoparticles and instead focused on cell function. They found that immune cells exposed to iron oxide are less effective at killing bacteria. Robert L. Tanguay of Oregon State University, who was not an author on the study, says these results suggest that nanoparticles may need additional safety testing on top of the standard toxicity studies. “That could open up a huge Pandora’s box,” he says, adding to the number of variables researchers consider when assessing nanomaterial toxicity.
* * *
I suppose if there really is a storm of progress, then it’s blowing hard as ever, pushing those angels along, uncovering one new mess after another.
It is crude but true that cancer makes for profitable business. The FDA approved 19 new drugs in 2012 and three this year, according to CenterWatch: Kadcyla ($9,800 per month), Pomalyst ($10,500 per 28 days), and Stivarga ($9,350 per 28 days). There is no doubt a need for more and better cancer treatments. No doubt.
But two new studies in the past month cast light on the trouble of pushing research harder than perhaps it should be pushed. In an article entitled “Characteristics of Oncology Clinical Trials: Insights From a Systematic Analysis of ClinicalTrials.gov” published in JAMA Internal Medicine, a number of authors conclude that the research on which cancer treatment is based often lacks the statistical rigor and robustness of treatment for other diseases. “An inherent tension arises between the desire to use new, life-saving treatments and the imperative to develop the evidence that patients, clinicians, regulatory agencies, and advocacy groups need to make sound decisions,” said Bradford Hirsch, lead author of the study, in a press release. “Unfortunately, the high prevalence of small studies that lack rigor limits the ability to assess the evidence supporting specific treatments.”
And released just yesterday in PLOS-One, a new study targets the widespread lack of reproducibility that afflicts much of the oncology literature:
When asked if investigators had ever tried to reproduce a finding from a published paper and not been able to do so, 54.6% (237/434) of all survey respondents said that they had, with 58.5% (154/263) of faculty having experienced the inability to reproduce data, and 48.5% (83/171) of trainees having the same experience. Of note, some of the non-repeatable data were published in well-known and respected journals including several high impact journals…
Upon finding results from a paper that could not be repeated the question arises; “what to do with that contradictory data?” When survey respondents were asked if they contacted the authors of the research paper, 78% (71/91) said that they had, but the ultimate results of that contact were very mixed…Overall, only 33.3% of respondents were ever able to explain or resolve their discrepant findings. This implies that 66.7% of conflicting results between research groups were not resolved through communication with the authors of the original publication.
These complementary problems run deep through the cycle that connects R&D with commercial markets, and solutions on any meaningful scale may require systemic examination and overhaul. But little choice exists. As one of the papers concludes, “the lives of our patients depend on it.”