A round-up of articles from the last few months of 2017 on community policing, algorithmic bias, science communication, and climate change.
The New York Times investigates how police departments are responding to the end of federal assistance for collaborative community policing.
Since President Trump took office, for example, the Justice Department has not entered into a single court-monitored consent decree with a troubled police department, even in towns with widespread constitutional violations, records show. It has also ordered reviews of existing consent decrees — which are a tougher, more punitive alternative to the collaborative reform initiative.
The changes, designed to ease pressure on law enforcement, have actually encountered some resistance from police chiefs in cities that participated in the programs. And those chiefs work not only in big-city Democratic strongholds, but also in places like Spokane, which has a Republican mayor and is the largest city in a county that voted overwhelmingly for Mr. Trump.
The Chronicle of Higher Education covers how coastal colleges are coping with rising sea levels.
By 2050, in Norfolk, with even a moderate rise in sea level, the probability of at least one flood a year six feet above the current high-tide level will increase sevenfold. Such a flood would put almost 13,000 homes below water and consume more than five square miles of land.
Hemmed in by the Lafayette and Elizabeth Rivers, which define Norfolk’s shoreline, Old Dominion is in a precarious spot. To plan for the future, the university has brought together local government, businesses, and the U.S. Navy (which operates the world’s largest naval base, in the city) to share research and resiliency strategies. Research collaborations have helped direct federal funds to help the community prepare for what’s likely to come.
Elizabeth Kolbert explains why negative emissions may be climate change’s greatest moral hazard and a practical necessity.
When the I.P.C.C. went looking for ways to hold the temperature increase under two degrees Celsius, it found the math punishing. Global emissions would have to fall rapidly and dramatically—pretty much down to zero by the middle of this century. (This would entail, among other things, replacing most of the world’s power plants, revamping its agricultural systems, and eliminating gasoline-powered vehicles, all within the next few decades.) Alternatively, humanity could, in effect, go into hock. It could allow CO2 levels temporarily to exceed the two-degree threshold—a situation that’s become known as “overshoot”—and then, via negative emissions, pull the excess CO2 out of the air.
The I.P.C.C. considered more than a thousand possible scenarios. Of these, only a hundred and sixteen limit warming to below two degrees, and of these a hundred and eight involve negative emissions. In many below-two-degree scenarios, the quantity of negative emissions called for reaches the same order of magnitude as the “positive” emissions being produced today.
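To put the quoted counts in perspective, here is a quick back-of-the-envelope calculation. The scenario counts come directly from the excerpt above (the I.P.C.C. scenario database Kolbert describes); the script itself is just illustrative arithmetic, not part of any I.P.C.C. analysis:

```python
# Counts quoted above: of the 1,000+ scenarios the I.P.C.C. considered,
# 116 keep warming below two degrees Celsius, and 108 of those
# rely on negative emissions.
below_two_degrees = 116
with_negative_emissions = 108

share = with_negative_emissions / below_two_degrees
print(f"{share:.0%} of below-two-degree scenarios assume negative emissions")
# → 93% of below-two-degree scenarios assume negative emissions
```

In other words, on these figures, paths that stay under two degrees almost always lean on pulling CO2 back out of the air rather than on emissions cuts alone.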
In the opinion pages...
Tim Requarth explains why scientists can’t change minds through facts alone.
Is it any surprise, then, that lectures from scientists built on the premise that they simply know more (even if it’s true) fail to convince this audience? Rather than fill the information deficit by building an arsenal of facts, scientists should instead consider how they deploy their knowledge. They may have more luck communicating if, in addition to presenting facts and figures, they appeal to emotions. This could mean not simply explaining the science of how something works but spending time on why it matters to the author and why it ought to matter to the reader.
Ellora Israni discusses the Supreme Court’s recent decision not to hear a case that would have focused on how algorithms affect due process rights.
At Mr. Loomis’s sentencing, the judge cited, among other factors, Mr. Loomis’s high risk of recidivism as predicted by a computer program called COMPAS, a risk assessment algorithm used by the state of Wisconsin. The judge denied probation and prescribed an 11-year sentence: six years in prison, plus five years of extended supervision.
No one knows exactly how COMPAS works; its manufacturer refuses to disclose the proprietary algorithm. We only know the final risk assessment score it spits out, which judges may consider at sentencing.
Mr. Loomis challenged the use of an algorithm as a violation of his due process rights to be sentenced individually, and without consideration of impermissible factors like gender. The Wisconsin Supreme Court rejected his challenge. In June, the United States Supreme Court declined to hear his case, meaning a majority of justices effectively condoned the algorithm’s use. Their decision will have far-ranging effects.
Is the bitcoin boom evidence for a societal transfer of trust from institutions to code? Tim Wu thinks so.
Bitcoin’s rise may reflect, for better or worse, a monumental transfer of social trust: away from human institutions backed by government and to systems reliant on well-tested computer code. It is a trend that transcends finance: In our fear of human error, we are putting an increasingly deep faith in technology.