Meet the Siemens SP260D

Electric motors are taking over from ICEs, for everything.

To make a change from the ongoing TV fantasy drama The Fall of the American Empire, aka The Game of the Throneless, let me introduce you to the Siemens SP260D.

This is an electric aircraft motor. More details here.

This is only the second of Siemens’ efforts in the line, though they have been making electric motors since the 1890s. (AEG beat them to it, in 1889.) The striking datum is the power-to-weight ratio: 260 kW (footnote) from 50 kg, making 5.2 kW/kg. What should we compare this to?

A table of power-to-weight ratios for a sample of engines on the market today.

(References: Siemens, Magnix, Lycoming, Tesla, Honda, Mercedes-AMG)
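The arithmetic behind the headline figure is simple enough to sketch in a few lines. The SP260D numbers come from the text above; the comparison entries are purely illustrative placeholders, not the values in the table (which is an image in the original post).

```python
# Power-to-weight ratio in kW/kg. The SP260D figures (260 kW, 50 kg) are
# from the post; the comparison entries are hypothetical placeholders.
engines = {
    "Siemens SP260D": (260.0, 50.0),                    # (power kW, mass kg)
    "hypothetical piston aero engine": (200.0, 180.0),  # placeholder values
    "hypothetical EV traction motor": (150.0, 50.0),    # placeholder values
}

for name, (power_kw, mass_kg) in engines.items():
    print(f"{name}: {power_kw / mass_kg:.1f} kW/kg")
# Siemens SP260D: 5.2 kW/kg
```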

Continue reading “Meet the Siemens SP260D”

Shock news: denialist hack trashes electric cars

Bret Stephens is out to lunch on electric cars.

NYT journalist Bret Stephens has written a column attacking Elon Musk as “the Donald Trump of Silicon Valley”. Musk, whose 27% share of Tesla stock is currently worth $13.2bn, can look after himself. Perhaps Stephens has friends in the dispirited coterie of Tesla bears who need a helping hand?

What interests me is Stephens’ undocumented attack on Tesla’s main product, electric cars.

Tesla, by contrast, today is a terrible idea with a brilliant leader. The terrible idea is that electric cars are the wave of the future, at least for the mass market. Gasoline has advantages in energy density, cost, infrastructure and transportability that electricity doesn’t and won’t for decades. […] Electric vehicles were supposed to be the car of the future because we were running out of oil – until we weren’t.

Set aside the easily checked fact that governments subsidise electric cars not because they worry the world is running out of oil, but because of climate change and urban air pollution – plus a good dose of energy independence, as in China and India. Let’s see how electric cars have actually been selling. A chart from the IEA:

Source: IEA, Global EV Outlook 2017, data in Tables 4-6

The 5-year CAGRs are: PHEVs 143%, BEVs 85%, all EVs 107% (see spreadsheet). Continue reading “Shock news: denialist hack trashes electric cars”
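For anyone who wants to check figures like these, a minimal CAGR sketch follows. The stock values in the example are placeholders chosen only to show that a roughly 38-fold rise over five years corresponds to the ~107% quoted for all EVs; they are not the IEA table numbers.

```python
def cagr(start, end, years):
    """Compound annual growth rate between two stock levels."""
    return (end / start) ** (1.0 / years) - 1.0

# Placeholder illustration: a stock that grows ~38-fold over five years
# has a CAGR of about 107%, the figure quoted above for all EVs.
print(f"{cagr(1.0, 38.0, 5):.0%}")   # -> 107%
```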

Annals of commerce: product downgrades

Not everything you buy is getting better. Here are a couple of pet peeves:

I. Unfinished cast iron cookware

Cast iron skillets have been popular for decades. Properly seasoned and cared for, they last pretty much forever, are easy to clean, and are especially good at browning meat owing to the Maillard reaction that is catalyzed by iron. They used to be made with two well-established technologies. The first is sand casting, and it’s the same way the engine block of your car is made. First, a wood pattern is made in the shape of the desired pan, but larger by about 1/8″ per foot because the pan will shrink as it cools. This pattern is embedded in damp sand in a two-part mold, withdrawn without disturbing the sand, and molten iron is run into the cavity it leaves.

The result of this process is (1) a rough casting with a very scrabbly surface of mill scale, ready to machine to the required dimensions and finish (the second technology). Back in the day, the skillet was (2) put on a lathe and the inside turned to a perfectly flat inner bottom and smooth sides. This removes the hard, sandy layer on top and exposes the cast iron. You can find these pans at garage sales and on eBay, and if they’re not too old and worn, you can still see the spiral track of the lathe tool on the pan.

The skillet you will find today at your hardware store is probably a Lodge, a company that used to make its wares correctly but has discovered a wonderful way to cut corners: just skip step (2), give the rough casting a coat of black paint, and call it “pre-seasoned”! Here is what a new skillet made this way looks like.

You might make this smooth by trying to get your fried eggs off it with metal spatulas – after a century or so. Continue reading “Annals of commerce: product downgrades”

Peak gasoline

Gasoline demand will peak in 15 years, more or less.

It’s pretty chart time again!

A natural follow-up to my very broad-brush survey of the global emissions trajectory is: when can we expect oil demand, one of the big components, to peak?

To a first approximation, oil is used for transport by land, sea and air. The biggest chunk is gasoline for cars and diesel for trucks. These are still growing, and will continue to do so for some time. So start with gasoline for cars. When will this peak? I have had a go.

Cars last about 20 years, and every year under 4% of the growing fleet is scrapped. The fleet’s annual net increase is linear, like total sales. When annual sales of new electric cars pass that net increase, the total stock of ICE cars will peak and start to fall.

The growth of electric car sales is very rapid and exponential, but it’s also uncertain. I took three scenarios: the 58% CAGR that fits the last five years of data, and more cautious lower rates of 40% and 25%. Sales of EVs pass the net growth in the car fleet in 2026, 2030, and 2037 respectively: 10 to 20 years from now. If I had to guess a “peak ICE cars” year, I would go for 2032, 15 years ahead.
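As a rough sketch of the crossover logic (not the spreadsheet behind the post): grow annual EV sales at a chosen CAGR until they exceed an assumed constant net increase in the fleet. The starting sales figure and the net-increase figure below are placeholder assumptions, so the printed years will not exactly match the ones above.

```python
def crossover_year(base_year=2017, ev_sales=1.0e6, net_fleet_growth=40.0e6, rate=0.58):
    """First year in which annual EV sales exceed the fleet's annual net increase.

    ev_sales and net_fleet_growth are placeholder assumptions, not the
    post's spreadsheet inputs; rate is one of the three CAGR scenarios.
    """
    year = base_year
    while ev_sales < net_fleet_growth:
        ev_sales *= 1.0 + rate
        year += 1
    return year

for rate in (0.58, 0.40, 0.25):
    print(f"{rate:.0%} scenario: crossover around {crossover_year(rate=rate)}")
```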

The total stock of ICE cars is a fair proxy for gasoline consumption, so the same years are possible peaks for that too. The range is disappointingly wide, but it is not useless information for global emissions. If diesel tracks gasoline (I think it will), the overall peak in oil demand will come at the same time.

A net zero economy requires a complete ICE phaseout: not peak gasoline and diesel, but zero. To get there by 2050, all new ICE sales would need to stop around 2035, a much tougher proposition. Still, we have seen with coal that once the rot really sets in, things speed up. Some markets – I fancy diesel buses – may collapse completely quite soon.

A lot could go wrong. But a lot could also go better. It’s a fat risk distribution.

High-fibre background and speculation below the jump. There is not yet enough sales data for commercial electric vehicles to allow even a guesstimate for the phaseout of the competing diesels; but I offer qualitative reasons for thinking that they will follow a similar trajectory.

Continue reading “Peak gasoline”

Crime and Big Data: Autopilot vs. Power Steering

There has been a host of recent articles and books decrying the use of “big data” to make decisions about individual behaviors. This is true in commerce (Amazon, Facebook, etc.), but also true in criminal justice, my field of research. Moreover, some of the algorithms that forecast dangerousness are proprietary, making it all but impossible to determine the basis for challenging a sentence based on the algorithm’s outcome. Recent books, such as Weapons of Math Destruction and The Rise of Big Data Policing, underscore the dangers of such activity. This is the essence of an autopilot approach to forecasting behavior – hands off the wheel, leave the driving to us.

There is some research that supports this type of algorithmic decision-making. In particular, Paul Meehl, in Clinical versus Statistical Prediction, showed that, overall, clinicians were not as good as statistical methods at forecasting either failure on parole or the efficacy of various mental health treatments. True, this book was written over fifty years ago, but it seems to have stood the test of time.

It is dangerous, however, to relegate to the algorithm the last word, which all too many decision-makers are wont to do (and against which Meehl cautioned). All too often the algorithms, often based on so-so (i.e., same-old, same-old) variables – age, race, sex, income, prior record – are used to “predict” future conduct, ignoring other variables that may be more meaningful on the individual level. And the algorithms may not be sufficiently sensitive to real differences: two people may have the same score even though one person may have started out doing violent crime and then moved on to petty theft, while the other may have started out with petty crime and graduated to violent crime.
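A toy example makes the point about sequence-blindness concrete. The scoring rule below is entirely hypothetical – a naive additive score over a static variable and offense counts – but it shows how two opposite criminal trajectories land on the same number.

```python
# Hypothetical, naive additive risk score: it counts offenses by type but
# ignores their order, so opposite trajectories score identically.
def naive_score(age, prior_offenses):
    score = 2 if age < 25 else 0
    score += sum(3 if offense == "violent" else 1 for offense in prior_offenses)
    return score

offender_a = ["violent", "violent", "petty", "petty"]  # began violent, de-escalated
offender_b = ["petty", "petty", "violent", "violent"]  # began petty, escalated

print(naive_score(22, offender_a))  # 10
print(naive_score(22, offender_b))  # 10 - same score, very different careers
```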

That is, the fact that a person has a high recidivism score based on the so-so variables should be seen as a threshold issue, a potential barrier to stopping criminal activity. It should be followed by a more nuanced look at the individual’s additional life experiences (which do not fit into simple categories, and therefore cannot be included as “variables” in the algorithms). That is, everyone has an age and a race, etc., but not everyone was abused as a child, was born in another country, or spent their teen years shuffling through foster homes. Therefore, these factors (and, as important, their timing and sequence) are not part of the algorithm but may be as determinative of future behavior as the aforementioned variables. This is the essence of a power steering approach to forecasting behavior – you crunch the data, but I decide how to use it and where to go.

Regarding power steering, I’m sure that many of you would rather look at an animated map of weather heading your way than base your decisions (umbrella or not?) on a static (autopilot) weather forecast (BTW, does a 30 percent chance of rain refer to the likelihood of my getting wet in a given time period, or to the fact that 30% of the area will be rainy and may skip me entirely?). The same issues arise in crime analysis. A few years ago I coauthored a book on crime mapping, which introduced the term that heads this post. In that book we described the benefit of giving the crime analyst the steering wheel, to guide the analysis based on his/her knowledge of the unique time and space characteristics of the areas in which the crime patterns developed.

In summary, there’s nothing wrong with using big data to assist with decision-making. The mistake comes in using such data to forecast individual behavior to the exclusion of information that is not amenable to data-crunching because it is highly individualistic – and may be as important in assessing behavior as the aforementioned variables.

National Grid and the end of British coal

The British grid operator flags its first day without coal.

Somebody in the National Grid control room in the UK has a sense of history.

NG is greatly understating the length of the British coal story. Electrical power stations date only from the 1880s, but the switch to coal in England was already under way by 1600, as the trees ran out and wood prices soared. The first Industrial Revolution in the 18th century depended on coal; the steam engine was invented to pump water out of mines. Since coal is no longer burnt for anything other than electricity, the day marks the end of the four-century coal era in the first modern industrial country.

National Grid are noticeably not moaning about their loss of “baseload” coal plants, unlike Rick Perry. In her successful 1989 electricity privatisation, Margaret Thatcher split transmission, a technical monopoly, from generation, which can and should be competitive. National Grid started out in the public sector; it was later privatised, but remains tightly regulated with a public-interest mandate and no generating assets. (In the US the company operates as a conventional mixed utility.) The model, as good ideas tend to do, has spread to Texas, Australia, China, India, Denmark, and Germany, and no doubt others. These grid operators are uniformly phlegmatic about the energy transition. They publish reports about how to integrate 20%, 40%, 100% renewable electricity in their grids, how to fix the problems, and how much it will cost. They never SFIK say: stop this, we can’t cope, a secure supply requires baseload coal or nuclear plants. You only hear this biased testimony from the old-fashioned silo monopolies in parts of the USA and in Japan.

Another telling detail is the little gaps in the chart: the few surviving British coal generators have not been running at night. This is orthodox Econ 101: the grid control room has a merit-order list of generators, and will call on the cheapest first. Renewables have zero marginal cost, and therefore go first when demand is low in the small hours. Orthodoxy is terrible for the owners of coal plants, which were designed and financed on the assumption that they would run as “baseload”, that is almost all the time, with spikes in demand met by more expensive gas generators. This effect is cutting into the returns from coal plants in Germany, Texas, Colorado, and India even faster than the slide in the comparative LCOE of new build.
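Merit-order dispatch is easy to sketch: sort generators by marginal cost and call on them, cheapest first, until demand is met. The plants, capacities and costs below are made-up illustrative numbers, not anything from the British grid.

```python
# Illustrative merit-order dispatch: capacities in MW, marginal costs per MWh.
# The plant list and numbers are hypothetical, chosen only to show the logic.
plants = [
    ("wind",     6000, 0.0),   # zero marginal cost: dispatched first
    ("solar",    2000, 0.0),
    ("nuclear",  8000, 10.0),
    ("coal",     9000, 30.0),
    ("gas",     12000, 45.0),
]

def dispatch(demand_mw):
    schedule, remaining = [], demand_mw
    for name, capacity, cost in sorted(plants, key=lambda p: p[2]):
        output = min(capacity, remaining)
        if output > 0:
            schedule.append((name, output))
        remaining -= output
    return schedule

print(dispatch(30000))  # high demand: coal and gas both get called
print(dispatch(14000))  # low demand: zero-cost plants plus nuclear cover it; coal sits idle
```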

A Nobel Prize for John Goodenough

It’s long past time.

Over the jump, an open letter to the Chemistry Committee of the Nobel Prizes urging them at long last to award the Nobel Prize to Professor John Goodenough, who invented the lithium-ion battery. If you agree, you can email a message of support to the Royal Swedish Academy of Sciences at info@kva.se. Continue reading “A Nobel Prize for John Goodenough”

Gresham’s Second Law

The inventor of the Web writes to its 2bn users.

Sir Tim Berners-Lee, who with Robert Cailliau created the World-Wide Web 28 years ago with the specification for HTML, has published an open letter to the Web’s 2 billion users today.

The text is here, in English, French, Spanish, Portuguese and Arabic. He invites everybody to share it, so I’ll save you the fatigue of clicking on the link and reproduce it below the jump. Some quick comments from me to get you going.

Internet users by country, 2011

1. Berners-Lee is one of the few people who can speak with real authority on this stuff. If he says we have big problems, it’s a safe prior that we do. If he says they can be fixed, there is a very good chance they can.

2. The approach is too narrow. I rely here on another authority, Mike O’Hare of this blog. He has written about the crisis in society created by the arrival of transmission and reproduction of information at near-zero marginal cost, leading to the implosion of subscriber revenues for journalism and the dramatic thinning out of newsroom staff.

Let’s give this insight a catchy name: Gresham’s Second Law.

Thomas Gresham was an English financier of the Elizabethan era, who has given his name to the first genuine economic law: that is, a generalisation based on solid observation and explained by a robust theory. (He had eminent predecessors, including Copernicus, so the attribution is a little unfair.) The Law reads:

Bad money drives out good.

That is, with a bullion specie currency, when the king debases it by reducing the bullion content of the coins, the price of the metal rises in nominal terms, and anyone who can get hold of the old, fatter coins can make a quick doubloon by melting them down. So the good old coins disappear.

Gresham’s-nth-granddaughter’s Second Law, which I have just invented, is similar:

Bad information drives out good.

The cost of production of good information – science, literature, accurate reporting – is high. The cost of production of lies, bullshit, smears, pornography, and rumours is negligible. On the consumption side, bad information is designed to appeal to our lower human nature (Kahneman’s System 1). Good information is often difficult, unwelcome or both, and requires the support of the lazy System 2. So the good information always has a struggle to be heard.

Now consider a technical innovation that lowers the cost of reproduction or transmission of information: say from hand copying to print, or print to the Internet. In the print era, Adolf Hitler had to struggle to get his message across. He had to find a printer for Mein Kampf (the title was accurate). He had to build a united movement from the hard-right flotsam floating round Munich, through endless face-to-face meetings in beer halls. Even in favourable conditions, it took him a decade before he could mount a credible challenge to gain power. Contrast Donald Trump. Starting from nowhere politically in 2015, he won an election in a much larger country with little more infrastructure than a Twitter account and support from the Breitbart website.

The contrast can be explained in terms of Gresham’s Second Law. The drop in transmission costs removes an obstacle to the dissemination of bad information, and releases its advantage in lower costs of production. So the problem has got worse.

3. Berners-Lee is right that we need to brake bad information. His example of political advertising linking to fake news is just one abuse. Americans in particular need to rethink free speech absolutism. Citizens United represented to many of us a reductio ad absurdum. As legal persons, corporations are slaves, with inferior, not equal rights to the humans they serve. Should corporate bodies – with access to much bigger megaphones than individuals – be held to a higher standard of care in their public speech? Companies that mislead their stockholders face severe sanctions, and deception in advertising is limited to suggestio falsi and suppressio veri, outright lies being banned. I don’t see why the privilege of corporate political and cultural speech should not face analogous restrictions.

4. We also have to think positively: how can the good information be paid for? The answer for science has been socialism leavened by philanthropy. Literature and music seem to be doing all right in the market system, though that’s just a non-expert impression. A sufficient number of customers for music seem prepared to pay one or two dollars for a song rather than pirate everything, a convention that relies more on an honesty-box ethic than on sanctions. The immediate crisis is in reporting. It’s good news that Sir Tim’s team will be looking at micropayments. They should be looking at socialism too. It’s already how we pay for education and health.

* * * * * *

(Letter over the jump) Continue reading “Gresham’s Second Law”

Balkin’s Three (revised) Laws of Robotics

Balkin updates Asimov.

The eminent Yale scholar of, and blogger on, constitutional law Jack Balkin has published a very nice article updating Isaac Asimov’s famous Three Laws of Robotics.

It’s a rich short essay, not a treatise; an opening shot in a new debate, not the last word. Read it and comment please.

A few takes of mine. Continue reading “Balkin’s Three (revised) Laws of Robotics”

Network Architecture, Social Media, and Social Justice: Some Preliminary Thoughts

Since the election, I’ve seen a lot of writing about the ways in which social media can help explain our contemporary political divide, from the rise of fake news on Facebook to the belated ban of racists on Twitter (sorry, I don’t use the politically correct term “alt-right”, and, by the way, how ironic is it to be PC about that group, of all people) to the existence of news bubbles (as foretold by Eli Pariser). I worry, though, that too much of this conversation focuses on policy fixes, and not enough looks at the deeper issues: how these systems and networks are designed from the ground up. We seem to be talking about how to fix Facebook, but not about whether Facebook can be fixed at all.

Facebook is private property and its rules of discourse are governed by proprietary algorithms designed with revenue-maximization–not social benefit–in mind. Any social benefits–and there are many–are incidental. So even though it acts as a de facto town square, it’s really a town mall. And that makes the ground rules–and our ability to change them–different. The same is true of all other platforms, not just Facebook. What I want to explore is how particular architectures embed certain kinds of expectations about people and their needs, and the ways in which these structures facilitate certain activities and circumscribe others.  Ultimately, is there a way for us to consider society as we build out new platforms, particularly when it comes to social media, where people are both the producers of value and its consumers? Put another way, how is it possible that, after the rise of literary theory, we are not thinking more deeply about a system where private corporations control the ways in which language is produced, distributed, and consumed? Surely that is affecting its content–and its potential.

If Walter Benjamin were alive, I imagine he would talk about the ways in which the cyber flaneur is constrained. We live in intentional online communities. There is little juxtaposition or happenstance. We go online, typically, with a search in mind–and even though we use a “browser” to get there, we are motivated to get to our destination. Searching and browsing are fundamentally different. Alternatively, we are on Facebook, which is less directional. We are looking for stuff to see, but we are only doing this from known entities, not ones we happen to encounter, and this stuff to see is typically disconnected from our corporeal selves that eat, live, work, play, and travel through real spaces, through different neighborhoods whose inhabitants have different perspectives. Even our offline movements are increasingly point-to-point, designed with minimal travel times in mind, through the aid of navigation apps.

How you feel about this depends on what your goals are (and where you are in the system). The mid-20th century drive to build highways through American city centers made a certain kind of sense: it allowed for speedy transfer of suburbanites to their work in the urban core. As long as this was the goal, highways were the answer, and the governmental subsidies that embedded these policy choices made sense. But highways through cities, of course, also destroy urban neighborhoods, make mixed-use development more difficult, and create sprawl and its attendant social isolation. You need only read The Power Broker, or have grown up in suburban Atlanta, to understand that. So if your goal is vibrant city life, highways–and subsidies–don’t make sense. I fear that we are currently building highways and gated communities in cyberspace without thinking about their potential side effects. We are only concerned with efficiency, not sustainability. Continue reading “Network Architecture, Social Media, and Social Justice: Some Preliminary Thoughts”