So it appears. The cost of computing will continue to fall by a factor of 1000 every fifteen years. That means everything keeps changing.
Everyone knows that exponential growth can't continue indefinitely; eventually it hits a wall. But every time Moore's Law – the rule of thumb that the capacity of a chip doubles, and the price of raw computing power halves, about every eighteen months – seems to be heading for a wall, a door appears. Now it seems to have happened again.
Just to review the arithmetic: a doubling every eighteen months means ten doublings every fifteen years, and since two to the tenth is 1024, the cost of computation has, since 1948, been falling roughly a thousandfold every 15 years. As long as that rate of change continues, we can be certain that the future, in economic terms, will not resemble the past. Forgetting that is dangerous.
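For anyone who wants to check that arithmetic, here is a minimal sketch in Python, assuming the 18-month doubling period quoted above:

# If price-performance doubles every 18 months, how much does it improve over 15 years?
doubling_period_years = 1.5      # 18 months, as stated above
horizon_years = 15
doublings = horizon_years / doubling_period_years   # 10 doublings
improvement = 2 ** doublings                        # 2**10 = 1024
print(f"{doublings:.0f} doublings -> {improvement:,.0f}x cheaper computation")
# prints: 10 doublings -> 1,024x cheaper computation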
Just to review the arithmetic: ten squared is 1024
Ten squared is 100; you mean two to the power of ten.
He probably meant 2^10
(Weird tag support)
2^10 (radix 2) is what I meant.
This is one of the most pernicious fallacies since people started misunderstanding what is meant by "The butterfly flapped its wings in Brazil and a tornado appears in Iowa."
The cost of computation is already so cheap that it is essentially a non-issue for all but the most massive enterprises who are either (a) trying to vacuum up and manipulate The Universe and Three Examples; or (b) engaged on one side or another of an arms race involving ciphers.
The problem with generalizing ANYTHING from the cost of computing is that one can eat Alpha-Bits but one can't bite into bytes. Most of what will be made possible through the continuation of Moore's law is ephemera that won't matter a good goddamn three weeks later, because, while the processors are getting faster, the minds of most policymakers are becoming, if anything, even shallower and more superficial.
It's just another example of the theory of constraints, which is just a modern consultant's rehash of Liebig's Law of the Minimum: when your plant is starved of, say, phosphorus, no abundance of the other essential nutrients will matter an iota (if anything, it will further stress the plant).
Computing power is cheap and will keep getting cheaper, at least for a while yet. But our rising irrationality quotient just shows that we're not limited by computing power. Our societies are already DRIPs: data rich, information poor.
JMG, it's not just CPU power that's been growing – it's disk capacity and speed.
Disk storage turns into solid-state storage when the transistors get small and cheap enough. Likewise, smaller usually means less power per transistor. That lets a $600 device hold more storage than a desktop computer from 14 years ago.
I do agree that expanded computation abilities won't get you the moon. GIGO (Garbage In, Garbage Out) is still the name of the game.
And the only quantity exceeding the stupidity of users is the stupidity of the programmers.
Moore's law would matter a whole lot more, if the amount of computation required to achieve the same result weren't doubling every 19 months due to changes in programming practices… Ditto for memory requirements.
It's a fallacy IMHO to jump from the density of memory chips to the cost of computation. For one, the increase in the clock rate of processors has just about stopped because of heat issues: so the effort now goes into multi-core processors, which need more complex parallel software, and into more energy-efficient (rather than faster) processors for mobile devices. The true cost of computation should be assessed in terms of running useful programs: the manipulation of complex graphics for gaming, number-crunching for statistics, searching databases, and so on. Clearly these have not been getting faster in line with Moore's Law. Anecdotally, desktops today are only a little faster for daily work than they were 5 years ago. The Law may be technically still true, but its scope and relevance are shrinking all the time.
Oh, joy. Does this mean I have to start buying lots more soon-to-be-obsolescent crap for myself and my kids in order to keep up? I've been rather enjoying the slower pace of development lately.
James W, I recommend this cartoon:
http://www.ubersoft.net/comic/hd/2003/04/personif…
ALEX: Mark, computers do not mock. They're inanimate objects, for crying out loud…
MARK: They are sentient and malevolent, and they plot against me.
ALEX: Mark, they do not —
MARK: Alex, think. Every year, computers get faster and faster… but software keeps running at about the same speed. Where does all that extra power go?
(Click through to the punchline.)
Things don't actually have to get better for Mark's prediction to be true. They just have to get different.
Some of the commenters have talked about Amdahl's Law (which essentially says that things can't get faster than the slow parts of what you do) but I am much more concerned about Baumol's. If the productivity of virtual stuff (for some definition of "productivity") keeps increasing exponentially while the productivity of doing bricks-and-mortar or metal-and-plastic things increases linearly at best, physical stuff will keep getting more expensive, and — much worse — the incentives for investing in physical stuff will continue to decrease.
It's not really surprising, in this version of things, that we have an enormous shortfall in roads, bridges, water systems, clean energy and so forth at the same time that people can invent new billion-dollar financial instruments every other millisecond.
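For readers who haven't run into it, the Amdahl's Law mentioned above can be written down in a couple of lines; this is a generic sketch, not anything specific to this comment:

# Amdahl's Law: if only a fraction p of a task can be sped up,
# the overall speedup is capped at 1 / (1 - p), no matter how fast
# that fraction becomes.
def amdahl_speedup(p, s):
    """Overall speedup when fraction p of the work is accelerated s-fold."""
    return 1.0 / ((1.0 - p) + p / s)

# A 1000x improvement in 90% of the work still yields less than 10x overall:
print(amdahl_speedup(p=0.9, s=1000))   # ~9.91
print(1 / (1 - 0.9))                   # the asymptotic ceiling: 10.0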
James – it's not that desktop computers are doing more and more, it's that the minimum practical size of a computer gets smaller and smaller. Smaller computers spread out into more niches, allowing computation to occur in more and more places, using less and less power.
E.g., my daughter's hearing aids have more CPU power than a desktop computer I bought 20 years ago. (Best guess, not sure. It was a 20 MHz 386, and took many watts of power to run.) They do fancy digital signal processing to adapt to her specific hearing deficiencies. But unlike my computer of 20 years ago, they weigh a few ounces. With further size and power decreases, it's likely that new applications of technology will emerge – not just improvements to general purpose systems.
Long time reader, and I believe this is my first time posting, since this is something I feel I can speak to.
The interesting parts of consumer/personal computing are no longer in the desktop; they are in mobile computing and cloud-based services. Smartphones, netbooks, iPads, etc. all provide people instant access to any information they need. The computing power for complicated tasks does not need to be in your own computer; it can exist in the network, with results simply retrieved and displayed by the device you carry.
Moore's Law, which is really about the number of transistors available at a minimum cost, continues to have an impact on personal computing in a number of areas. The cost, power consumption, and size of computing devices continue to decrease, and thus become more easily accessible to a larger percentage of the population (think One Laptop Per Child http://laptop.org), while the computational ability of devices continues to increase. With more computation being done in big server farms that host services for everyone, the power consumption and computational ability of servers, perhaps seen as a more traditional beneficiary of Moore's Law, remain important. Finally, there is the networking and telecommunications equipment that glues it all together. Having worked in the networking and telecommunications industry for the last 12 years, I can tell you that the new multicore system-on-a-chip processors that are becoming available are pretty impressive. It is not about increasing the clock frequency, but about moving more and more functionality closer to the core of the CPU. The bottlenecks in computing these days are in accesses outside of the chip.
I think that for most people, it could mean you don't need to buy new stuff as often. Of course, companies will attempt to push upgrades, but realistically, for the average person, once you get a device, as long as it is supported and continues to function, there will likely not be the same driving force to upgrade to something better. The hardware will be good enough to do what you want, and upgrades will come through software and service updates in the network.
As a brief follow-on, this has implications in other areas as well. It becomes possible to make use of a multitude of wireless sensors that can relay data between one another, or to a central location, and that data can be processed in real-time. This could be used in a variety of applications to improve safety, or just increase human knowledge.
I think we are just starting to address the problem of how to digest and understand all of this data; we still have a long way to go there. As JMG implied, what use is all the data if we can't comprehend it and do something useful with it?
In Mark's defense, he's ten times as good at math as Megan McArdle. Or 1/10 as good, whatever. But seriously, he got the computation just right, if slightly misstated.
Just to take an obvious possible outcome: privacy will disappear entirely. Every telephoned word you say or that is said to you, every email or text you write or comment you leave or photo you take or appear in or look at, every website you visit or article you read, everything on your facebook page, every office you enter, every intersection you drive through, every subway entrance you walk into, every penny you spend, every paycheck or payment you receive, your school records, army records, employment records, credit reports – all will be instantly searchable at no cost.
Privacy is a concern; there are people working on that too, but it is obviously not just a technical issue. It comes down to privacy rights and being able to be in complete control of data about you. Imagine, for example, that you could create different groups, similar to facebook lists. Different groups would be allowed to see certain levels of information about you. If you had complete control of your personal information, you could choose to share location information with friends (already done today with many different applications, such as foursquare), but maybe also with certain advertisers to find things near you that might be of interest. Law enforcement would need a warrant for this information.
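A minimal sketch of the kind of group-based disclosure imagined here, in Python; the group names and profile fields are hypothetical, purely for illustration:

# Hypothetical group-based disclosure: each audience sees only the
# fields the data's owner has granted to that group.
PROFILE = {
    "name": "A. User",
    "location": "Midtown",
    "purchases": ["coffee", "train ticket"],
}

POLICY = {                       # owner-controlled: group -> visible fields
    "friends": {"name", "location"},
    "advertisers": {"location"},
    "law_enforcement": set(),    # nothing without a warrant
}

def view_for(group):
    """Return only the profile fields the given group may see."""
    allowed = POLICY.get(group, set())
    return {k: v for k, v in PROFILE.items() if k in allowed}

print(view_for("friends"))       # {'name': 'A. User', 'location': 'Midtown'}
print(view_for("advertisers"))   # {'location': 'Midtown'}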
I admit this is perhaps a bit optimistic, but regardless, there will need to be improvements in this area. Over time, more issues (identity theft, stalking, etc.) will arise from all this information being available, and there will be even more of a push to solve these types of problems.
We're heading into an era in which people's dirt will be public. And if everybody already knows the dirt, the power which blackmailers hold over societies might fade.
And if the power of blackmail is scheduled to recede, blackmailers might have to use their power now before it's lost.
So, expect a wave of blackmail, rising over, say, the next 10 years, before cresting and falling back?
If you take away a corrupt person's source of wealth or power, that's a dangerous time. They look for another source?
So this means that Social Security should be o.k.? I mean, what does it matter if there are five times as many recipients/worker if that worker is twenty times as productive?
Unless, of course, politicians and economics worked such that the worker never saw most of that productivity gain in the form of increased wages on which FICA was collected…but that would mean that an increasing proportion was being siphoned off by her bosses and the owners of the firms even as her situation became more and more precarious…and surely a free people in control of their nation wouldn't allow THAT!
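The implicit arithmetic in these two comments, as a sketch; every number here is made up purely to illustrate the point:

# Toy numbers for the dependency-ratio argument above (all hypothetical).
productivity_gain = 20       # the worker produces 20x as much
dependency_gain = 5          # 5x as many recipients per worker

# If wages tracked productivity, the payroll-tax base per recipient
# would still grow fourfold:
print(productivity_gain / dependency_gain)           # 4.0

# But if only a sliver of the gain reaches wages, the picture flips:
wage_share_of_gain = 0.10    # hypothetical: 10% of the gain shows up in pay
effective_wage_gain = 1 + (productivity_gain - 1) * wage_share_of_gain
print(effective_wage_gain / dependency_gain)         # ~0.58 -- worse off than today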
Some quibbles, first:
Moore stated his law, at various times and places, with various time spans for the doubling of the number of transistors that could be squeezed onto a given, say, square inch of silicon. Using Intel's own chips as a proxy, it appears in practice as if the doubling period has been around 2 years, maybe slightly more, not 18 months. That makes a huge difference in Mark's calculation: maybe as much as an order of magnitude difference over his chosen 15 years, since 2^7 is 128, while 2^10 is 1024.
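Worked out as a quick sketch, just the arithmetic of the point above with the two candidate doubling periods:

# How much the assumed doubling period matters over 15 years.
years = 15
for period in (1.5, 2.0):            # 18-month vs. 2-year doubling
    gain = 2 ** (years / period)
    print(f"{period}-year doubling: {gain:,.0f}x over {years} years")
# 1.5-year doubling: 1,024x over 15 years
# 2.0-year doubling: 181x over 15 years -- a factor of 5-6 short of 1000x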
As to whether the future can resemble the past, in the face of so much computing power, there's a further bit of calculation that might be noted: although doubling sounds like a lot, it actually takes something like an order of magnitude change (~10x or 12x) in computing power to get a subjective transformation in the gestalt experience. It takes 7 or 8 years to accumulate enough 2-year doublings to produce an order-of-magnitude change in potential computing power. You can see this kind of progress by tracing PC user-interface milestones:
1973-4: CP/M
1980-1: DOS
1990: Windows 3.0
1995-98: Windows 95/98; (internet takes off as a popular medium)
2001: Windows XP
2009: Windows 7
As others have noted, progress, now, is proliferation: smartphones and DVRs and netbooks and readers, to name just the devices familiar to consumers. And, the increasing bandwidth and pervasiveness of the network, symbolized by the evolution of the cell-phone network.
1G – 1979-84
2G – 1991-95 (GSM)
3G – 2001-
iPhone 3G – 2008
4G – (LTE)
Notice that the spacing of "system" changes seems to stretch out to nearly a decade.
The "doubling" of technological potential by manufacturing progress (which is what Moore's Law is about: progress in manufacturing) does make suddenness a feature of the order-of-magnitude changes that enable changes in our experience of the tech. The order-of-magnitude change necessary to enable such life and culture changes as cellphones, or smartphones (or personal computers, or Windows/Mac graphical user interface computers, or the World Wide Web) may take 7 or 8 or 10 or a dozen years. But here's the thing: the last doubling to get over the threshold for that transformative user experience still only took place in the last 2 years. Half the advance in hardware necessary for a change that gestated for a decade took place only in the last two years.
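That "half the advance arrives in the last doubling" observation is a general property of any doubling sequence; a quick illustration:

# In any doubling sequence, the final doubling contributes roughly
# half of the total advance accumulated so far.
levels = [2 ** n for n in range(6)]        # 1, 2, 4, 8, 16, 32
total_advance = levels[-1] - levels[0]     # 31
last_doubling = levels[-1] - levels[-2]    # 16
print(last_doubling / total_advance)       # ~0.52: the last step is about half the climb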