The economist William Stanley Jevons made an observation about coal in his 1865 book, “The Coal Question”. It became known as the Jevons paradox:

      increases in energy production efficiency lead to more, not less, consumption

In Jevons’ time the big worry was that England would run out of coal. It was hoped that more efficient coal use (e.g. better steam engines) would lower consumption, and therefore England’s coal reserves would last a lot longer.


Economics is very often counter-intuitive. Jevons argued the opposite would happen: if a doubling of fuel efficiency more than doubled the work demanded, then overall coal use would increase. If improved technology had a lesser effect, coal use would decrease, as perhaps expected.
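Jevons’ condition can be sketched numerically. This is a toy model with a made-up constant-elasticity demand curve, not Jevons’ own mathematics: demand for useful work responds to efficiency with some elasticity, and coal burned is work divided by efficiency.

```python
def fuel_consumed(efficiency, elasticity, base_work=100.0):
    """Toy model of coal burned to satisfy demand for useful work.

    Assumes work demanded follows a constant-elasticity response to
    efficiency gains -- a simplifying assumption for illustration.
    """
    work_demanded = base_work * efficiency ** elasticity
    return work_demanded / efficiency

# Elastic demand (> 1): doubling efficiency *increases* total coal use
print(fuel_consumed(1.0, 1.5))  # 100.0
print(fuel_consumed(2.0, 1.5))  # ~141.4 -- the paradox

# Inelastic demand (< 1): efficiency gains do reduce coal use
print(fuel_consumed(2.0, 0.5))  # ~70.7
```

The crossover is exactly at an elasticity of 1: above it, efficiency gains feed consumption; below it, they get banked as savings.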

What does this mean for IT? Let’s consider the raw input costs of IT projects: Infrastructure, Software & People.


Infrastructure

… is of course getting cheaper and cheaper, while at the same time getting more powerful. Moore’s Law still holds and is driving greater processing power at lower cost. There are grumblings about the law’s demise, that chips are getting more expensive to develop, but big brains such as Andy Bechtolsheim assert that the law is still alive and well.

Networking speeds double every two years or so (Nielsen’s Law), and the value of any network increases with the number of connected devices (Metcalfe’s Law). Connecting to decent bandwidth is cheap but highly valuable and necessary. With the rise of mobile networking the same trends are occurring, but now the network is available anywhere.
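The two laws compound nicely, which a quick sketch shows (the base figure of 100 Mbps is an arbitrary starting point for illustration):

```python
def nielsen_bandwidth(years, base_mbps=100.0):
    # Nielsen's Law: a high-end user's bandwidth doubles roughly every two years
    return base_mbps * 2 ** (years / 2)

def metcalfe_value(n_devices):
    # Metcalfe's Law: network value grows with the number of possible
    # pairwise connections, i.e. roughly with n squared
    return n_devices * (n_devices - 1) / 2

print(nielsen_bandwidth(10))  # 3200.0 -- a decade turns 100 Mbps into 3.2 Gbps
print(metcalfe_value(200) / metcalfe_value(100))  # ~4: doubling devices roughly quadruples value
```

So while links get faster at a fixed exponential clip, the value of connecting grows faster than the device count itself, which is why mobile networking is such an accelerant.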

Manufacturing costs have plummeted. One just has to look at the cost of a Raspberry Pi. Better manufacturing automation, low margins and stiff competition are driving continual investment in lower production costs.

Then of course you have cloud computing, which is consolidating data centres into mega data centres and banking huge economies of scale. Did you know that every dollar of revenue to Amazon Web Services results in three or four lost dollars to established vendors? This is showing up in the results of IBM and Oracle.


Software

Cloud computing extends up into the software realm too, and this impacts those previously mentioned tier-1 vendors. COTS software is now SaaS.

Agile development is reducing the risk of development projects and this lowers cost further. A minimum viable product can be produced to prototype new platforms at very low cost.

Open Source has been underwriting software cost reductions for almost 20 years.

The age of trolling software rent seekers may be coming to an end. We live in hope.


People

Massive outsourcing and off-shoring has had some impact on stagnating wages in IT, thereby limiting, or perhaps burying, costs. (Phew, got that distasteful line out of the way.) Countering this, wage costs are downwardly rigid or, as normal non-economists say, wages don’t go down much. The people cost can be one of the most expensive parts of any project.

That said, collectively people have become much more efficient. IT departments are more efficient through consolidation of data centres, standardisation, automation, orchestration, Green IT and the commoditisation of particular platforms (e.g. VoIP, email, operating systems). IT operating budgets shrink or stagnate, but more gets done.

Where IT departments are not efficient and have high transaction costs (e.g. deploying a database costs $100,000 and takes four weeks), the department is being circumvented and a cloud solution deployed, even if the overall cost is higher over time.
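The trade-off is easy to see with some entirely hypothetical figures: a big fixed transaction cost up front versus a higher recurring charge.

```python
def cumulative_cost(upfront, monthly, months):
    # Total spend after a given number of months
    return upfront + monthly * months

# Hypothetical figures for illustration only
internal = cumulative_cost(100_000, 2_000, 24)  # big fixed transaction cost
cloud = cumulative_cost(0, 7_000, 24)           # nothing up front, higher run rate

print(internal, cloud)  # 148000 168000 -- cloud costs more over two years
print(cumulative_cost(100_000, 2_000, 1))  # 102000 at month one
print(cumulative_cost(0, 7_000, 1))        # 7000 -- why projects route around IT
```

The project team circumventing IT is responding to the month-one numbers, not the two-year ones.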

The outcome

Overall costs in IT continue to drop for all the above reasons. In line with the Jevons paradox and England’s coal problem, though, IT doesn’t bank the savings and use fewer computing resources. The opposite happens. IT consumes more. Why is that?

Projects we’d never have done in the past because they were too expensive, or difficult to tie to economic transactions, now become viable (e.g. engagement systems such as social marketing). Companies can pursue a wider range of projects. Smaller companies can develop capabilities previously only available to large organisations. And these new capabilities, once pursued and attained in a market, become a cost of doing business. They can also support the creation of further advanced capabilities. More demand for “work” is generated in IT than is saved by the extra efficiency.

Some costs and problems across the enterprise get worse, especially those that have to deal with the complexity of the entire ecosystem: for example, performance management, change transformation, security, SOA, enterprise architecture, orchestration, cloud architecture, networking, data stewardship, power and cooling, and application testing. These costs and problems are systemic, though. They don’t stop new projects. In fact, over time they create their own requirements and therefore projects.

Where will this all end? Back to Jevons again: it was expected to end with the exhaustion of coal. What resource will be exhausted first in IT? Electricity? Software skills? Our ability to manage complexity? Processing power? Physical data centre costs and space? I can’t see any end in sight yet, folks… feed the beast!