McKinsey’s View on Whether Information Technology Matters

Ten years ago this month, the Harvard Business Review published Nicholas Carr's article "IT Doesn't Matter," which argued that information technology doesn't create strategic advantage.

To be sure, there are plenty of mundane IT applications—expense reporting and benefits enrollment systems, for example—but if IT categorically doesn't matter, how does one explain, say, Google's success? After all, Google went from a graduate student research project to over $50 billion in annual revenues and a market cap of over $300 billion as of Wednesday, in large part by leveraging IT that embodies innovative algorithms delivered cost-effectively at scale. Recent research from Will Forrest, a Principal at McKinsey & Company and one of the world's leading experts on the business value of IT, helps illuminate whether IT matters, and if so, how. It does so by differentiating old approaches such as labor automation from new ones such as digital products and services, team collaboration tools, and business model transformation.

Before delving into Forrest's insights, let's review the opposing positions. Carr's argument, simply stated, is that IT doesn't matter because it has become ubiquitous and is thus a commodity. He observed that "what makes a resource truly strategic…is not ubiquity but scarcity." In other words, if every firm has access to the same servers, storage, networks, or packaged software (today one might add cloud infrastructure, platform, and software as a service to the list), then no firm can gain the upper hand by using them. If this argument is correct, then IT should be the focus of disciplined cost management.

