Chad Dickerson @ InfoWorld has touched on a provocative question that is certain to raise the hackles of IT managers (and vendors), especially if they fail to read beyond the headline.
Chad has written on the issue of IT as a commodity before: anyone who has worked on a real project (as opposed to a “click here to install” installation) understands that while the components become more manageable or commoditized over time, gluing them together still requires effort that vendors frequently gloss over or handwave away.
But when I look at the question — “Does IT matter?” — I see a different way of thinking about this. Too often, IT solutions are long on technology and short on information: they add complexity and/or sophistication to a process, but they don’t necessarily make the information easier to work with.
This is where traditional ROI evaluations break down in many cases. The line items on a project budget or capital allocation request don’t always (ever?) account for human factors, like the time spent on a given part of a process or how the information the project is supposed to deliver will actually be used.
I’m sure anyone who has worked in an enterprise with a tech infrastructure can point to some system integration project or other technology initiative that failed to factor in the people who use the information being “technologized.” How often are these projects promoted or defended on cost savings for the IT group or reduced support needs, versus faster or more accurate information access and delivery? And how often do those savings fail to materialize because of a lack of real understanding of the process being “improved”?
A more thorough project scope document would include some analysis and cost detail for the people who will work with the systems, be they external customers or internal ones, along with the costs of transition, training, and documentation.