This is how I roll. For the past 25 years, I have marked the end of summer not by devouring the closing page of the latest beach read or squeezing a lime into the final gin and tonic of the season, but by asking IT leaders, "What are you doing to prepare for the future?"
Two responses have typically predominated. Over the years, CIOs have said that they were either "fixing" IT or "focusing" IT on delivering what the business needs.
Here's some good news: This year's data indicates that "fixing IT" has all but disappeared from line-of-sight preparations for the future. I am pleased to report that in 78% of the Global 2000, IT does not suck. The various IT modernization efforts launched slowly and with limited funding in the shadow of the financial trauma of 2007-11 appear to have taken hold. Most of the "turnaround" CIOs who were airlifted into troubled IT shops have stabilized computational resources and succeeded in rendering enterprise IT nontoxic to key stakeholders.
The IT community is to be commended for this. We are talking about transforming a massively heterogeneous, mind-bogglingly complex array of technologies and methodologies that borders on being unmanageable into a stable and value-producing asset. That was accomplished in the face of ridiculously low budgets, a vendor marketing approach in which more money is spent on golf outings and sporting events than on R&D or thought leadership, and a tragically tech-illiterate corps of executives. Kudos all around to the IT tribe.
Having gotten IT somewhere that was well worth going to, we still have to wonder what more the future holds. (It's a question that will expire only when we have no more future to look forward to.) Well, it would be nice if IT budgets went up, but we can't expect that anytime soon. I seriously envision a day when boards of directors will fire CIOs for not spending enough money on IT, but you don't need me to tell you that that day has not yet arrived.
And it isn't just our own budgets that matter. All IT practitioners wish that our suppliers spent less on the swag (pens, flash drives, T-shirts) they distribute at those out-of-date, pipe-and-drape trade shows, and more on investments in understanding. I'm a futurist first, but I'm also a realist. Vendors appear doomed to always overspend on the trivial and leave the hard work of figuring out how to use technology to make money and create mission value to the folks in the trenches.
Bear in mind, though, that while IT may not suck at long last, that won't last for long. Change is upon us. Is your organization prepared for the disruptions associated with the "SMAC stack" -- the mix of social, mobile, analytics and cloud? Have you adjusted your talent pipeline? Have you put in place the appropriate risk-adjusted "experimentation sandboxes" to gain experience with these technologies ahead of deploying them at scale? Have you created a network of smart people doing smart things on the edges?
Of course, in the history of computing, enterprise IT has rarely been prepared for the future. Think about it. Were we ready for the PC, client/server or the Web? Decidedly not. But we can't go on that way.
H.G. Wells planted the seed for modern futurism with a series of essays called "Anticipations," in which he advocated that thinkers/actors in the present should devote substantive cognitive resources to shaping the future. I think he was right.
Thornton A. May is author of The New Know: Innovation Powered by Analytics and executive director of the IT Leadership Academy at Florida State College in Jacksonville. You can contact him at firstname.lastname@example.org or follow him on Twitter ( @deanitla).