The U.S. federal government spends about $82 billion each year on information technology investments. Does the public receive good value for this investment? There are really two questions here: (1) Is the money spent effectively? (2) Is it spent on the right things? The technology environment has changed greatly over the last few decades, as have the country’s needs. Perhaps more importantly, management theory and best practices have changed as well. Has the U.S. government changed to keep pace? Has it incorporated best practices from the private sector, or even from other governments? Is it making the most effective use of its technology to deliver value to its public?
I am not just asking whether the government is up to date in adopting the latest gizmos and products. I’m asking whether the government is managing its technology investments in the best possible way, and whether it is focused on the right investments.
I hope to start a discussion on these topics by drawing on a number of strands of contemporary thinking, including agile and lean software development, continuous delivery and DevOps, cloud infrastructure, adaptive leadership, Government 2.0, open data, crowdsourcing, and the work being done by the UK government’s IT staff.
Please note that all opinions expressed here are my personal opinions, not those of the government agency that employs me. I am blogging as an individual citizen, not in my role as a government employee.