by Zwackus
Tue Jun 11th, 2013 at 05:03:18 AM EST
Here on ET, it's generally acknowledged that infrastructure is best provided by the government, and preferably built during recessions via deficit spending. Infrastructure is commonly thought of as the things that serve the public good, and that are most useful when provided to all.
Lots of things count as infrastructure. Roads and bridges are the obvious examples, but power generation and distribution, mail and package delivery, rail and air transport, phone and data networks, medical services, security and disaster relief, and a whole variety of other things could also be considered infrastructure.
How about software?
front-paged by afew
I wonder if there's any value to thinking about and treating software as a form of infrastructure. PC operating systems might be the most obvious case - an operating system is the piece of code needed to run every other piece of code, and there are advantages to everyone when everyone else uses the same or compatible code. But those who actually work in network IT, or with any sort of networked hardware, could surely chime in with examples of the sorts of programs and protocols that make everything else work.
That said, it's a fact that a lot of software, if not most of it, is rather crappy. It's unreliable, it's prone to attack, it frequently does not do what we want it to do, and it doesn't tell us what's wrong in a way that most users can understand. Windows is the most obvious example of chronically crappy software, but it is far from the only one.
Why is this? There are all kinds of complicated reasons, which people other than me could explain in much better detail, but I think the list would typically include problems such as . . .
1 - Big code is complicated, and complicated things are hard.
2 - The need to write software for many different pieces of hardware, and the corresponding difficulties with compatibility.
3 - Reliability is only occasionally a matter of concern for the private companies that fund software development.
4 - Security is only occasionally a matter of concern for the private companies that fund software development.
5 - Usability is only occasionally a matter of concern for the private companies that fund software development.
6 - Employment patterns in the industry do not help much with finding, developing, and retaining talent.
7 - A lot of good code is kept secret.
8 - Making real progress in reliability, usability, or security may well require a lot of open-ended primary research, and perhaps different hardware.
If my argument holds water up to this point, then it may well make a lot of sense for software to be considered a part of the national/global infrastructure, and treated accordingly. That treatment might include:
1 - Rationally designed specs and standards at the national or international level, to ensure that things work together and meet minimum standards of reliability and flexibility. Designing meaningful ways to measure and evaluate such things would be a major accomplishment in and of itself.
2 - National and/or international centers for software research and design, with the funding and the mandate to solve the really difficult issues of security, reliability, and compatibility, and to build software packages from the ground up with these goals in mind.
3 - Open and free sharing of the resulting product.
4 - Monitoring and approving private-sector add-ons to the public-sector code, to make sure they're not screwing everything up.
I'm not an IT person, so maybe this just sounds silly. But ET is always a great place to get well-informed and reasonable commentary on silly-sounding ideas.