Monday, August 09, 2010

Should Software Development Be Regulated?

The question of government regulation of business is high on the agenda these days. Over the last couple of years we have witnessed some spectacular events, like the Great Recession of 2008 and the Deepwater Horizon explosion (and subsequent Gulf of Mexico oil spill) of 2010. These have already become case studies in the importance of government regulation. New legislation on financial and healthcare reform will significantly increase the role of government in those areas. So, here is my question: has the time finally come to regulate software development?

I realize a lot of people have a negative knee-jerk reaction to anything that might expand the role of government (I can almost hear them scream!). Personally, coming from a communist country, I tend to be fairly skeptical in this area: I've seen what happens when bureaucrats are given unchecked power over people's lives. But let's consider the matter objectively. After thinking about it for a while, I came up with three areas where regulation could bring positive change.

Professionalism
It never ceases to amaze me that you cannot work as a plumber without a plumbing license, but no license is required to write software. Mind you, obtaining a plumber's license is far from a formality: it requires four years of job training, and the applicant must pass a written exam. On the other hand, anyone can apply for a software engineer position: it is up to the hiring company whether or not to ask for evidence of formal training. Some companies administer tests or ask a bunch of technical questions during the interview process, but there are no common standards.

As a direct result, the ranks of software developers are full of people who picked up programming as a hobby or were attracted to it by higher salaries, but never learned the mathematical foundations of the discipline. I would argue that these people are more likely to use poor coding practices, steer clear of object-oriented programming, and never bother with design patterns. Note that I am not advocating the supremacy of college graduates; all I'm saying is that programming requires proper training.

By the way, a similar observation can be made about businesses. For example, a financial services company may own cars, but it is unlikely to have an in-house team of mechanics to fix them. And yet the same company doesn't think twice about maintaining an in-house software development organization.

Quality Control
Given the role software plays in our lives, it's hard to understand why people tolerate low-quality applications. Although there are many reasons for poor quality, the industry largely knows how to address the problem. It all starts with a solid design, of course: the application architecture should be appropriate for the task. Developers should write automated unit tests and ensure good code coverage, and these tests should run as part of every build. Each application should have well-defined white-box and black-box test cases, and appropriate performance testing should be done before the system goes live.
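To make the unit testing point concrete, here is a minimal sketch of what such a test might look like. The calculate_discount function and its business rule are hypothetical, invented purely for illustration; a real project would test its own logic in the same spirit.

```python
import unittest

def calculate_discount(order_total):
    """Hypothetical business rule: orders over $100 get a 10% discount."""
    if order_total < 0:
        raise ValueError("order total cannot be negative")
    return order_total * 0.10 if order_total > 100 else 0.0

class CalculateDiscountTest(unittest.TestCase):
    def test_no_discount_below_threshold(self):
        self.assertEqual(calculate_discount(50), 0.0)

    def test_discount_above_threshold(self):
        self.assertAlmostEqual(calculate_discount(200), 20.0)

    def test_negative_total_is_rejected(self):
        with self.assertRaises(ValueError):
            calculate_discount(-5)

if __name__ == "__main__":
    unittest.main()
```

Run automatically as part of every build, tests like these catch regressions the moment they are introduced, which is precisely the kind of inexpensive safety net that too often gets skipped.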

However, good quality control can be expensive: the time developers spend writing unit tests is time they do not spend implementing new functionality, and automated testing tools for QA can be very expensive as well. It's no surprise that some businesses prefer to save money on quality, given the extraordinary tolerance consumers have for buggy software. By enforcing standard QA processes, government regulators could make good, reliable software a reality and make life easier for the end user.

Security
Over the last 15 years, as high-speed internet access became first widespread and then ubiquitous, software applications grew to rely more and more on connectivity. Sadly, this opened the floodgates for an entirely new class of problems: cyber attacks. Let me quote from an excellent book on the subject, Richard Clarke's "Cyber War":
These military and intelligence organizations are preparing the cyber battlefield with things called "logic bombs" and "trapdoors," placing virtual explosives in other countries in peacetime. Given the unique nature of cyber war, there may be incentives to go first. The most likely targets are civilian in nature. The speed at which thousands of targets can be hit, almost anywhere in the world, brings with it the prospect of highly volatile crises.
Of course, cyber attackers exploit security weaknesses in software, and of course a system is only as secure as its weakest link. But how does software acquire these weaknesses in the first place? One reason is that the people who develop it lack the knowledge and expertise to do proper threat modeling. And even if an application was developed with security in mind, has it ever been tested for security vulnerabilities? This is where government regulators could step in, making sure all software has been secured to an appropriate level.
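To illustrate how such weaknesses creep in, consider one classic example: building a database query by gluing strings together instead of using parameters. The sketch below uses Python's standard sqlite3 module, and the users table is made up for the example.

```python
import sqlite3

def find_user_unsafe(conn, username):
    # Vulnerable: the username is spliced directly into the SQL text,
    # so crafted input such as "x' OR '1'='1" changes the query's meaning.
    query = "SELECT id, name FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # Safer: the parameter placeholder ensures the username is treated
    # strictly as data, never as part of the SQL statement itself.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,)
    ).fetchall()
```

A developer who has never been taught to think about threats will happily ship the first version; basic threat modeling and security testing exist to catch exactly this kind of mistake before attackers do.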

To conclude this essay, I would like to acknowledge that regulation doesn't always work, and it is entirely possible that a bad appointee will turn the whole initiative upside down. After all, doesn't the Great Recession illustrate the SEC's inability to control the derivatives market? And didn't the oil rig explosion shed light on mass incompetence and corruption at the MMS? But software has become such an important aspect of our civilization that we must at least begin the conversation.
