Monday, August 09, 2010

Should Software Development Be Regulated?

The question of government regulation of business is high on the agenda these days. Over the last couple of years we have witnessed some spectacular events, like the Great Recession of 2008 and the Deepwater Horizon explosion (and subsequent Gulf of Mexico oil spill) of 2010. These have already become case studies in the importance of government regulation. New legislation on financial and healthcare reform will significantly increase the role of government in those areas. So, here is my question: has the time finally come to regulate software development?

I realize a lot of people have a negative knee-jerk reaction to anything that might expand the role of government (I can almost hear them scream!). Personally, coming from a communist country, I tend to be fairly skeptical in this area: I've seen what happens when bureaucrats are given unchecked power over people's lives. But let's consider the matter objectively. After thinking about it for a while I came up with three different areas where regulation can bring positive change.

Professionalism
It never ceases to amaze me that you cannot work as a plumber without a plumbing license, but no license is required to write software. Mind you, obtaining a plumber's license is far from a formality: it requires four years of job training, and the applicant must pass a written exam. On the other hand, anyone can apply for a software engineer position: it is up to the hiring company whether or not to ask for evidence of formal training. Some companies administer tests or ask a bunch of technical questions during the interview process, but there are no standards.

As a direct result, the ranks of software developers are full of people who picked up programming as a hobby or were attracted to it by higher salaries, but never learned the mathematical foundations of the discipline. I would argue that these people are more likely to use poor coding practices, steer clear of object-oriented programming, and never bother with design patterns. Note that I am not advocating the supremacy of college graduates; all I'm saying is that programming requires proper training.

By the way, a similar observation can be made about businesses. For example, a financial services company may own cars, but it is unlikely to have an in-house team of mechanics to fix them. And yet that same company has no second thoughts about maintaining an in-house software development organization.

Quality Control
Given the role software is playing in our lives, it's hard to understand why people tolerate low-quality applications. Although there are many reasons for poor quality, the industry pretty much knows how to address this problem. It all starts with a solid design, of course: application architecture should be appropriate for the task. Developers should write automated unit tests and ensure good code coverage, and these tests should be executed as part of every build. Each application should have well-defined white box and black box test cases, and appropriate performance testing should be done before the system goes live.
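
To make the point a bit more concrete, here is a minimal sketch of what an automated unit test might look like in C# with the NUnit framework; the class under test and its numbers are invented purely for illustration. Tests like these are what the build server would execute on every check-in.

```csharp
using System;
using NUnit.Framework;

// A hypothetical class under test: a trivial interest calculator.
public class InterestCalculator
{
    public decimal SimpleInterest(decimal principal, decimal rate, int years)
    {
        if (principal < 0 || rate < 0 || years < 0)
            throw new ArgumentOutOfRangeException();
        return principal * rate * years;
    }
}

// The matching unit tests, run automatically as part of every build.
[TestFixture]
public class InterestCalculatorTests
{
    [Test]
    public void SimpleInterest_ReturnsExpectedAmount()
    {
        var calc = new InterestCalculator();
        Assert.AreEqual(150m, calc.SimpleInterest(1000m, 0.05m, 3));
    }

    [Test]
    public void SimpleInterest_RejectsNegativeInput()
    {
        var calc = new InterestCalculator();
        Assert.Throws<ArgumentOutOfRangeException>(
            () => calc.SimpleInterest(-1000m, 0.05m, 3));
    }
}
```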

However, good quality control can be expensive: for example, the time spent writing unit tests is time developers are not spending on new functionality. Automated testing tools for QA can be very expensive, too. It's no surprise that some businesses prefer to save money on quality, given the extraordinary tolerance consumers have for buggy software. By enforcing standard QA processes, government regulators could make good, reliable software a reality and make life easier for the end user.

Security
Over the last 15 years, as high-speed internet access became first widespread and then ubiquitous, software applications grew to rely more and more on connectivity. Sadly, this opened the floodgates for an entirely new class of problems: cyber attacks. Let me quote from an excellent book on the subject, Richard Clarke's "Cyber War":
These military and intelligence organizations are preparing the cyber battlefield with things called "logic bombs" and "trapdoors," placing virtual explosives in other countries in peacetime. Given the unique nature of cyber war, there may be incentives to go first. The most likely targets are civilian in nature. The speed at which thousands of targets can be hit, almost anywhere in the world, brings with it the prospect of highly volatile crises.
Of course, cyber attackers exploit security weaknesses in software, and of course a system is only as secure as its weakest link. But how does software acquire these weaknesses in the first place? One reason is that the people who develop it lack the knowledge and expertise to do proper threat modeling. And even if the application was developed with security in mind, has it ever been tested for security vulnerabilities? This is where government regulators could step in, making sure all software has been secured to an appropriate level.
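
To show what such a weakness looks like in practice, here is a minimal C# sketch of SQL injection - one of the most common vulnerabilities in data-driven applications - next to the parameterized version that avoids it. The table and column names are made up for the example.

```csharp
using System.Data.SqlClient;

public static class UserLookup
{
    // Vulnerable: user input is concatenated straight into the SQL text, so a value
    // like "'; DROP TABLE Users; --" becomes part of the command itself.
    public static SqlCommand BuildUnsafeQuery(SqlConnection conn, string userName)
    {
        return new SqlCommand(
            "SELECT * FROM Users WHERE UserName = '" + userName + "'", conn);
    }

    // Safer: the value travels as a parameter, so the database engine never
    // interprets it as executable SQL.
    public static SqlCommand BuildParameterizedQuery(SqlConnection conn, string userName)
    {
        var cmd = new SqlCommand(
            "SELECT * FROM Users WHERE UserName = @userName", conn);
        cmd.Parameters.AddWithValue("@userName", userName);
        return cmd;
    }
}
```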

In conclusion, I would like to acknowledge that regulation doesn't always work, and it is entirely possible that a bad appointee will turn the whole initiative upside down. After all, doesn't the Great Recession illustrate the SEC's inability to control the derivatives market? And didn't the oil rig explosion shed light on mass incompetence and corruption at the MMS? But software has become such an important aspect of our civilization that we must at least begin the conversation.

Monday, August 02, 2010

VB or C#? A Personal Journey

The last time I checked the LinkedIn group .NET People, there were 435 posts in the "VB or C#?" discussion. That's strange, I said to myself. After ten years and four language iterations, are there really enough differences left to spark a debate? So I started reading...

Well, there were a couple of people who found genuine gaps (like XML literals in VB or the yield keyword in C#). There were a couple of trolls, and a couple of people just having a good laugh ("I prefer C# over VB because I am an American!"). But the majority of comments were pure opinion. "Code is cleaner", "more readable", "I hate semicolons", "I love curly braces", "too verbose", "closest to plain English" were some of the statements repeated over and over. IMHO, this entire discussion sheds more light on the .NET development community than on the programming languages themselves.
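
For readers who haven't run into it, the yield keyword really is one of those genuine gaps: it lets a C# method produce a sequence lazily, one element at a time, without building up a collection first (VB has no equivalent as of this writing). A trivial example:

```csharp
using System;
using System.Collections.Generic;

class Program
{
    // yield return hands back elements one at a time, as the caller iterates.
    static IEnumerable<int> Squares(int count)
    {
        for (int i = 1; i <= count; i++)
            yield return i * i;
    }

    static void Main()
    {
        foreach (int square in Squares(5))
            Console.WriteLine(square);   // 1, 4, 9, 16, 25
    }
}
```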

It's no secret that people come to software development by [at least] two separate routes. Some study Computer Science in college (even if it's not their major, or they never graduate). They were probably taught their programming courses in Java or C++, so C# comes naturally to this group. The second category of developers started out in a different line of work and discovered Office automation with VBA somewhere along the way. Or perhaps they learned VBScript in order to maintain their department's ASP page on the intranet. When .NET came along, this group made the transition to VB.NET.

Now, I'm not trying to argue which group has better programmers - I've seen extremely bright engineers without a CS degree, as well as some dim bulbs who turned out to have a Master of Science in CS. But it's common knowledge that C# was designed from the ground up as a managed object-oriented language, while VB.NET is essentially the outcome of multiple cosmetic surgeries performed on an aging body. The first change happened when the original BASIC - Beginner's All-purpose Symbolic Instruction Code - was updated to support structured programming. It acquired the "Visual" prefix, but didn't become fully object-oriented until its VB.NET incarnation. Nowadays, Microsoft works diligently to keep the language on par with C#, adding constructs like generics, lambda expressions, closures, and so on.
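
For anyone who came to .NET by the VBA/VBScript route, here is a quick C# illustration of what two of those constructs look like in practice: a lambda expression that captures a local variable (forming a closure), passed to the generic Where() method from LINQ. The price list is, of course, invented for the example.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Program
{
    static void Main()
    {
        var prices = new List<decimal> { 9.99m, 25.00m, 103.50m };

        // The lambda p => p > threshold captures the local variable 'threshold',
        // so it forms a closure; Where() is a generic extension method from LINQ.
        decimal threshold = 20m;
        IEnumerable<decimal> expensive = prices.Where(p => p > threshold);

        foreach (decimal price in expensive)
            Console.WriteLine(price);   // 25.00, 103.50
    }
}
```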

However, the efforts to modernize VB have little impact on most VB programmers, who probably just aren't familiar enough with contemporary design and programming patterns. So, it's no surprise they tend to get a little bit defensive...

Interestingly, I myself managed to travel both paths to software development. My college major was Applied Mathematics and Cybernetics, and I had plenty of instruction on typical CS subjects. We used Turbo Pascal in the classroom, and by the end of school I had transitioned to Borland C++. Incidentally, the Soviet Union imploded at about the same time, and in the chaos that followed, my aspirations to find a job in IT became laughable (people were lucky to have any job at all - it was not unusual in those days for a doctor to work as a taxi driver). So I ended up doing bookkeeping, accounting, and then business planning for a big multinational corporation.

Before long, I was dabbling in Microsoft Access and creating automated databases and spreadsheets for my team. VB was easy and forgiving, and, more importantly, it was ubiquitous. When I finally managed to switch my career back to IT, I didn't feel comfortable with the latest C++ tools and frameworks, so I stuck with VBScript and VB6. When .NET was introduced, my first instinct was to transition to VB.NET. However, I decided that it was time to re-educate myself. I started reading about design patterns (which weren't even on the radar when I was in college), test-driven development, and extreme programming. I studied source code and tackled new classes of problems, like multi-threaded service development.

Eventually, I realized that C# was a better choice for me, made the switch, and never looked back. This was around 2005, when the gap between the two languages was fairly big. Five years later, it is almost gone. But like I said earlier, it's easier to update a compiler than to change people's mindset. Both VB and C# are here to stay; I'm just waiting for someone to port another of my college-era languages, Prolog, to the .NET Framework...