Tuesday, 8 April 2014

Going Backwards

Recently I've been wondering whether PC software development peaked a few years ago and has since declined in some ways.  In the 1980s and earlier it was difficult, with much software being developed in assembler or C.  The 90s brought us things like Visual Basic, which were enormously more productive, even if frowned upon by some software purists.  Object orientation started to pay dividends, and with the arrival of things like .NET we had very elegant ways to produce powerful applications.

From my point of view as someone who has always worked with networking, things like the elegance of network streams and threading were a boon.  As for the graphical user interface, WPF was truly wonderful, even if it did effectively require people to learn two languages (XAML plus the code-behind language).

But recently we seem to have acquired the idea that we must always use patterns, frameworks and middleware.  Things that started out as an elegant solution for certain kinds of application, such as MVVM, have become mandatory.  The ability to ‘bolt in’ components, which gave great benefit to some applications, has turned into a requirement to use IoC containers for all applications.  Using middleware as a convenient way to connect disparate systems became the only way of doing things.

I have two motivations for writing this.  The first is reading people’s advice to new developers as to what they ‘must’ do.  For example, someone asked a perfectly reasonable question about naming conventions in WPF and was shot down by people saying you ‘must’ not name controls because you ‘should be’ using MVVM.  This despite them not knowing what the application did, or the fact that MVVM can easily make some applications a good ten times more complex.  Similarly, reading how you ‘must not’ use ‘new’ to create objects, because that makes you a ‘control freak’, and how an IoC container ‘must’ be used to inject the appropriate object, even when this has zero benefit for the particular application and adds significant complexity.
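To make the contrast concrete, here is a sketch of the two styles in Java, with made-up names throughout.  The “container” is just a map of suppliers standing in for a real IoC framework, but it shows the extra indirection you buy into:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

interface Greeter { String greet(String name); }

class ConsoleGreeter implements Greeter {
    public String greet(String name) { return "Hello, " + name; }
}

class App {
    private final Greeter greeter;
    App(Greeter greeter) { this.greeter = greeter; }
    String run() { return greeter.greet("world"); }
}

public class Main {
    public static void main(String[] args) {
        // Style 1: just use 'new'.  One line, obvious, trivially debuggable.
        App direct = new App(new ConsoleGreeter());
        System.out.println(direct.run());

        // Style 2: register-then-resolve.  Genuinely useful when the
        // implementation must be swappable; pure indirection when it never is.
        Map<Class<?>, Supplier<?>> container = new HashMap<>();
        container.put(Greeter.class, ConsoleGreeter::new);
        App resolved = new App((Greeter) container.get(Greeter.class).get());
        System.out.println(resolved.run());
    }
}
```

Both halves produce exactly the same application; the question is only whether the second half’s flexibility is ever cashed in.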

The second is having taken part in a large graduate program where teams had to compete against each other to produce a medium-sized client-server application.  The teams varied enormously in their approaches, though the majority used things like REST, messaging services and Spring.  One team decided on a lower-level approach, managing their own threads and using TCP/IP sockets and streams.  This was frowned upon, as it was felt it would be far less productive.  In fact they won the competition: they had achieved much more than anyone else, the application’s performance was vastly better (it flew whereas the others crawled), and it turned out to be much easier to understand and maintain.
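For anyone who has only ever seen the framework route, the lower-level approach is less scary than it sounds.  A minimal sketch in Java, with illustrative names: plain sockets, streams, and one hand-managed thread doing a round trip.

```java
import java.io.*;
import java.net.*;

// A one-shot echo exchange over plain TCP: server thread accepts a
// connection, reads a line, echoes it back; the caller gets the reply.
public class EchoDemo {

    static String roundTrip(String msg) throws Exception {
        ServerSocket server = new ServerSocket(0); // 0 = pick a free port
        Thread worker = new Thread(() -> {
            try (Socket s = server.accept();
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(s.getInputStream()));
                 PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                out.println("echo: " + in.readLine());
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        worker.start();

        String reply;
        try (Socket client = new Socket("localhost", server.getLocalPort());
             PrintWriter out = new PrintWriter(client.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(client.getInputStream()))) {
            out.println(msg);           // send the request line
            reply = in.readLine();      // block until the server answers
        }
        worker.join();
        server.close();
        return reply;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("hello"));
    }
}
```

There is no middleware to configure and nothing hidden: every byte on the wire is accounted for in these few dozen lines, which is a large part of why such code can be easy to understand and maintain.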

But it also applies in the ‘real’ world.  I remember helping a colleague as he put together a design using MEF, Entity Framework and lots of other enabling technologies; weeks later he was still hitting incompatibilities and the like.  Had he used a typical approach from a decade earlier, it would have been completed in that time and been much more maintainable.


Of course I’m not being extreme about this, just saying there are horses for courses.  Otherwise I’d end up like Linus Torvalds, still just coding in C.  And look where it got him!