New or Old Software?
Is old software better than new?
I came across an interesting infographic on some website displaying the number of lines of code in various pieces of software. I do not know how accurate the graph is, but it is unsurprising that the number tends to grow with time, or with every new version of the software. Given that the time between releases typically shrinks, this means that more and more lines of code need to be produced per unit of time to ship the next release, as the rough calculation below illustrates.
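To make the arithmetic concrete, here is a minimal sketch in Python; the release sizes and cycle lengths are invented numbers, purely for illustration:

```python
# Invented numbers, purely for illustration: total lines of code per
# release and the length of the release cycle in months.
releases = [
    ("v1", 1_000_000, None),  # baseline release
    ("v2", 1_300_000, 18),    # +300k lines over an 18-month cycle
    ("v3", 1_800_000, 12),    # +500k lines over only a 12-month cycle
]

prev_loc = None
for name, loc, cycle_months in releases:
    if prev_loc is not None:
        rate = (loc - prev_loc) / cycle_months
        print(f"{name}: {rate:,.0f} new lines per month")
    prev_loc = loc
# v2: 16,667 new lines per month
# v3: 41,667 new lines per month
```

Even though v3 adds "only" 500k lines against v2's 300k, the shorter cycle means the team must produce code two and a half times faster.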
How can that technically be achieved? There are several paths:
- More people work on the same project than before. That is true in many cases: software companies have grown to gigantic sizes, both financially and in the number of people they employ.
- A programmer writes more code per unit of time than he/she did years ago. Possible, but unlikely. Intuition suggests the opposite: with greater complexity, one may need to think twice before writing a line.
- Use of sophisticated automation tools that write code for you. Partially true. There are areas where automation of code writing (not to be confused with automation in system administration, see my other post) is popular, e.g. documentation generation (see the sketch after this list). But most other parts still come down to a real person with his/her speed limits.
- Inclusion of previously written code and libraries. This is a valid point. Well-written software can serve generations of releases, and pieces written in older days can and do serve as building blocks for newer systems.
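As an illustration of the documentation-generation point above, here is a minimal sketch in Python: the standard-library pydoc tool turns docstrings into browsable documentation without anyone writing the documentation pages by hand. The module and function names are invented for this example:

```python
# billing.py -- a hypothetical module; the names are invented for illustration.
"""Utilities for computing invoice totals."""

def invoice_total(items, tax_rate=0.2):
    """Return the total of (name, price) pairs including tax.

    items    -- an iterable of (name, price) tuples
    tax_rate -- fractional tax rate applied to the subtotal
    """
    subtotal = sum(price for _, price in items)
    return subtotal * (1 + tax_rate)
```

Running `python -m pydoc billing` prints formatted documentation for the module, and `python -m pydoc -w billing` writes it out as billing.html: the documentation is generated, only the docstrings are written by a person.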
The last point in the list is a very good reminder that new is not the only good, and that jumping to a new platform, library, or vendor only because it is new is not always the best option. The fact that so many systems include parts dating back several decades tells us exactly that. It is very tempting to declare this or that new technology disruptive, but let us not forget that it did not appear out of nowhere, and it probably has more problems and issues than the old legacy system it replaces. It takes a good amount of time to smooth out every piece built, sometimes not months but years. That is especially true in data-intensive and network-dependent areas, like telecoms, where a huge amount of data accumulates over time and completely changes system behavior from what it was in the beginning. For a new system it is nearly impossible to acquire either the experience or the amount of real-life data necessary to predict that behavior. That is where old, mature systems are highly valuable and should not be dismissed as outdated or out of fashion.
This all reminds me of a personal experience with a relatively newly built system, where instead of a classical transactional database engine it was decided to use a relatively new, document-oriented database engine. There were several reasons why it was selected, one of them of course being its schema-less structure: nobody likes writing database creation statements, checking field sizes, and so on; it all requires a lot of up-front planning and quite a lot of overhead work. The project proceeded, and the system worked and still works well. However, when the same people involved in the project are now asked what they would change or improve in the system, they say that a good old relational database engine would be the first thing to put in.
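To make that trade-off concrete, here is a minimal sketch in Python, using the standard-library sqlite3 module as a stand-in for a classical relational engine and a plain list of dictionaries as a stand-in for a document store (the post does not name the actual engines involved):

```python
import sqlite3

# Relational approach: the schema must be declared up front, with
# names and types for every field -- the planning overhead mentioned above.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customers (
        id    INTEGER PRIMARY KEY,
        name  TEXT NOT NULL,
        email TEXT
    )
""")
conn.execute("INSERT INTO customers (name, email) VALUES (?, ?)",
             ("Alice", "alice@example.com"))

# Document-oriented approach: no schema at all; records of any shape
# are accepted, and new fields appear without any migration.
documents = []
documents.append({"name": "Alice", "email": "alice@example.com"})
documents.append({"name": "Bob", "phone": "+1-555-0100", "tags": ["vip"]})
```

The schema-less side is clearly faster to get started with; the price is that nothing stops inconsistent records from creeping in, which may well be part of why the team would now reach for a relational engine first.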
That was just one example of why legacy systems should not be underestimated.