Part of the challenge for mission-critical software is that very little of it is engineered and built to last for its intended lifetime.
This is not at all well thought out, just a “blip” that flitted through my head the other day… But at the risk of suppressing the idea, I’ll toss it out into the public domain for ridicule or conversation.
In an engineering world, requirements lead to rough sketches, which lead to engineering designs (CAD). Production engineering then factors in efficient ways to build the item, time is spent creating reusable and malleable designs with which to exercise the prototype, and the design is refined further based on feedback (the Boeing 777 was designed entirely in CAD). Only then is the item finally produced.
By contrast, software engineering is largely an oxymoron in most places. Many in the software world have a good time honing people processes and doing a better job at being pragmatic with our resources (myself included). As a whole, we have a few pre-made hammers and nails and springs, and some off-the-shelf components here and there, but largely we bang out unique snowflakes left and right. And we try to do it as efficiently as possible. (The promise of standard components has long gone unfulfilled, as if each window, bathroom, and kitchen cupboard in your house had to be mildly unique.)
In some cases, this is a fine approach. It's relatively cheap, and if the system needs to be redone, just build another one with a different team a few years later. But in other cases, where a system is supposed to serve major business functions and be around for 10, 20, or 30 years… which is the better approach?
- Build, grow, and maintain a single system over time (which ends up as a legacy system on old hardware and old software, accumulating excessive technical debt)
- Be vigilant and rebuild the entire system every few years to take advantage of the latest technology and newfound domain knowledge, avoiding technical debt
- Spend time building a system or tools that can stamp out the features and even allow you to re-tool to use new technologies; something like an aircraft-design CAD system that allows 3D simulation.
And yes, I realize that trying things out in "hard"-ware is possibly more expensive, which is why the tooling has evolved to compensate. And yes, software is so easy to change that it hardly seems worth working that hard to engineer it: if it works, great! If not, just keep hammering away at it until it does work. No material wasted! And yes, I know we have come a long way, I suppose (although why are freaking dates so darn hard for database vendors to get right?).
Of course, it took engineering a long time to get there… And it requires a different skill set mix than just a preponderance of software “production line” workers:
- “Requirements” Modelers
- Design Engineers to package things up into a balanced whole that hits the sweet spot of functionality, cost, and performance (et al)
- Technologists for the various disciplines (from UX to DB, and Java to .NET or COBOL)
- Testers (who help early in the design stages to flesh out the design alternatives)
So, do we need to consider better ways to create major, long-lived systems? Do we need to be more CAD-like? Maybe Model Driven Architecture (gasp)? Or Domain Specific Languages (yowza)? Or Language Workbenches (eek)?
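To make the "stamp out the features" idea a little more concrete, here is a toy sketch of the model-driven approach: a technology-neutral description of a domain entity, plus a swappable generator that emits code for one target technology. Every name here (`Field`, `Entity`, `to_sql`) is my own invention for illustration, not the API of any real MDA tool or language workbench.

```python
# Toy sketch of model-driven generation: a declarative entity model
# plus one code generator. All names are hypothetical, not from any
# real MDA/DSL tool.
from dataclasses import dataclass


@dataclass
class Field:
    name: str
    type: str  # abstract type name: "string", "int", "date"


@dataclass
class Entity:
    name: str
    fields: list


# The "model": a technology-neutral domain description.
customer = Entity("Customer", [
    Field("id", "int"),
    Field("name", "string"),
    Field("since", "date"),
])

# One generator target: SQL DDL. Writing a second generator (Java
# classes, REST stubs, screens) against the same model is the
# "re-tooling" step.
SQL_TYPES = {"int": "INTEGER", "string": "VARCHAR(255)", "date": "DATE"}


def to_sql(entity):
    """Generate a CREATE TABLE statement from the abstract model."""
    cols = ",\n  ".join(f"{f.name} {SQL_TYPES[f.type]}" for f in entity.fields)
    return f"CREATE TABLE {entity.name} (\n  {cols}\n);"


print(to_sql(customer))
```

The point of the sketch is that the model, not the generated code, is the long-lived asset; when the technology changes, you replace the generator rather than rebuilding the system by hand.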
As I said, not well fleshed out at all… just some extemporaneous/loose thoughts flinging around my (all too empty?) head.