Monthly Archives: April 2009

Good humor

This was in an email after I purchased something…

This is an automated message from the Yahoo! payments robot to let you know we got your money. Flickr will send you a more entertaining email shortly.

I like this sense of humor from an application 🙂

Large-scale Separation of Concerns

Found this draft post from: “Last edited on April 20, 2009 at 4:10 pm” … Wonder where I was headed? Hah, I’ll add a 2011 conclusion.

While we have enjoyed employing Separation of Concerns at many levels:

  • Design Patterns
  • Layered architecture

I wonder if it can go further to assist us in developing robust applications.

I have been preaching a “Dual Architecture” for eons…

  • Application architecture (features)
  • Technical architecture (code, framework)

However, I think this could be taken a bit further.

We use source code, classes, and tables as abstractions of the true underlying system being run. The source gets converted… the classes get converted… the database tables get converted… all into something the machine knows how to deal with.

The user gets the expected functionality; the system does the grunt work.

The user is not too worried about how it works, but rather that it simply works.

With software, it would be nice to be able to work at a similarly high level of intention, or functional need, with less regard for the nitty-gritty details.

After all, in most business applications, the fastest-moving component is the continually changing technology.

A major, large-scale SoC is to separate the Functional needs from the Technology.

Users care little if the app is built on Java, .NET, Ruby, COBOL…

2011 Addition:

Of all the languages and frameworks that I have used over the past 30 years, I would say Ruby and Rails (and its community) with MongoDB/MongoMapper has come the closest to allowing a greater portion of development time and thought to be spent on solving the customer's needs and less time on the infrastructure nuts and bolts. And it is more than the language (Ruby) and the framework (Rails). It is the entire culture and constellation of tools and gems and community that makes a difference.

In addition, it feels like I can code more to the intention of what I need the software to do, and less about the details of the language getting in my way.
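To make that "coding to the intention" feeling a bit more concrete, here is a minimal MongoMapper-style sketch. None of this is actual Blazemark or client code; the model names, fields, and database name are invented purely for illustration. The point is how many of the lines are about the domain and how few are about plumbing:

    require 'mongo_mapper'

    # Hypothetical connection; the real application/database names are not part of this post.
    MongoMapper.database = 'preplan_example'

    class Hydrant
      include MongoMapper::EmbeddedDocument
      key :location,  String
      key :flow_rate, Integer   # gallons per minute
    end

    class Property
      include MongoMapper::Document
      key :name,    String, :required => true
      key :address, String
      many :hydrants            # water supplies embedded right in the property document
      timestamps!
    end

    # The calling code reads close to the intention: "a property with one hydrant"
    Property.create(:name     => 'Main Street Apartments',
                    :address  => '12 Main St',
                    :hydrants => [Hydrant.new(:location => 'NE corner', :flow_rate => 1000)])

No migrations, no schema ceremony; the document just grows with the domain, which is a big part of why it feels like working at the level of intention.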


Should some apps be more highly engineered?

Part of the challenge for mission-critical software is that very little of it is engineered and built to last for its intended lifetime.

This is not at all well thought out, just a “blip” that flitted through my head the other day… But at the risk of suppressing the idea, I’ll toss it out into the public domain for ridicule or conversation.

Some observations…

In the engineering world, requirements lead to rough sketches, which lead to engineering designs (CAD). Production engineering then factors in efficient ways to build the item, time is spent creating reusable and malleable designs with which to exercise the prototype and refine the design based on feedback (the Boeing 777, for example, was designed entirely in CAD), and only then is the item finally produced.

By contrast, software engineering is largely an oxymoron in most places. Many in the software world have a good time honing people processes and doing a better job at being pragmatic with our resources (myself included). As a whole, we have a few pre-made hammers and nails and springs, we have some off-the-shelf components here and there, but largely we bang out unique snowflakes left and right. And we try to do it as efficiently as possible. (The promise of standard components has long gone unfulfilled — as if each window, bathroom, kitchen cupboard in your house has to be mildly unique.)

In some cases, this is a fine approach. Relatively cheap. If a system needs to be redone, just build another one with a different team a few years later. But in other cases, where a system is supposed to serve major business functions and be around for 10, 20, or 30 years… which is the better approach?

  1. Build, grow, and maintain a single system over time (which ends up being a legacy system on old hardware running old software, etc., building up excessive technical debt)
  2. Be vigilant and rebuild the entire system every few years to take advantage of the latest in technology and newfound domain knowledge, avoiding technical debt
  3. Spend time building a system or tools that can help stamp out the features and even allow you to re-tool to use new technologies. Something like an aircraft design CAD system that allows 3D simulation.

And yes, I realize that trying things out in “hard”-ware is seemingly more expensive, which is why the tooling has evolved to compensate. And yes, software is so easy to change, why bother working that hard to engineer it… If it works, great! If not, just keep hammering away at it until it does work. No material wasted! And yes, I know we have come a long way, I suppose (although why are freaking dates so darn hard for database vendors to get right?).

Of course, it took engineering a long time to get there… And it requires a different skill set mix than just a preponderance of software “production line” workers:

  • “Requirements” Modelers
  • Design Engineers to package things up into a balanced whole that hits the sweet spot of functionality, cost, and performance (et al)
  • Technologists for the various disciplines (from UX to DB, and Java to .NET or COBOL)
  • Fabricators
  • Testers (who help early in the design stages to flesh out the design alternatives)

So, do we need to consider better ways to create major, long-lived systems? Do we need to be more CAD-like? Maybe Model Driven Architecture (gasp)? Or Domain-Specific Languages (yowza)? Or Language Workbenches (eek)?

As I said, not well fleshed out at all… just some extemporaneous/loose thoughts flinging around my (all too empty?) head.

Don’t Code the Fluffy Mackerel!

In 2009 I like to think I can cook pretty well when I put my mind to it — I began cooking in the late 70s under a chef’s tutelage… Cooking has some parallels to software development. (I think… I just made this up and it sounded good, so I am going to run with it for a bit.)

You can be very prescriptive and follow recipes to the “T” — like waterfall. Not a bad approach when you are learning (and in cooking — unlike in software — it can’t last more than a couple of days). However, always following the recipe yields no creativity.

So, as you master your craft (cooking or software) by following some prescriptive processes to get a taste of quick success, you are hopefully gleaning the underlying aspects of the trade. You can also work alongside a master chef to pick up tricks and tips and sage advice. If you do a good job of learning the art and science of cooking, if you begin to understand the basics, you can then start to follow a more agile approach to cooking. The more you practice, the more you understand the inter-dependencies between food elements (or code). Doing a little sampling along the way (frequent builds, testing) helps ensure you are on the right track. If you decide to get fancy, sometimes you have to make throw-away (edible) prototypes so that the final dish is the one you present to your guests.

I hope I never cooked (or coded) anything like what follows in the URL below. This link is a real hoot (and the real reason for this post). I almost wet my pants:

   Scary 1974 Weight Watchers Recipes.

I think this is a reminder to be certain you never, ever code when you are on “date-expired” acid from the late 60s. You might code the “Frankfurter Spectacular” or the “Fluffy Mackerel” and expect admiring glances from your peers at the next scrum.

How could Weight Watchers have possibly published these recipes and photos thinking they were remotely “good” at helping you watch your weight? I guess the same way some lousy code gets written — because it seemed good at the time!

Oh, wait, it just dawned on me! After seeing these dishes appear at your place at the table, who would ever want to eat again? Even the celery log looks like a piece of compost. So this served as a great appetite suppressant I bet!

Oh, as I was wracking my brain trying to comprehend how someone — even in the 70s — could consider these recipes worthy of publishing and photographing, I recall my own childhood. At some point in the late 60s, early 70s, my Mom decided to make Cod Fish Balls. I kid you not. Breaded I think. And deep fat fried I think. Probably in Crisco. Too bad I don’t have an old photo of them to add to the Fluffy Mackerel Flickr collection!

Cool Mapping with SpatialKey

For our Blazemark incident preplanning software, I wanted to see what the properties/structures and associated water supplies (e.g., hydrants) looked like on a map.

Initially, I had run across “ZeeMaps” while using Wetpaint. So I made a map:

(embedded ZeeMaps map of the property and hydrant data)

It was pretty easy: I simply created a SQL query and dumped out a CSV list of data to be processed remotely. You get a response back via email as to the success or failure of the data load.
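For the curious, the dump step was nothing fancier than a query plus a CSV write. Here is a rough sketch of the idea in Ruby; the table names, columns, and the PostgreSQL backing store are all assumptions for illustration, not the actual Blazemark schema:

    require 'pg'    # assuming a PostgreSQL store for the sketch; any SQL database would do
    require 'csv'

    conn = PG.connect(:dbname => 'blazemark_example')

    sql = <<-SQL
      SELECT p.name, p.address, p.latitude, p.longitude, h.flow_rate
        FROM properties p
        LEFT JOIN hydrants h ON h.property_id = p.id
    SQL

    # Write one CSV row per property/hydrant pair, with a header row the mapping service can use.
    CSV.open('map_points.csv', 'w') do |csv|
      csv << %w[name address latitude longitude flow_rate]
      conn.exec(sql).each do |row|
        csv << row.values_at('name', 'address', 'latitude', 'longitude', 'flow_rate')
      end
    end

Upload the resulting CSV, wait for the success/fail email, and the points show up on the map.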

Later, a friend of mine mentioned that his company (Universal Mind) had built a cool Flex-based product called “SpatialKey.” So I decided to give their beta a whirl. Definitely easier to use and more feature-rich than ZeeMaps.

Have a look:

Technical Stimulus

So there I was at HIMSS where I met two friends…

While one vendor was explaining their products, they referenced how you could take advantage of the Stimulus Package — each doctor can get $44,000 by signing up for and using the electronic health record system they were selling.

As a selling point for their wonderful products, they had a real, live user — the CIO of a 260-bed hospital — describe his experience. By automating the paper forms, they gained better use of their data, reduced storage needs, etc. The savings from cutting their staff of 150 people down to a mere 50 were “$4M in FTE alone!”

I wonder if anyone else sees the irony in this… Or is it just me?