I attended the BCS Mini-SPA event a few months back.
The premise of the mini-SPA, as the organisers put it, was this: “If you attended SPA2006 you might find that the miniSPA2006 programme allows you to catch up with sessions you didn’t select at the event. The annual SPA conference (formerly known as OT) is where IT practitioners gather to reflect, exchange ideas and learn.”
It also served as a convenient advert for next year’s full SPA event.
It was also free, had a free lunch and got me out of the office for a day, so it pretty much fulfilled all my criteria.
The day was structured as six sessions, divided into two parallel streams.
I attended “Distributed workforces”, “Modelling with Views” and “A Good Read”; it was the last of these that really interested me.
This was a panel of five people, each of whom had proposed a book to discuss. Every panellist then read all five books so they could discuss each one and offer their views and insights.
The really interesting part for me was that someone proposed Programming Pearls by Jon Bentley.
I’ve owned a copy of this book for years but have yet to finish it (it’s back on my “to read” list now, though).
Everybody roundly praised the book, but one member of the panel questioned whether we needed to know that level of detail when it comes to coding efficient algorithms: “wouldn’t it be simpler to throw more CPU and RAM at a problem?” they said.
Someone in the audience then countered that algorithm efficiency was relevant once again when programming Web apps, saying something along the lines of “Wait until 100 people hit that page on your site”.
Sadly the session ran out of time at that point so no conclusion was reached.
My own belief is that you do need to know code at that level, especially if you write Web sites or similar client/server apps that handle many concurrent requests.
I’m not saying everyone should know the Quicksort algorithm inside out, but if you program in Java (for example) you should know the difference between a Vector, an ArrayList and a plain old array and when to use each.
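To make that Java distinction concrete, here’s a minimal sketch (the class name and values are just for illustration). The key differences: a plain array is fixed-size with no per-call overhead; ArrayList is resizable and unsynchronized; Vector is the legacy resizable list whose methods are all synchronized, so every call pays for a lock even in single-threaded code.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Vector;

public class ListChoice {
    public static void main(String[] args) {
        // Plain array: fixed size, no per-call overhead -- fine when the
        // element count is known up front.
        int[] primes = {2, 3, 5, 7};

        // ArrayList: resizable, unsynchronized -- the usual default for
        // single-threaded code since Java 1.2.
        List<Integer> growable = new ArrayList<>();
        growable.add(11);

        // Vector: resizable, but every method is synchronized, so each
        // call acquires a lock whether you need thread safety or not.
        List<Integer> legacy = new Vector<>();
        legacy.add(13);

        System.out.println(primes.length + " " + growable.get(0) + " " + legacy.get(0));
    }
}
```

In a hot loop serving many requests, that per-call locking on Vector (or an array list resized over and over because its initial capacity was never set) is exactly the sort of detail that only shows up under load.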
I have had personal experience of a badly written for loop bringing down a Web site on launch day.
The for loop in itself wasn’t the worst code ever written by any means, but it was probably executed 30 to 40 times per individual home page hit.
Multiply that by a few dozen concurrent hits (it was a busy site) and any flaws in that code were mercilessly exposed.
Embarrassingly for me, it was my code. Oops.
Ever since that day I’ve been unable to forget that no amount of “CPU and RAM” (and we had a lot) will help if you don’t get your algorithms right in the first place.
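To illustrate the kind of thing I mean (a hypothetical example, not the actual loop from that site), repeated String concatenation inside a loop is a classic offender: each `+=` copies the entire string built so far, making the loop quadratic, where a StringBuilder does the same job in linear time.

```java
public class LoopCost {
    // Quadratic: each += allocates a new String and copies everything
    // accumulated so far. Harmless once; painful 30-40 times per page
    // hit under a few dozen concurrent requests.
    static String slowJoin(String[] items) {
        String out = "";
        for (String item : items) {
            out += item + ",";
        }
        return out;
    }

    // Linear: StringBuilder appends in place and copies once at the end.
    static String fastJoin(String[] items) {
        StringBuilder out = new StringBuilder();
        for (String item : items) {
            out.append(item).append(',');
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String[] items = {"a", "b", "c"};
        // Both produce "a,b,c," -- only the cost differs.
        System.out.println(slowJoin(items).equals(fastJoin(items)));
    }
}
```

Both versions return identical output, which is exactly why the slow one survives code review: the difference only becomes visible when you multiply the per-hit cost by the traffic.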