# Tuesday, January 27, 2009

Today I read a nice post by Brian Rasmussen in which he describes how to set up Visual Studio to generate class definitions which are sealed by default. I had to post my own point of view on the matter, although it is going to be awkward. Not in the teenage, “define me” sense, but in my choice of language: I can’t really quote him effectively, so you’ll have to make do with me paraphrasing his post :)

Now I’d like to put myself in the I-could-not-disagree-more camp. The default choice in my humble opinion should be to leave classes open and have all members be virtual if you want to take it to the extreme. This would leave the system open for change just as the SOLID principles state. Java got it right in my opinion.

To be able to make the decision on whether a class should be open for inheritance you’d have to travel to the future to see what the class might be used for. If you’re anything like me you’re probably challenged in the time travelling department, and so I postulate that you can’t really make a good decision in the matter. More often than not closing the system for change will be the wrong choice as requirements and environments change.

I do agree with Brian’s statement that sealing a class takes away options, thus creating a simpler API. I would, however, also like to point out that there are better ways of achieving a simple API. How about not exposing the type at all? Why not create a simple interface which exposes only what is needed for the task at hand?
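To make the idea concrete, here is a minimal sketch in Java (the language the post credits with getting the defaults right). All the type names are made up purely for illustration; the point is that a narrow interface keeps the API surface small while the class behind it stays open for change:

```java
// Hypothetical names, purely for illustration.
interface PriceCalculator {
    double priceFor(int quantity);
}

// Open class: subclasses can override unitPrice() without the
// public API surface growing beyond the interface.
class SimplePriceCalculator implements PriceCalculator {
    protected double unitPrice() {
        return 10.0;
    }

    @Override
    public double priceFor(int quantity) {
        return unitPrice() * quantity;
    }
}

// A future requirement, bulk pricing, slots in via inheritance
// without touching the original class.
class DiscountPriceCalculator extends SimplePriceCalculator {
    @Override
    protected double unitPrice() {
        return 8.0;
    }
}

class Demo {
    public static void main(String[] args) {
        PriceCalculator calc = new DiscountPriceCalculator();
        System.out.println(calc.priceFor(5)); // prints 40.0
    }
}
```

Callers only ever see the interface, so leaving the class open costs nothing in API simplicity.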

Please don’t make sealed the default choice for your classes. Go with open classes and live a happy life with a system which is open for change. Trust me, I’ve seen systems which adopted a closed stance, and it wasn’t pretty. The team kept hitting the wall with the changes they wanted to make, simply because the original developer had no time machine which enabled him to foresee the changes which future members of the team needed to implement.

posted on Tuesday, January 27, 2009 12:19:30 PM (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Sunday, January 25, 2009

This is going to be the last post in which I mention Twitter… seriously. In fact I’m going to start right now by not talking about Twitter but instead I’m going to focus on a side effect of Twitter: Corporate Tweeting. (You would in fact be correct if you assume that I just made that term up :))

The Vertical Niche

Like Google, Twitter has got the market for short public messages pretty much sewn up. Does that mean that there isn’t a market for short public messages anymore? As Google has so clearly shown, sewing up a market doesn’t mean that others can’t compete in that same market. It’s all about the vertical niche, baby!


Yammer is the New Black

What IMDB is to Google, Yammer is to Twitter. Before I dive into what Yammer is, let me start out with a challenge we have at Vertica: as we spread to different geographical locations, how do we keep the company spirit going strong? How do we make the departments one coherent company with the same values and a sense of collectiveness?


We spent a couple of meetings debating that very issue, and of course the good old ones like company outings, shared social events, and waxing each other’s backs all came up. For me the most interesting one, aside from waxing each other’s backs, was to try and use Twitter while also allowing for the usual private chit-chat which goes on inside a company. Some jokes are best kept inside the company… like, you know, that waxing one. You get my point, right?

Yammer

Yammer has set up shop with a Twitter clone which is ideally suited for running private Twitter-like networks. Basically all you need are e-mail addresses on the same domain and you’re golden. Sign-up is stupid easy: enter your e-mail and you’re good to go.

From there it’s smooth sailing with a nice Adobe AIR client (surprise, Adobe AIR is not just for Twitter clients!) which gives you the ease of posting new messages that you’re familiar with from that other network which I won’t mention from here on in.

At Vertica, Yammer is quickly turning into a question and answer service, which translates directly into increased productivity because A) you don’t have to know who knows what, you just ask the question and someone will chime in, and B) you don’t interrupt people who don’t want to be interrupted, because if they’re not looking they won’t answer.

Now whether or not it will actually serve its original purpose remains to be seen. The new office in Zealand is still under a month old and quite small, so I guess we’ll just have to wait and see. What’s interesting though is that people at the first office were very quick to adopt Yammer.

posted on Sunday, January 25, 2009 7:00:00 AM (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Monday, January 19, 2009


In a previous post I wrote about Twitter and what it means to the Danish developer community. The real value of Twitter, however, does not come from visiting the site from time to time. You have to participate actively to keep the conversation going, and that’s where the Twitter clients come into the picture.

I’ve been through a bunch of them and ultimately decided which one I liked the best. I’ll try and spare you from doing the same all over.

Digsby

Digsby gets an honorable mention because it was my first Twitter client, because this program is how I got started with Twitter, and because it is in no small way the reason why I still use it.

Digsby is labelled a social network client which gives you access not only to Twitter, in fact that’s the least of it, but also to Messenger, LinkedIn, Facebook, Yahoo Chat, Google Talk, the list goes on and on but you get the point. Digsby speaks with most social networks out there.

That was my reason for trying it out, as I really didn’t feel that I needed a dedicated program to try out Twitter. I spent quite some time with Digsby and felt for a long time that it was the way to go. In fact the reason I dropped it was not so much Twitter related as it was Messenger related. It simply didn’t work as advertised; sending files, for one, was spotty.

As a Twitter client it performed admirably and for me at least it was a low cost to pay for trying out Twitter as I used it primarily as a Messenger client with the added benefit of being able to send out my tweets as well.

Twitterrific

Twitterrific is an interesting one, as it didn’t start out on the desktop for me. It actually started out on my iPhone, and when I got a Mac late last year it was the natural choice for the desktop as well; the iPhone experience with this thing is flawless as far as I’m concerned.

Now the application is pretty much the same on the Mac. Interestingly it turns out that the functionality doesn’t quite cut it on the desktop. Due to the nature of tweets, messages need to be as compact as they can be.


Imagine that you’re posting a link which can easily be 50-60 characters; at that point you really want to be able to shorten the link easily and post the short version instead. Unfortunately Twitterrific doesn’t support this, which is fine on the iPhone where cut and paste is not to be found, so you tend not to post links there. On the desktop though links are thrown left and right, so not having the feature is a real pain point – at least for me.

Thus Twitterrific was evicted from the Mac desktop but remains on the iPhone as one of the first apps I ever installed on that thing.

twhirl

Before I delve into twhirl, a word on Adobe AIR. Not so much because I find the platform interesting, but because I find it interesting that as a platform a lot of the ecosystem is made up of … wait for it … Twitter clients. It’s interesting to me that a service like Twitter can drive a platform like AIR and not the other way around.

twhirl is pretty much like Twitterrific, only the name is quite a bit easier to spell and it supports the link shortening feature I mentioned above. It being an Adobe AIR app also means that it’s cross platform, for those of us running cross ethnic platforms out there.

twhirl is like the girlfriend you can’t quite figure out if you want to spend your life with or leave for someone else. I left but ultimately came back so I guess it’s forever between us :)

And finally remember to follow me on Twitter once you get your favorite client up and running :)

posted on Monday, January 19, 2009 11:58:14 AM (Romance Standard Time, UTC+01:00)  #    Comments [3] Trackback
# Sunday, January 04, 2009

Back in May 2008 I wrote a short note about me trying out Twitter. At the time I just wanted to know more about what Twitter actually was, as I heard about it time and again on podcasts, blogs, everywhere really.

Interestingly whenever people talked about Twitter it was due to the service being down but still I felt compelled to take it out for a spin.

Twitter of course is the service which enables you to post little notices about what you’re currently doing which doesn’t sound all that useful until you actually sit down and think about it. In reality it turns out that there are numerous applications for a service like that. The notices are limited to only 140 characters which means that you have to be really short and sweet in the stuff you send to the service.

Fast forward to January 2009 with the experiment done and my conclusion is in: Twitter is indeed a service worth paying attention to. Read on to find out why.

Now what prompted this post is a question I got from Brian Rasmussen when I suggested that he take a look at it. Basically he asked why he should use Twitter, a question I didn’t quite know how to answer with anything but, “it’s cool”. Since then I’ve been wondering what makes Twitter worth my while, and yours as well, dear reader.


Twitter is a lot of things to a lot of people. The value to me and our little community in particular lies in tying together everybody in a more coherent way than what is possible today. To me at least Twitter is a place where I get to keep in touch with a number of the Danish .NET developers in a far more personal way than what is possible at DotNetForum, ActiveDeveloper, etc. because the service is geared for throwing stuff out there without thinking too much about it.


Why do I call it the back channel of our community? Due to the nature of the messages you stick on Twitter, it quickly becomes just little notices about what’s going on right now. For example, Mads used it to get an idea of which IoC framework to go with; I recently got a Mac and had no clue where to start, so I elicited suggestions for apps to use; Niels uses it for communicating with the Umbraco team from time to time; recently Jesper wanted to know what to include in his ASP.NET MVC presentation coming up at ONUG in January; and Rasmus had a memory leak which he needed some input on fixing.


Basically what you get is an inside look at the process leading up to a blog post, a presentation, the solution to a given issue, or whatever; something you don’t really get from reading the final product, and oftentimes much more interesting.

I would encourage you to go create an account with Twitter and follow a bunch of people from the Danish .NET community. Morten from DotNetForum was even kind enough to create a wiki with the Twitter names of a bunch of the Danish .NET guys, which you can use as a starting point. You can follow me using my Twitter name publicvoid_dk.

Of course there are a number of people whom I’d like to see get Twitter accounts, like Brian Rasmussen, Søren Skovsbøll, Mark Seemann, Kasper Bo Larsen, and Martin Bakkegaard Olesen.

posted on Sunday, January 04, 2009 1:41:19 PM (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Monday, December 29, 2008

I had grand goals for 2008 when we started out the new year last time around, only stuff happened and my activity level on this blog has not been up to the goals I initially set out to reach. In spite of that I'm very happy with my accomplishments for 2008. They just happen to have occurred in a slightly different way than I originally thought.

The Blog

Surprisingly the most visited and commented post on the blog during 2008 wasn't even written during 2008. It caters to the more mainstream internet users, was written in 2006, and is about an annoyance I had with Windows and the My Music folder which disappeared from time to time.

But we are looking back at 2008 here, so it's fitting to mention the posts I'm most proud of that were actually written during 2008. First up is my Developing with Commerce Server 2007 series in which I dove into the development experience of Commerce Server. Also on the topic of Commerce Server 2007, I wrote a post on a generic mapping piece I did for a project early in the year which turns CS objects into nice POCO objects for better testability.

Work

Of course there was real work to be done, and 2008 brought some really interesting challenges, with me participating in one of the largest e-commerce projects I've ever had my hands on. Huge customer, international team of devs, traveling across the Atlantic to do some of the work. All in all a great learning experience, and as a result I'm now able to provide even better service to our customers. Oh, and it was kinda fun too :)

I got to attend a couple of conferences as well. First Daniel from Microsoft was nice enough to invite me to JAOO, a conference I enjoy a great deal, and later in the year I had a unique chance to fly out to Los Angeles to participate in PDC 2008. I have to say that if you ever get a chance to participate in a conference like the PDC you really should jump at it. It's a spectacular show to be sure. I did a couple of podcast episodes about it too; in Danish, mind you.

Finally I'm happy to report that we managed to add a number of very talented people both to my own team at Vertica and to the integration team as well. I'm proud to have such great colleagues and to be able to say that every day I learn something new as a result.

Aarhus .NET User Group

Now, as I said at the start of this post, I haven't spent as much time on the blog as I would have liked, and there's a really good reason for that: the Aarhus .NET User Group, which has sucked up a significant part of my time.

During 2008 the core group and I organized thirteen meetings, indeed we didn't miss a beat the entire year and even managed to do a bonus meeting in December with my good colleague Daniel about unit testing. Additionally we pulled off a code camp in the beginning of the year, the ANUG 1 year old birthday dinner, and a Christmas Dinner. Not too shabby if I do say so myself.

Support for the user group during 2008 was tremendous, and I couldn't be happier about where we're at after just one and a half years of operation.

More importantly we've shown other .NET developers in the Danish community that a user group in Denmark is viable and as a result new groups have sprung up during 2008. As I write this groups are up and running in Odense (ONUG), Aalborg (AANUG), and Copenhagen (CNUG).

ANUGCast (www.anug.dk/podcast)

Ever since we started the user group we've had requests for putting the meeting content online somehow, be it video, audio, or something else entirely. What we did from the start was write meeting summaries which weren't really the ideal way to bring the content online. It's adequate and we'll continue to do so but it's been clear from the start that it was far from sufficient.

Late in 2008 it struck me that the podcast format might be the ideal way of addressing the requests. With that in mind I set out to create a podcast based on the topics of the meetings. With that ANUGCast was born with the initial goal: to bring out an episode once a month. This quickly escalated to one per week and so far it's gone really well. In fact episode thirteen was posted today and I've got a bunch of episodes already in the can just waiting to get released.

The podcast is my little baby and I guess most of the time which would otherwise have been spent on the blog got diverted there. I enjoy hosting the podcast a great deal, so much so in fact that I'd do it full time if I could :)

Since starting out the podcast I've gotten it registered with more than 50 aggregation sites, we're on iTunes, and we've had more than 5000 downloads since the pilot episode in September 2008, a number I'm particularly proud of. We've seen a steady climb in downloads since the pilot episode, and the past couple of months saw more than a thousand downloads each.

I guess I should do a couple of posts on how ANUGCast is made and some of the tricks I picked up wearing the hats of producer, sound engineer, basically every damn hat needed to make it happen :)

2009

The coming year will bring a similar activity level on the blog as 2008. It is my every intention to keep up my work with the user group and the podcast, and even step it up a bit. 2009 will bring more real marketing of the user group to reach a new audience, which I'll write more about after we hold the first meeting of 2009. There's something to look forward to for sure. 2009 will also bring our first IT pro related meeting, which will cover Hyper-V. It's intended as a pilot to kinda test the waters for something like that.

Oh and I went and got myself a Mac so I guess I'm sort of a Mac switcher as of December 22nd... 2009 is going to be interesting for sure.

posted on Monday, December 29, 2008 10:41:45 PM (Romance Standard Time, UTC+01:00)  #    Comments [1] Trackback
# Tuesday, November 11, 2008

At Vertica we employ a wide range of Microsoft server products in our solutions to maximize customer value. To help us manage these often complex environments we rely heavily on virtualization. For the longest time the obvious choice was Microsoft Virtual PC simply because it was there and freely available to use and just being able to run a virtual machine was amazing in its own right.

Our default setup when developing in the virtual environment is to install everything needed inside the virtual machine and use that exclusively. Running IIS, a couple of server products with Visual Studio and ReSharper works well but we’ve found that performance leaves something to be desired.

The obvious answer is to move Visual Studio out of the virtual environment, do development on the physical machine, and deploy the code to the virtual environment and test it there. Basically I require two things from this: 1) Pushing the build to the server should be very simple, 2) Debugging must be supported.

Pushing Code

We’ve got a bunch of options for pushing code to another environment: Publish Wizard in Visual Studio, msbuild or nant tasks, Powershell, and my personal favorite bat files :)

I wanted to create a generic solution which doesn't dictate the use of msbuild or any other technology, so I went with a bat file which in turn calls robocopy. With this in place we're able to push files over the network to the target VM. Of course a one-time configuration of the virtual environment is needed, but that isn't in scope for this thing.

Download my deploy bat file. Basic usage: Deploy c:\MyWebSite \\MyServer\MyWebSiteVDir.
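For reference, a script along those lines could look something like the sketch below. This is my reconstruction, not the actual file behind the download link, and the robocopy switches are assumptions you'd tune to your own project:

```bat
@echo off
rem Sketch only - the switches are illustrative, not the original file.
rem Usage: Deploy <source folder> <destination share>
if "%~2"=="" (
    echo Usage: Deploy c:\MyWebSite \\MyServer\MyWebSiteVDir
    exit /b 1
)
rem /MIR mirrors the source tree to the destination;
rem /XD excludes folders we never want pushed to the server.
robocopy "%~1" "%~2" /MIR /XD .svn obj
```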

Robocopy is part of the Windows Server 2003 Resource Kit.

Remote Debugging

The second requirement is debugging. I want a solution which is on par with running Visual Studio inside the virtual environment, and that means debugging, people! :)

The steps for doing remote debugging are well documented, but for completeness' sake I will include them here with nice screenshots to go along.

1) Copy Remote Debugger from \program files\Microsoft Visual Studio 9.0\Common7\IDE\Remote Debugger to somewhere on the virtual machine, e.g. desktop.

2) Run Remote Debugger on virtual machine (msvsmon.exe).

3) Grab the qualifier from the Remote Debugger (You’ll need it in a second).

Remote-Debugger-Qualifier

4) Connect to the Remote Debugger from VS on the physical machine via Debug > Attach to Process (Ctrl+Alt+P).

5) In the Qualifier input field enter the qualifier from Remote Debugger window.

Visual-Studio-Attach-To-Process 

Voila. Set a breakpoint on the remote machine and see the code break in Visual Studio.

VMWare

I stated earlier that we’re using Microsoft Virtual PC, which is true, but it’s also true that we’re looking into VMWare Workstation. My first reason for doing so is the performance boost which comes from running in VMWare. I haven’t done any sort of scientific testing of how much faster we’re talking about; suffice it to say that it’s enough that you notice it when you’re going about your business in the virtual environment. VS is faster, compiles are faster, everything is just smoother. In my book that’s the best sort of performance metric there is :)

Additionally VMWare provides other interesting features. The first one you’ll see is that storing and restoring state of a VM is blazingly fast. Enough so that you’ll actually find yourself using the feature all the time. I know I am.

Secondly, VMWare supports multiple monitors. That’s right. Simply select how many monitors you want supported and it’ll do it. You can even switch on the fly. In case you’re wondering, yes, we do have three-monitor setups for all the developer machines in the office :)

VMWare-Workstation-Multiple-Monitor-Support

The final feature is significant enough for our story to warrant a paragraph of its own. I accidentally stumbled across it this morning when I upgraded VMWare to version 6.5.

Remote Debugging Support in VMWare

You read my earlier steps to get remote debugging working which will work for any sort of virtual environment. VMWare however brings some nice debugging features to the table available right there in Visual Studio.

1) Go to the VMWare menu and select Attach to Process.

VMWare-Workstation-Debug-in-Virtual-Machine

2) Select the VM you want to start debugging on and point to the Remote Debugger that you’ve got locally in \program files\Microsoft Visual Studio 9.0\Common7\IDE\Remote Debugger\x86.

VMWare-WorkStation-Attach-to-Process-in-Virtual-Machine

3) Click the Attach button and the Remote Debugger will launch inside the VM and you’re ready to debug.

No need to copy anything to the VM. It just works. You can even setup a config for this which enables you to attach to the debugger with F6. Nice!

In conclusion running Visual Studio outside of the VM is not only possible but with the right tools like VMWare in hand it’s even an enjoyable experience. Have fun!

posted on Tuesday, November 11, 2008 10:35:21 AM (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Sunday, October 05, 2008

Day 3 of JAOO was a potpourri of topics for me, everything from JavaScript as an assembly language, JavaScript for building an OS, developer best practices, and data synchronization with a shiny new Microsoft toy. If you didn’t catch my summaries of Day 1 and Day 2 please make sure that you check them out.

Five Things for Developers to Consider

Last year I attended a couple of developer best practices sessions and came away liking them quite a bit, so I figured I should attend at least one this year as well. The first one this year was basically five things which Frank Buschmann and Kevlin Henney collectively consider to be important to developers.

Of all the things they pulled out of their hats, I liked their points on expressiveness the most. They talked about bringing out concepts which are implied in both the architecture and the low level design of a solution; something we strive to do as well. One of the key aspects when writing code, I find, is that more often than not code is written once and read dozens of times, which means optimizing for readability is not only a good thing to do but the only thing to do.

An example of the above is variables of type string. Usually these guys contain a lot more than mere strings, e.g. XML, social security numbers, etc. Instead of going with just the string you could go for a class like SocialSecurityNumber, which would be a lot more explicit. The little things count.
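A minimal sketch of the idea in Java (the talk itself was language-agnostic; the class name comes from the example above, while the ten-digit format check is an assumption purely for illustration):

```java
// Wraps a raw string in an explicit type so the intent shows up
// in every signature that uses it.
final class SocialSecurityNumber {
    private final String value;

    SocialSecurityNumber(String value) {
        // Illustrative validation only - real rules vary by country.
        if (value == null || !value.matches("\\d{10}")) {
            throw new IllegalArgumentException("Not a valid SSN: " + value);
        }
        this.value = value;
    }

    @Override
    public String toString() {
        return value;
    }
}
```

Now a method taking a SocialSecurityNumber documents itself in a way one taking a plain String never will, and the validation lives in exactly one place.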

Developer habitability is a term they touched on which I quite like. The idea is that if we create nice usable solutions which are easy to understand and simple in their composition developer habitability is increased – basically the code place is a nice place to live :)

Keeping in Sync

Two-way synchronization is a notoriously difficult challenge to solve. Mostly when I’ve come up against this thing I’ve gone for a simpler solution, like selecting a data master which overrides the slaves. Naturally I was excited to learn that Mike Clark was giving a talk on the Microsoft Synchronization Framework, which tackles this very issue.

Sync Framework actually forms the backbone of a tool you might know already: SyncToy, which syncs files across the network, file system, or whatever. Certainly a neat feature, but Sync Framework is about much more than that. It basically enables us to synchronize our custom data stores, which to me is very exciting.

Included in the box is support for all data stores which have an ADO.NET Data Provider so we’re talking all major databases here. Additionally the framework gives us rich hooks so we can grab any step in the pipeline and modify it to our heart’s content.

A JavaScript OS

Really? An OS done in JavaScript? Apparently so, if Dan Ingalls has his way. Actually he’s already done some amazing work on this Sun project, which aims to liven up the web by doing away with HTML, replacing it with vector graphics rendered by a JavaScript engine.

Actually my words won’t really do it justice so instead take a look at this video; basically the entire talk. Once you’re done with that go play with Lively Kernel live on the web.

JavaScript as an Assembly Language

Keeping in the same vein, I decided to go take a look at Erik Meijer talking about his current project: Volta. Volta is a project aiming to allow us to defer decisions on the deployment model to a much later point in the project than what we currently do today. The current state of affairs is pretty much that we need to decide very early in the project phase, which might or might not make sense. In any event, having the option to defer those kinds of decisions is always better, right?

Now the part which Erik focused on is the piece which allows us to run our apps on the web without actually coding for the web. The premise here is that we can view JavaScript as a language which the JIT compiler can target, generating a web implementation of our app which then runs without any web-specific code ever written by us as devs.

Last year Erik gave the keynote at JAOO and talked about Volta at which time I was skeptical to say the least thus it was interesting to actually see that there’s some meat on the project after all. The idea is interesting to say the least and I look forward to seeing where it goes from here.

With two “extreme” JavaScript session done I was all JavaScripted out for the day but I will say this: My days doubting JavaScript as a “serious” language are way behind me.

TDD Take 2

One of the big topics for me last year at JAOO was test driven development, so I was curious to see whether new stuff had come up in the intervening time from then to now. Giving the talk on TDD was Erik Doernenburg. I won’t go into a lot of detail about the talk because, as it turns out, not much has changed in the span of a year.

What was interesting for me to note is that our work with unit testing and test driven development at Vertica has paid off handsomely, as everything that ThoughtWorks, which I would describe as the thought leaders in this space (no pun intended), are doing is basically what we’ve spent the last year implementing, and I’m happy to report that we’re at the point where the culture is basically sustaining that particular way of doing code.

So a year and a half ago I set the goal of becoming better at doing unit testing, and my great colleagues have ensured success in that area. For the coming year the focus will be on builds, continuous integration, and release management. To me these are natural steps in the continued development of our way of doing things … and it’s fun too :)

posted on Sunday, October 05, 2008 9:48:05 PM (Romance Daylight Time, UTC+02:00)  #    Comments [0] Trackback
# Friday, October 03, 2008

Day 2 of JAOO 2008 was all about architecture for me: agile architecture, architecture reviews, requirements gathering, architecture testing, and finally lessons learned in architecture. Be sure to catch my summary of JAOO 2008 Day 1 if you missed it.

Architecture Reviews

Frank Buschmann from Siemens in Germany was track host and also the first speaker of the day. I caught a couple of talks with Frank last year and it’s apparent that he knows his stuff. While hugely important, the architecture talks tend to be quite difficult to follow because the very nature of the topic is fluffy.

Most of the talk was pretty run of the mill in terms of how to conduct an architecture review. I’ve never formally conducted such a review but we do do them at regular intervals at Vertica just not in any sort of structured manner. We do them when they make sense and they usually consist of peer reviews and initial design sessions.

Most interesting to me were a couple of techniques which Frank brought to light for doing a formal architecture review. It’s not something you do every day, and it’s certainly something which requires a lot of structure.

My key takeaway from the talk is that preparation for an architecture review is essential. Basically you need to sit down and try and figure out what you or the client expect from the review, as the goal will impact the process of doing the review. This highlights why we can get away with very informal reviews: our goal is usually simply to verify the selected architecture.

Now the situation changes rapidly when we’re conducting architecture reviews for other companies. Here the objective is both to verify the architecture and, more importantly, to figure out what went wrong after the fact when a new system doesn’t satisfy non-functional requirements, lacks adoption in their internal dev organization, lacks maintainability, or something else altogether.

So I took away the fact that I need to be a lot more conscious about what the client expects to get out of a review. I must admit that I’ve taken a lot of satisfaction from going in and pointing out all the deficiencies in existing systems without giving thought to the fact that, more often than not, systems do have something good to bring to the table in spite of their deficiencies, perceived or otherwise.

Requirements Gathering

Next up was a talk which I didn’t really know what to expect from. It turned out to be one of my favorites at this year’s JAOO because it was very different from what I’ve seen at any other conference, and it covered a topic the importance of which I can’t stress enough: communication.

The speaker was Chris Rupp from Sophist, where she is the CEO and a business analyst. Before I get started with my summary I must mention the fact that she spoke flawless English; a feat I’ve rarely seen performed by a German person. No hint of accent, nothing, just perfect English.

The meat of the talk was all about understanding what your client is telling you and, more importantly, filling in the blanks. The premise of the talk was basically something that we’ve known collectively in the software business for a while: the client doesn’t know what he or she wants. She had a twist on this, though, that I couldn’t agree with more, which went along the lines that we can’t expect the client to know what they want. Building software is a complex task, and it’s our responsibility as a community to help our clients figure out what they want.

Chris touched on quite a number of different techniques we can employ to fill in the blanks. I was very pleased that she decided to focus on just a single one, called Neuro-Linguistic Programming (NLP). My very limited understanding of NLP is that it’s basically a theory of the programming language of the human mind. What I took away from the talk is that NLP might be the key to picking up subtle nuances in the conversations I have with clients. Is a sentence phrased imprecisely? Maybe the client doesn’t really know what the details should be in that particular case. Is the client using very general terms to describe a feature? That could mean we’re lacking some details; maybe we shouldn’t really allow everybody to update everything.

As I stated, my understanding of NLP is very limited at this point, but I definitely see a lot of potential here, so I went ahead and suggested that we get some books on the subject so we can investigate further. I’m hooked at this point, no doubt about it.

Agile Architecture

James Coplien did what I thought would be a pretty standard only-design-and-build-the-architecture-you-need-right-now kind of talk. Indeed he started out like that, but he quickly went on to blow our collective minds by proposing a new style of architecture where we separate the What and the Why more clearly. Now I won’t claim that I understood half of what he was saying, but I got the general drift and I definitely need to look into this some more.

If I were to compare it with something I know from the domain-driven design world, I’d compare it with the Specification pattern on steroids. But I feel that’s a poor comparison, as his ideas describe the overall solution architecture, whereas the Specification pattern covers just small bits and pieces of any given solution.
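For readers unfamiliar with the Specification pattern mentioned above: it encapsulates a business rule in a small object that can be composed with other rules before being applied. Here is a minimal Java sketch of the general idea (the `Customer` type and the rules are my own illustrative examples, not anything from the talk):

```java
import java.util.function.Predicate;

// A Specification wraps a single business rule and supports composition.
interface Specification<T> {
    boolean isSatisfiedBy(T candidate);

    // Combine two rules into one; both must hold.
    default Specification<T> and(Specification<T> other) {
        return candidate -> this.isSatisfiedBy(candidate) && other.isSatisfiedBy(candidate);
    }
}

// Hypothetical domain object used only for illustration.
class Customer {
    final int age;
    final boolean active;
    Customer(int age, boolean active) { this.age = age; this.active = active; }
}

public class SpecificationDemo {
    public static void main(String[] args) {
        Specification<Customer> adult = c -> c.age >= 18;
        Specification<Customer> isActive = c -> c.active;
        Specification<Customer> eligible = adult.and(isActive);

        System.out.println(eligible.isSatisfiedBy(new Customer(30, true)));  // true
        System.out.println(eligible.isSatisfiedBy(new Customer(15, true)));  // false
    }
}
```

The point of the comparison is composability: each rule lives in one small object, whereas Coplien’s ideas, as I understood them, apply that kind of separation at the level of the whole solution.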

To better understand the concepts I need to see a lot more source code :) You can download the pre-draft of the book James is writing on the subject; I think you’ll enjoy the new ideas a great deal.

Software Architecture and Testing

…. zzZZZzz…. nuff said.

Top Ten Software Architecture Mistakes

Needless to say I was not in the most energetic of moods, having sat through the snooze fest that was the previous talk. The guy in front of me must have agreed, as he actually nodded off for a while during the testing talk. It was actually pretty entertaining watching him do battle with very heavy eyelids, the mightiest of foes :)

At least Eoin Woods (cool name or what?) took up the challenge and turned the whole mess around in the next talk, in which he discussed his list of the top ten architecture mistakes. Being in the last slot of the day is no easy task, but he managed to get the entire room going: lots of laughs, lots of good stories, and lots of good information.

His talk basically served to highlight some of the mistakes that we’ve all made and continue to make from time to time. I believe that talks like this are invaluable as they serve to keep us mindful of at least some of the pitfalls of software architecture.

I liked the fact that this talk contained nothing but concrete examples and real-world tips and tricks we could take home with us and use. My favorite takeaway is to always have a plan B. I think most good architects subconsciously have one hanging around, but I like the idea of making plan B very explicit. It helps the team decide if and when to enact it.

Just formulating plan B and putting it into a document is hugely valuable to me: it gives you pause, helps you think through plan A, and means the customer ultimately gets a better solution; should plan A, God forbid, turn out to be a dud, we’ve got something to fall back on. Having plan B visible also leaves more wiggle room for the client, and I firmly believe it helps build trust.

Continue to JAOO 2008 Day 3…

posted on Friday, October 03, 2008 10:33:17 PM (Romance Daylight Time, UTC+02:00)