# Wednesday, 03 June 2009

Back in 2007 I was faced with designing a multi-currency catalog solution for Commerce Server. I knew from previous experience that we would need a general-purpose solution which we could employ for many clients.

Back then Commerce Server 2007 was new, so its performance characteristics were new territory as well. I ended up designing a solution based solely on virtual catalogs: basically one catalog for each price group/currency. You can read the details on multi-currency/price groups, an approach we've employed successfully numerous times since then.

Since 2007 we’ve gotten a new version of Commerce Server, though no significant new features were added to the catalog system, and we’ve gained valuable new knowledge about both the pricing scenarios and the limitations of the Commerce Server catalog system. With that information in hand I’m going to try my hand at redesigning the original multi-currency catalog structure from 2007 to both address the performance issues and increase flexibility.

Virtual Catalogs are Slow

Virtual catalogs are slow, and so they make a poor choice for expressing what is essentially a single piece of data. You might say that virtual catalogs perform perfectly well if you materialize them, and you would be right. You gain on the order of 10x performance by materializing a virtual catalog, but lose the ability to make any changes to it in the process.

Basing my original multi-currency solution on virtual catalogs seemed to make perfect sense, but with the added requirement of a modifiable category hierarchy, two levels of modifiable virtual catalogs were needed, and thus the ability to materialize was lost.

To make matters worse we rarely deal with clients who are inclined to go with an Enterprise license for their Commerce Server solution, so we don’t see very many full-fledged staged solutions, which would otherwise take care of the challenge handily.

Virtual Catalogs Two Levels Deep

Virtual catalogs are by nature limited to two levels, i.e. you can have a total of three levels of catalogs: one with base catalogs and two with virtual catalogs. If we use one of these levels of virtual catalogs for pricing, we lose it for other purposes and gain very little other than the ability to store another price group per catalog.

Pricing is a Separate Issue

What I’ve come to realize over the years is that product pricing is a completely separate issue from the product itself. While the two might seem like one and the same, in reality they aren’t. Sure, the customer needs to be told a certain price, but more often than not the price is determined by the context surrounding the product, rather than the product itself.

An example would be a seasonal business which is highly dependent on calendar time; the product would probably sell at a higher rate at specific times of the year and at lower rates the rest of the year, e.g. ChristmasTreesOnline.com. The context in this particular instance is time.

Now for business-to-business scenarios the context might be especially convoluted, as you might go for pricing granularity which allows organizations, groups within the organization, even individual people in the organization to have specific prices, e.g. gym memberships provided by your company or framework agreements made between supplier and customer. The context deciding the price is who you are in this case, not the product itself.

Pricing is a Service

To handle the separate issue of pricing we need something akin to a service in domain-driven design parlance. The service is responsible for looking up the right price based on whichever context is present for a given customer request.
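To make the idea concrete, here is a minimal sketch of what such a pricing service might look like. All the names here (PricingContext, PricingService, and so on) are made up for illustration; they are not part of the Commerce Server API.

```java
import java.math.BigDecimal;
import java.util.HashMap;
import java.util.Map;

// Hypothetical context object: everything surrounding the request that may
// influence the price (price group, currency, etc.). Illustrative only.
record PricingContext(String priceGroup, String currency) { }

// The service looks up the right price for a product given the context.
interface PricingService {
    BigDecimal getPrice(String sku, PricingContext context);
}

// A trivial in-memory implementation to show the idea: prices are keyed by
// SKU plus context, so the same product can carry many prices.
class MapBackedPricingService implements PricingService {
    private final Map<String, BigDecimal> prices = new HashMap<>();

    void setPrice(String sku, PricingContext context, BigDecimal price) {
        prices.put(key(sku, context), price);
    }

    @Override
    public BigDecimal getPrice(String sku, PricingContext context) {
        return prices.get(key(sku, context));
    }

    private static String key(String sku, PricingContext context) {
        return sku + "|" + context.priceGroup() + "|" + context.currency();
    }
}
```

The point of the sketch is the shape of the API: callers hand the service a SKU and a context and get a price back, without knowing where the price is stored.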

Of course we need some sort of structure to maintain the pricing for individual price groups, and catalogs come in handy to solve this as I’ll show you next.

The Product Catalog

Our product catalog would in the new scheme continue to exist as we know and love it, with one minor exception: the list price of the product is either to be ignored or used only to get an idea of what the pricing is like. The list price will not be picked up from the product catalog, which contains the marketing data for a product. Please note that I define marketing data in this instance as data used for display to potential customers, but which otherwise serves no purpose to the system.

The Pricing Catalog

In Commerce Server we have the notion of different types of catalogs, i.e. the product catalog and the inventory catalog. I’m going to introduce a third kind called the Pricing Catalog. As you might imagine a pricing catalog concerns itself only with price and as such contains only the bare minimum of data to identify a product.

The pricing catalog will have metadata to indicate that it is in fact a pricing catalog. Each pricing catalog reflects a single price group such as “Internet Users”, “Gold Customers”, whatever makes sense for the particular scenario.

Having pricing split out like this means that we can price products based on the calendar-time context, as a standard Commerce Server catalog has dates associated with it which allow us to display it only within a given period of time.

For a pricing catalog these fields are used to determine whether a price is valid or not, so you could express the seasonal pricing for our ChristmasTreesOnline.com as two different pricing catalogs: one for holiday pricing and one for the rest of the year. The pricing service would then grab the price from the proper Pricing Catalog and display it to the customer.
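The seasonal lookup above could be sketched roughly like this; the type and method names are hypothetical, but the dates play the same role as the validity dates on a standard Commerce Server catalog.

```java
import java.time.LocalDate;
import java.util.List;
import java.util.Optional;

// Hypothetical pricing catalog carrying the start/end dates a standard
// Commerce Server catalog has; illustrative names, not the real API.
record PricingCatalog(String name, LocalDate validFrom, LocalDate validTo) {
    boolean isValidOn(LocalDate date) {
        return !date.isBefore(validFrom) && !date.isAfter(validTo);
    }
}

class SeasonalPricing {
    // The pricing service picks the first catalog whose validity window
    // covers the requested date.
    static Optional<PricingCatalog> catalogFor(List<PricingCatalog> catalogs, LocalDate date) {
        return catalogs.stream().filter(c -> c.isValidOn(date)).findFirst();
    }
}
```

With a "Holiday Pricing" catalog valid through December and a "Regular Pricing" catalog covering the rest of the year, a December request resolves to the holiday catalog and a June request to the regular one.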

Pricing Definitions

Finally I propose a new kind of definition called the Pricing Definition. This is a specialized Product Definition used for creating prices in the Pricing Catalogs for advanced scenarios, e.g. complex pricing matrices defined in external systems such as an ERP.

Products, i.e. prices, created based on a Pricing Definition would contain at least the SKU, name, description, and of course the list price. These specialized products go into a Pricing Catalog as discussed in the previous section.

Tying it All Together

Another context we discussed in a previous section is the organizational context, which might also influence product pricing. Fortunately Commerce Server comes with CatalogSets as a neat way of bundling catalogs together. CatalogSets leveraged with our Product Catalogs and Pricing Catalogs would allow us to do multi-currency and, incidentally, a bunch of even more interesting scenarios.

Imagine, if you will, a scenario where our online retail outlet would like to give Internet customers access only to the currencies which make sense for their particular region, e.g. here in Denmark Euro and our national currency, Kroner, would make sense, while UK customers should be able to shop in either Pounds or Euro.

Simple! Create two catalog sets: one called Denmark and one called UK. For the Denmark catalog set, select our one product catalog containing all products (or the one which reflects the range available in Denmark) and select the two Pricing Catalogs, Kroner and Euro. For the UK catalog set, select the Euro and Pounds Pricing Catalogs.

By way of the metadata on the catalogs we’re now able to display the same range to both UK and Danish customers, but in three different currencies, with Pounds available only to the British and Kroner available only to the Danish customers.
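The Denmark/UK mapping above boils down to something like the following sketch; the class is illustrative shorthand for what CatalogSets would give you out of the box, not an actual API.

```java
import java.util.List;
import java.util.Map;

// A sketch of the catalog-set idea: each region's catalog set bundles the
// product catalog with the pricing catalogs (currencies) that make sense
// there. Names are illustrative.
class RegionalCatalogSets {
    private static final Map<String, List<String>> PRICING_CATALOGS = Map.of(
            "Denmark", List.of("Kroner", "Euro"),
            "UK", List.of("Pounds", "Euro"));

    // Resolve which pricing catalogs (currencies) a region's customers see.
    static List<String> currenciesFor(String region) {
        return PRICING_CATALOGS.getOrDefault(region, List.of());
    }
}
```

Both regions share the Euro pricing catalog, while Kroner and Pounds are each visible to only one region, which is exactly the behavior described above.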

posted on Wednesday, 03 June 2009 14:21:57 (Romance Daylight Time, UTC+02:00)  #    Comments [2] Trackback
# Monday, 30 March 2009

So I've been migrating my life over to the Mac that I bought late last year and briefly mentioned in my summary post of 2008. Mostly I'm there but one aspect keeps tripping me up: Which blogging tool to use for posting to the couple of blogs I maintain?

On Windows I'm very happy with Windows Live Writer and I figured that with all the creative writing people of the Mac it wouldn't be an issue at all to find a nice comparable tool on the other side.

Boy was I ever wrong in assuming that. For some reason there isn't really a very good tool which has feature parity with Live Writer on the Mac. The most prevalent tool out there is MarsEdit, which to me doesn't fit the bill. It does everything right in the technical department but lacks in one key area: The editor.

Over the years I've grown accustomed to having a couple of features which really help out in the process of writing a new post:

  • The tool must be desktop based. Web interfaces are handy but too cumbersome to work with
  • WYSIWYG editor
  • Auto creation of image thumbnails with links to the original
  • Image formatting tools like alignment and custom margins
  • Support for BlogEngine.NET and DasBlog (categories, upload images via MetaWeblog API)
  • Ideally rich image formatting features like drop shadow

MarsEdit 2
I don't know about you but I expect to be able to edit my posts in a WYSIWYG interface, which might occasionally require me to drop into HTML view to do some of the trickier stuff (read: I've done this maybe four times in the five years I've kept a blog). MarsEdit, however, is built on the notion that the writer should have complete control of the HTML and thus provides nothing but raw HTML editing, even billing it as a feature, not a bug. I'm sorry, but in 2009 I expect so much more from a tool like that. A tool which even requires me to spend $29.95.

I read a review which describes MarsEdit as being very windowy. I think you'll agree when you take a look at the screenshot below. Basically you've got a window for displaying previous posts, a window for the raw HTML editor, and a preview window to display what your HTML looks like. Nastylicious!

Qumana
Qumana was my second attempt at finding a blogging solution on par with what I have on Windows. It was even free, so I was off to a great start. Qumana looks enough like Live Writer that I thought I was home free and stopped looking any further. Qumana is a pretty good tool which gets the job done. However, it lacks polish, which turned me away from it in the end. No support for picture thumbnails was a huge point against it for me.

As far as windowyness goes it's far better than MarsEdit, and it does provide a WYSIWYG editor, which was sorely lacking from MarsEdit 2. To sum up, Qumana comes close but lacks thumbnail support.

Blogo
Now, Blogo is a relatively new tool on the Mac as I understand it. I came across Blogo while listening to Leo Laporte's excellent MacBreak Weekly podcast, in which he's got a segment where the panel picks their favorite tools. Blogo was in there and I decided to check it out.

My first encounter with Blogo was a nice one, up to a certain point when it failed one of my requirements miserably. Read on to find out how.

Hopes were not exactly high when I started using the tool the first time around, but that quickly changed as I set out to create my first blog post. Sure, image preview is sort of a strange feature in the sense that you get a little standard placeholder which shows you that an image is there; as for actual image preview you're out of luck.

Unfortunately Blogo doesn't fully support BlogEngine.NET 1.4.5. It seems like it's almost there, but posting doesn't happen when categories are in the mix. Editing a post after it's posted to BlogEngine.NET also presents some problems: Blogo "sees" the post, but when it's pulled down no content is present inside it. Too bad, as it's really the only piece of the puzzle missing for me to start using Blogo on the Mac instead of Live Writer inside my Fusion-virtualized Windows 7 install.

A particularly nice feature of Blogo is its fullscreen editing, which basically allows Blogo to take over the entire screen to focus your attention on the blog post and nothing else. Love it!

All in all I'm not quite there yet. I'm hoping for support for BlogEngine.NET in a future release of Blogo, although I'm not holding my breath on that one. I already contacted the good folks at Drink Brain Juice (yeah I know :)) but nothing has happened as of yet. Crossing fingers and toes, as it would see an end to that particular dilemma of my migration.


posted on Monday, 30 March 2009 12:37:30 (Romance Daylight Time, UTC+02:00)  #    Comments [0] Trackback
# Wednesday, 28 January 2009

In case you’re wondering why you’re not receiving any updates from my blog in your favorite feed reader, wonder no more. First a little background.

Google last year acquired FeedBurner without much fanfare, and everything has pretty much been quiet since then, with the minor exception that some paid features became free.

This all changed recently when the great FeedBurner migration onto the Google platform started, which screwed people up in a number of interesting ways.

My first attempt at migration was unsuccessful due to the fact that one of my feeds did get migrated in the first go (not completely, mind you, just a little bit), leaving me with my feed both at the old FeedBurner site and at the new Google FeedBurner site.

Of course it gets quite tricky to determine automatically what to do when someone tries to migrate a feed onto a new platform where another feed with the same name exists. Luckily I figured out what was going on, and being in control of both ends of the equation I removed the duplicate feed and tried again…

Now for the reason why you’re not receiving anything from this blog in your feed reader: the second time around the migration was successful, only Google for some reason can’t access my original feed URL at my ISP, which means that the FeedBurner URL returns a nice HTTP 502 error whenever you, dear reader, try to access it.

Until this gets resolved between my ISP and Google (like in a million years) I’ve turned FeedBurner off for the site. Once it’s fixed you will automatically receive the new feed. The downside is that you’ll have to update your reader to use the old feed URL in the meantime: http://www.publicvoid.dk/SyndicationService.asmx/GetRss

I think this illustrates nicely why you should be wary of the cloud computing trend. Indeed it’s a fine proposition, supposing that everything works as it should. It’s quite another matter when the cloud turns out to be filled with hot air and starts failing. All you can really do is sit back and wait for someone, somewhere to do something.

posted on Wednesday, 28 January 2009 15:27:26 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Tuesday, 27 January 2009

Today I read a nice post by Brian Rasmussen in which he describes how to set up Visual Studio to generate class definitions which are sealed by default. I had to post my own point of view on the matter, although it is going to be awkward. Not in the teenage, “define me” sense, but in my choice of language, as I can’t really quote him effectively, so you’ll have to make do with me paraphrasing his post :)

Now I’d like to put myself in the I-could-not-disagree-more camp. The default choice, in my humble opinion, should be to leave classes open, and have all members be virtual if you want to take it to the extreme. This leaves the system open for change, just as the SOLID principles state. Java got it right in my opinion.

To be able to decide whether a class should be open for inheritance you’d have to travel to the future to see what the class might be used for. If you’re anything like me you’re probably challenged in the time travelling department, and so I postulate that you can’t really make a good decision in the matter. More often than not, closing the system for change will be the wrong choice as requirements and environments change.

I do agree with Brian’s statement that sealing a class takes away options and thus creates a simpler API. I would, however, also like to point out that there are better ways of achieving a simple API. How about not exposing the type at all? Why not create a simple interface which exposes only what is needed for the task at hand?
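To make that point concrete, here's a small sketch in Java (the language I just credited with getting it right). The names are made up for illustration: callers depend only on a narrow interface, while the concrete class stays open for the subclassing a future team member might need.

```java
// Callers see only this narrow interface, which keeps the API simple
// without sealing anything.
interface PriceCalculator {
    double calculate(double listPrice);
}

// The concrete class stays open (not final), leaving the system open
// for change.
class StandardPriceCalculator implements PriceCalculator {
    @Override
    public double calculate(double listPrice) {
        return listPrice;
    }
}

// A requirement nobody foresaw: a percentage discount. Because the class
// was left open, a future developer can simply extend it.
class DiscountedPriceCalculator extends StandardPriceCalculator {
    private final double discountRate;

    DiscountedPriceCalculator(double discountRate) {
        this.discountRate = discountRate;
    }

    @Override
    public double calculate(double listPrice) {
        return super.calculate(listPrice) * (1.0 - discountRate);
    }
}
```

The interface keeps the public surface just as small as sealing would, but without the time machine.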

Please don’t make sealed the default choice for your classes. Go with open classes and live a happy life with a system which is open for change. Trust me, I’ve seen systems which adopted a closed stance, and it wasn’t pretty. The team kept hitting the wall with the changes they wanted to make, simply because the original developer had no time machine which enabled him to foresee the changes future members of the team would need to implement.

posted on Tuesday, 27 January 2009 12:19:30 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Sunday, 25 January 2009

This is going to be the last post in which I mention Twitter… seriously. In fact I’m going to start right now by not talking about Twitter but instead I’m going to focus on a side effect of Twitter: Corporate Tweeting. (You would in fact be correct if you assume that I just made that term up :))

The Vertical Niche

Like Google, Twitter has the market for short public messages pretty much sewn up. Does that mean that there isn’t a market for short public messages anymore? As Google has so clearly shown, sewing up a market doesn’t mean that others can’t compete in that same market. It’s all about the vertical niche, baby!


Yammer is the New Black

What IMDB is to Google, Yammer is to Twitter. Before I dive into what Yammer is, let me start out with a challenge we have at Vertica: as we spread to different geographical locations, how do we keep the company spirit going strong? How do we make the departments one coherent company with the same values and a sense of collectiveness?


We spent a couple of meetings debating that very issue, and of course the good old ideas like company outings, shared social events, and waxing each other’s backs all came up, but for me the most interesting one, aside from waxing each other’s backs, was to try and use something like Twitter while also allowing for the usual private chit-chat which goes on inside a company. Some jokes are best kept inside the company… like, you know, that waxing one. You get my point, right?

Yammer

Yammer has set up shop with a Twitter clone which is ideally suited for running private Twitter-like networks. Basically all you need are e-mail addresses on the same domain and you’re golden. Sign-up is stupid easy: enter your e-mail and you’re good to go.

From there it’s smooth sailing with a nice Adobe AIR client (surprise, Adobe AIR is not just for Twitter clients!) which gives you the ease of posting new messages that you’re familiar with from that other network which I won’t mention from here on in.

At Vertica, Yammer is quickly turning into a question and answer service, which translates directly into increased productivity because A) you don’t have to know who knows what, you just ask the question and someone will chime in, and B) you don’t interrupt people who don’t want to be interrupted, because if they’re not looking they won’t answer.

Now whether or not it will actually serve its original purpose remains to be seen. The new office in Zealand is still under a month old and quite small, so I guess we’ll just have to wait and see. What’s interesting though is that people at the first office were very quick to adopt Yammer.

posted on Sunday, 25 January 2009 07:00:00 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Monday, 19 January 2009

Twitter.com

In a previous post I wrote about Twitter and what it means to the Danish developer community. The real value of Twitter, however, does not come from visiting the site from time to time. You have to participate actively to keep the conversation going, and that’s where the Twitter clients come into the picture.

I’ve been through a bunch of them and ultimately decided which one I liked best. I’ll try to spare you from doing the same all over.

Digsby

Digsby gets an honorable mention because it was my first Twitter client; this program is how I got started with Twitter and in no small way the reason why I still use it.

Digsby is labelled a social network client which gives you access not only to Twitter, in fact that’s the least of it, but also to Messenger, LinkedIn, Facebook, Yahoo Chat, Google Talk, the list goes on and on but you get the point. Digsby speaks with most social networks out there.

That was my reason for trying it out, as I really didn’t feel that I needed a dedicated program to try out Twitter. I spent quite some time with Digsby and felt for a long time that it was the way to go. In fact the reason I dropped it was not so much Twitter related as it was Messenger related. It simply didn’t work as advertised; sending files, for one, was spotty.

As a Twitter client it performed admirably and for me at least it was a low cost to pay for trying out Twitter as I used it primarily as a Messenger client with the added benefit of being able to send out my tweets as well.

Twitterrific

Twitterrific is an interesting one, as it didn’t start out on the desktop for me. It actually started out on my iPhone, and when I got a Mac late last year it was the natural choice for the desktop as well, as the iPhone experience with this thing is flawless as far as I’m concerned.

Now the application is pretty much the same on the Mac. Interestingly, it turns out that the functionality doesn’t quite cut it on the desktop. Due to the nature of tweets, messages need to be as compact as they can be.


Imagine that you’re posting a link, which can easily be 50-60 characters; at that point you really want to be able to shorten the link easily and post the short version instead. Unfortunately Twitterrific doesn’t support this, which is fine on the iPhone where cut and paste is not to be found, so you tend not to post links. On the desktop though, links are thrown left and right, so not having the feature is a real pain point – at least for me.

Thus Twitterrific was evicted from the Mac desktop but remains on the iPhone as one of the first apps I ever installed on that thing.

twhirl

Before I delve into twhirl, a word on Adobe AIR. Not so much because I find the platform interesting, but because I find it interesting that a lot of the platform's ecosystem is made up of … wait for it … Twitter clients. It’s interesting to me that a service like Twitter can drive a platform like AIR and not the other way around.

twhirl is pretty much like Twitterrific, only the name is quite a bit easier to spell and it supports the link shortening feature I mentioned above. It being an Adobe AIR app also means that it’s cross-platform for those of us running multiple platforms out there.

twhirl is like the girlfriend you can’t quite figure out if you want to spend your life with or leave for someone else. I left but ultimately came back so I guess it’s forever between us :)

And finally remember to follow me on Twitter once you get your favorite client up and running :)

posted on Monday, 19 January 2009 11:58:14 (Romance Standard Time, UTC+01:00)  #    Comments [3] Trackback
# Sunday, 04 January 2009

Back in May 2008 I wrote a short note about me trying out Twitter. At the time I just wanted to know more about what Twitter actually was, as I heard about it time and again on podcasts, blogs, everywhere really.

Interestingly whenever people talked about Twitter it was due to the service being down but still I felt compelled to take it out for a spin.

Twitter of course is the service which enables you to post little notices about what you’re currently doing which doesn’t sound all that useful until you actually sit down and think about it. In reality it turns out that there are numerous applications for a service like that. The notices are limited to only 140 characters which means that you have to be really short and sweet in the stuff you send to the service.

Fast forward to January 2009 with the experiment done and my conclusion is in: Twitter is indeed a service worth paying attention to. Read on to find out why.

Now what prompted this post is a question I got from Brian Rasmussen when I suggested that he take a look at it. Basically he asked why he should use Twitter, a question I didn’t quite know how to answer with anything but, “it’s cool”. Since that time I’ve been wondering what makes Twitter worth my while, and yours as well, dear reader.


Twitter is a lot of things to a lot of people. The value to me and our little community in particular lies in tying together everybody in a more coherent way than what is possible today. To me at least Twitter is a place where I get to keep in touch with a number of the Danish .NET developers in a far more personal way than what is possible at DotNetForum, ActiveDeveloper, etc. because the service is geared for throwing stuff out there without thinking too much about it.


Why do I call it the back channel of our community? Due to the nature of the messages you stick on Twitter, it quickly becomes just little notices about what’s going on right now. For example, Mads used it to get an idea of which IoC framework to go with; I recently got a Mac and had no clue where to start, so I elicited suggestions for apps to use; Niels uses it for communicating with the Umbraco team from time to time; recently Jesper wanted to know what to include in his upcoming ASP.NET MVC presentation at ONUG in January; and Rasmus had a memory leak which he needed some input on fixing.


Basically what you get is an inside look at the process leading up to a blog post, a presentation, the solution to a given issue, or whatever; something you don’t really get from reading the final product, and oftentimes much more interesting.

I would encourage you to go create an account with Twitter and follow a bunch of people from the Danish .NET community. Morten from DotNetForum was even kind enough to create a wiki with the Twitter names of a bunch of the Danish .NET guys, which you can use as a starting point. You can follow me under my Twitter name publicvoid_dk.

Of course there are a number of people whom I’d like to see get Twitter accounts, like Brian Rasmussen, Søren Skovsbøll, Mark Seemann, Kasper Bo Larsen, and Martin Bakkegaard Olesen.

posted on Sunday, 04 January 2009 13:41:19 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Monday, 29 December 2008

I had grand goals for 2008 when we started out the new year last time around, only stuff happened and my activity level on this blog has not been up to the goals I initially set out to reach. In spite of that I'm very happy with my accomplishments for 2008. They just happen to have occurred in a slightly different way than I originally thought.

The Blog

Surprisingly the most visited and commented post on the blog during 2008 wasn't even written during 2008. It caters to the more mainstream internet users, was written in 2006, and is about an annoyance I had with Windows and the My Music folder which disappeared from time to time.

But we are looking back at 2008 here, so it's fitting to mention the posts I'm most proud of which were actually written during 2008. First up is my Developing with Commerce Server 2007 series, in which I dove into the development experience of Commerce Server. Also on the topic of Commerce Server 2007, I wrote a post on a generic mapping piece I did for a project early in the year, which turns CS objects into nice POCO objects for nice testability.

Work

Of course there was real work to be done, and 2008 brought some really interesting challenges, with me participating in one of the largest e-commerce projects I've ever had my hands on. Huge customer, international team of devs, traveling across the Atlantic to do some of the work. All in all a great learning experience, and as a result I'm now able to provide even better service to our customers. Oh, and it was kinda fun too :)

I got to attend a couple of conferences as well. First, Daniel from Microsoft was nice enough to invite me to JAOO, a conference I enjoy a great deal, and later in the year I had a unique chance to fly out to Los Angeles to participate in PDC 2008. I have to say that if you ever get a chance to participate in a conference like PDC you really should jump at it. It's a spectacular show to be sure. I did a couple of podcast episodes about it too; in Danish, mind you.

Finally I'm happy to report that we managed to add a number of very talented people both to my own team at Vertica and to the integration team as well. I'm proud to have such great colleagues and to be able to say that every day I learn something new as a result.

Aarhus .NET User Group

Now, as I said at the start of this post, I haven't spent as much time on the blog as I would have liked, and there's a really good reason for that: Aarhus .NET User Group, which has sucked up a significant part of my time.

During 2008 the core group and I organized thirteen meetings; indeed we didn't miss a beat the entire year and even managed to do a bonus meeting in December with my good colleague Daniel about unit testing. Additionally, we pulled off a code camp at the beginning of the year, the ANUG one-year birthday dinner, and a Christmas dinner. Not too shabby if I do say so myself.

Support for the user group during 2008 was tremendous, and I couldn't be happier about where we're at after just one and a half years of operation.

More importantly we've shown other .NET developers in the Danish community that a user group in Denmark is viable and as a result new groups have sprung up during 2008. As I write this groups are up and running in Odense (ONUG), Aalborg (AANUG), and Copenhagen (CNUG).

ANUGCast (www.anug.dk/podcast)

Ever since we started the user group we've had requests for putting the meeting content online somehow, be it video, audio, or something else entirely. What we did from the start was write meeting summaries, which weren't really the ideal way to bring the content online. It's adequate and we'll continue to do so, but it's been clear from the start that it was far from sufficient.

Late in 2008 it struck me that the podcast format might be the ideal way of addressing the requests. With that in mind I set out to create a podcast based on the topics of the meetings, and ANUGCast was born with the initial goal of bringing out an episode once a month. This quickly escalated to one per week, and so far it's gone really well. In fact episode thirteen was posted today, and I've got a bunch of episodes already in the can just waiting to be released.

The podcast is my little baby and I guess most of the time which would otherwise have been spent on the blog got diverted there. I enjoy hosting the podcast a great deal, so much so in fact that I'd do it full time if I could :)

Since starting the podcast I've gotten it registered with more than 50 aggregation sites, we're on iTunes, and we've had more than 5000 downloads since the pilot episode in September 2008, a number I'm particularly proud of. We've seen a steady climb in downloads since the pilot episode, and the past couple of months saw more than a thousand downloads each.

I guess I should do a couple of posts on how ANUGCast is made and some of the tricks I picked up wearing the hats of producer, sound engineer, basically every damn hat needed to make it happen :)

2009

The coming year will bring a similar activity level on the blog as 2008. It is my every intention to keep up my work with the user group and the podcast, and even step it up a bit. 2009 will bring more real marketing of the user group to reach a new audience, which I'll write more about after we hold the first meeting of 2009. There's something to look forward to for sure. 2009 will also bring our first IT pro related meeting, which will cover Hyper-V. It's intended as a pilot to kinda test the waters for something like that.

Oh and I went and got myself a Mac so I guess I'm sort of a Mac switcher as of December 22nd... 2009 is going to be interesting for sure.

posted on Monday, 29 December 2008 22:41:45 (Romance Standard Time, UTC+01:00)  #    Comments [1] Trackback