# Monday, 17 December 2007

Ever since beginning my work with Commerce Server it has been apparent that we needed some way to link the disparate subsystems with each other in a uniform way. Sure, there are lots of links between catalog, orders, and even profiles out of the box, but the problem with them is that they're all done in different ways.

My colleague Brian found an excellent solution to this problem by introducing a concept he calls Extension Profiles, which is basically a profile you tag onto other data objects in Commerce Server. With this in place you can use the extension profile in a number of ways, like mapping objects to each other or extending otherwise non-extensible CS objects like ShippingMethods and Payments.

I've been bugging Brian to write about them for a while and during the weekend it seems that he finally got around to it.

Check out How to extend Commerce Server Payment Methods and Shipment Methods

posted on Monday, 17 December 2007 08:29:05 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Friday, 14 December 2007

A while back a friend of mine posted a comment here asking me to describe what it's like developing with Commerce Server 2007. Initially I wanted to reply to him in the comments, but thinking more on it I really want to provide a different and real perspective on what Commerce Server is like to work with as a product, a perspective in which I want to dig a bit deeper than the usual how-tos and tutorials you see on the various Commerce Server blogs; mine included.

Check out part 1 Secure By Default where I discuss the security aspects of Commerce Server, part 2 Three-way Data Access in which I write about the various ways of getting data into your applications, part 3 Testability which not surprisingly is all about how CS lends itself to unit testing, part 4 Magic Strings Galore where I take on low level aspects of the APIs, and part 5 Pipelines where COM makes a guest appearance in our mini series.

Deployment

When the time comes to deploy an application we've got a number of options for custom apps created purely on top of the .NET Framework: installers, xcopy deployment, automated build processes, etc. When it comes to Commerce Server the deployment procedure is a bit more involved, but aspects of the deployment are supported by some interesting tools, as we'll see in a minute.

Commerce Server comes with a handy tool for packaging your application in a single file called a PUP file. This works great the first time around and greatly simplifies first-time deployment. Unfortunately it only works for the initial deployment; subsequent deployments are more involved because manual deployment is required, unless you're fortunate enough to be working with the Enterprise Edition.

Let's first deal with manual deployment because that's frankly the most fun to write about :) Commerce Server is split across a number of different subsystems, each running on top of a separate database. Each subsystem has different deployment requirements and steps that you'll need to follow. I won't bore you with the actual steps here; just know that deploying a Commerce Server application requires a lot of steps, and I do mean a lot, involving a number of disparate tools.

The security requirements of Authorization Manager further complicate deployment because the business data is protected by an additional layer of security, different from what's found at the system level. All of this has to be created either manually or via a command line tool provided with the product.

One alleviating factor in the long list of manual deployment steps is the fact that Commerce Server is split across a number of different databases, one for each subsystem. You can isolate changes to each subsystem, easing deployment by letting you deal with a subset of your Commerce Server application at a time.

It becomes really interesting when we start talking about the Enterprise Edition, which brings another tool to the table that automates deployment of most of your application: Commerce Server Staging (CSS). The staging tool allows you to move business data and files from one server to the next. This means that you can enforce a pretty much hands-off policy on your production server and only have your business users work in a staging environment for content creation and testing purposes.

The only caveat to staging is that it doesn't support profiles; you can however use a cruder approach to deploying your profiles automatically. Notice that I wrote business data and files: you can basically have CSS move binary files to production and have it execute a command pre and post transfer, which could be a bat file or a custom executable. This essentially makes CSS a very unique and useful tool, and not just in conjunction with Commerce Server.

Just think about what could be done with a tool like CSS and a regular old ASP.NET app. You could basically have CSS move your compiled ASP.NET app and SQL scripts to production, have it move the assemblies into place, and finally execute your SQL scripts. Voila, automatic deployment; the only downside is that you need the Enterprise Edition of Commerce Server around :) I really think that Cactus should look into making this a standalone tool.
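To make the idea concrete, the post-transfer command could be a small console executable that runs whatever SQL scripts were staged alongside the binaries. This is only a sketch under assumed paths and connection details; none of it is Commerce Server specific:

```csharp
using System;
using System.Data.SqlClient;
using System.IO;

// Hypothetical post-transfer hook: CSS would be configured to invoke this
// executable after the files have been moved to the production server.
class PostDeploy
{
    static void Main(string[] args)
    {
        // Connection string and script folder are illustrative only.
        string connectionString = args.Length > 0
            ? args[0]
            : "Server=.;Database=MySite;Integrated Security=SSPI";
        string scriptFolder = args.Length > 1 ? args[1] : @"C:\Deploy\SqlScripts";

        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            foreach (string script in Directory.GetFiles(scriptFolder, "*.sql"))
            {
                // Note: scripts containing GO batch separators would need
                // to be split into batches first; omitted for brevity.
                using (SqlCommand command =
                    new SqlCommand(File.ReadAllText(script), connection))
                {
                    command.ExecuteNonQuery();
                    Console.WriteLine("Executed {0}", Path.GetFileName(script));
                }
            }
        }
    }
}
```

With something like this wired up as the post-transfer command, moving the assemblies and bringing the database up to date happens in one CSS operation.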

posted on Friday, 14 December 2007 07:00:06 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Wednesday, 12 December 2007

Did you know that each field you post using the HTML form element is limited to 100 KB? I sure didn't, and it can cause trouble if you still have to deal with Commerce Server 2000 and 2002 because the Bizdesk relies heavily on XML data islands on the client to create a rich client-side experience.

Specifically this can cause trouble when you add a large number of variants to a product all at once. You can work around it by creating a limited number of variants at a time.

PRB: Error "Request Object, ASP 0107 (0x80004005)" When You Post a Form

posted on Wednesday, 12 December 2007 08:45:44 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

A while back a friend of mine posted a comment here asking me to describe what it's like developing with Commerce Server 2007. Initially I wanted to reply to him in the comments, but thinking more on it I really want to provide a different and real perspective on what Commerce Server is like to work with as a product, a perspective in which I want to dig a bit deeper than the usual how-tos and tutorials you see on the various Commerce Server blogs; mine included.

Check out part 1 Secure By Default where I discuss the security aspects of Commerce Server, part 2 Three-way Data Access in which I write about the various ways of getting data into your applications, part 3 Testability which not surprisingly is all about how CS lends itself to unit testing, and part 4 Magic Strings Galore where I take on low level aspects of the APIs.

Pipelines

When I first encountered pipelines in Commerce Server 2000 they were a nice feature to have available, and they made sense because they could handle a much bigger load due to the fact that they're essentially COM objects executed in an ordered fashion. All of this made a great deal of sense back in the day when we were dealing with plain old VBScript and ASP.

When Commerce Server 2002 came out it still made sense that they stuck around because the .NET support in Commerce Server 2002 came in the form of managed wrappers for the COM objects which came with the product.

Would you be surprised to learn that COM-based pipelines stuck around for Commerce Server 2007 too? Well, they did, which means that you have to know a little something about COM to get things going, especially when it comes to debugging problems with a server setup. Weird HRESULTs are something you still have to contend with, although the situation is vastly improved from the older versions.

Fortunately you can go ahead and build your pipeline components in .NET and expose them to COM, so all is not lost. It does however mean that you need to make sure that your pipeline components behave as expected at runtime in order to avoid cycling objects in and out of the GAC. The keyword is developer productivity: you don't want to spend too much time mucking about with getting everything good to go for every little change you make to your pipeline components.
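As a rough sketch, a managed pipeline component boils down to a COM-visible class whose Execute method receives the order form and pipeline context as COM dictionaries. The shape below follows the general pattern from the SDK samples, but treat the exact interface, signatures, and dictionary keys as assumptions to verify against your own Commerce Server assemblies:

```csharp
using System;
using System.Runtime.InteropServices;

// Sketch of a managed pipeline component exposed to COM; registration
// (regasm, GAC) and the interop interface declarations are omitted here.
[ComVisible(true)]
[Guid("00000000-1111-2222-3333-444444444444")] // placeholder GUID
public class MinimumOrderTotal
{
    // pdispOrder and pdispContext arrive as COM dictionaries at runtime.
    public int Execute(object pdispOrder, object pdispContext, int lFlags)
    {
        // A real component would cast these to the interop dictionary types
        // and read or write well-known keys on the order form.
        return 1; // the samples use a numeric result code to signal success
    }
}
```

Keeping the logic behind Execute thin and delegating to ordinary, testable .NET classes minimizes how often the GAC-registered assembly itself has to change.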

Traditionally pipelines are the area where people ask the most questions because it's a pretty opaque topic to dive into at first. Every time I create a new pipeline component it pains me to know that we have the nice System.Transactions namespace available to us in .NET, yet pipelines remain rooted in the COM world.

Luckily Cactus feels our pain and has a replacement on their roadmap for the next version of Commerce Server, but until then you'd better get those interop skills up to speed. Alternatively you can choose to forego the pipeline system altogether and do any custom business logic outside pipeline components, but that's not always an option.

Developing with Microsoft Commerce Server 2007 Part 6: Deployment

posted on Wednesday, 12 December 2007 07:00:23 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Monday, 10 December 2007

A while back a friend of mine posted a comment here asking me to describe what it's like developing with Commerce Server 2007. Initially I wanted to reply to him in the comments, but thinking more on it I really want to provide a different and real perspective on what Commerce Server is like to work with as a product, a perspective in which I want to dig a bit deeper than the usual how-tos and tutorials you see on the various Commerce Server blogs; mine included.

Check out part 1 Secure By Default where I discuss the security aspects of Commerce Server, part 2 Three-way Data Access in which I write about the various ways of getting data into your applications, and part 3 Testability which not surprisingly is all about how CS lends itself to unit testing.

Magic Strings Galore

One of the first things I noticed when I started working with Commerce Server a number of years back was the extensive use of "magic strings" to access various custom properties of data objects. In fact the use of magic strings was pervasive across the entire product, from the catalog system through the profile and orders systems to the marketing system. With the current version (2007) that has changed for the better with regards to the order system, but it still holds true for the other subsystems.

Magic strings of course is a term which describes the use of a string to identify a particular element in structures like arrays, dictionaries, and the like; an example would be myArray["myMagicString"].

The reason behind the use of magic strings is that each of the individual subsystems of Commerce Server offers a great deal of flexibility, including the ability to define your own schemas for almost everything. This means that the actual structure of, say, a profile is known only at runtime. Employing magic strings is a nice and easy way of getting the job done, but it does leave a lot to be desired when developing with the product.

One of my pet peeves is explorability; by explorability I mean the ability to use IntelliSense to get an idea of what you've got at your disposal on any given API. Commerce Server allows for this for the most part, but when it comes to catalog objects and profiles sadly this is not the case, which leaves you referencing other tools to look up the magic strings for accessing a particular property on an object. Not exactly a productivity booster. Of course the remedy is fairly straightforward: you simply build an abstraction layer on top of the weakly typed objects and work off of that instead. This does produce more code that we need to maintain ourselves, and with that an increased cost to our customers. Alternatively you could go low budget and maintain a library of magic strings in a central class, or even combine that with the abstraction layer.
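A minimal sketch of such an abstraction layer, using a plain dictionary to stand in for the weakly typed Commerce Server object; the property key is a made-up example, not an actual schema path:

```csharp
using System.Collections.Generic;

// Typed wrapper over a weakly typed property bag. In real code the inner
// object would be a Commerce Server profile rather than a dictionary.
public class UserProfileWrapper
{
    // The magic string lives in exactly one place.
    private const string FirstNameKey = "GeneralInfo.first_name"; // hypothetical key

    private readonly IDictionary<string, object> inner;

    public UserProfileWrapper(IDictionary<string, object> inner)
    {
        this.inner = inner;
    }

    // Callers get IntelliSense and compile-time checking instead of
    // scattering the key string across the code base.
    public string FirstName
    {
        get { return (string)inner[FirstNameKey]; }
        set { inner[FirstNameKey] = value; }
    }
}
```

Misspell FirstName and the compiler complains immediately; misspell the raw key string in twenty places and you find out at runtime, one page at a time.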

Interestingly the orders system allows for strongly typed properties. As I wrote in part 2, Three-way Data Access, the order system is an ORM in its own right which provides strongly typed properties on the objects; all that is required is a little mapping via XML. With that in mind it seems strange that we have to create our own abstraction layers on top of the other subsystems.

The use of magic strings means that we end up with runtime errors instead of compile time errors, because we can't rely on the compiler to check the validity of magic strings. Refactoring rapidly becomes more difficult at this point, leading me to a second pet peeve of mine: catching errors as early as possible. I really like to be the first one to know about any errors in the code I write, especially when they're as trivial as misspelling a magic string.

Now one could argue that ORM isn't practical on top of the catalog system due to its very extensible nature. It's intended to be extended by business users, and a tool is even provided to enable them to do so, making mapping of the catalog objects less feasible. The problem however is that with most Commerce Server solutions you'd need a developer to actually use the newly added product definitions and fields for something, e.g. displaying them in the UI, leveraging them in the pipeline system, etc. This leaves us with the original problem: developer productivity.

From my point of view there's a single solution available to us: code generation. To work with the subsystems in a consistent and effective manner, code generation could be employed, but it requires some heavy lifting as you'd have to create specialized generators for the individual subsystems. The good news is that metadata is indeed available to allow for the process.

One might argue that a second option is available, namely to rewrite the data access schemes of the profile and catalog systems to more closely match that of the order system in order to leverage ORM principles. That however remains closed to anyone but the developers at Cactus Commerce.

posted on Monday, 10 December 2007 15:33:23 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Friday, 07 December 2007

Friday means weird stuff getting passed around and this Friday is no different. Sit back, relax, and enjoy some nice shadow play:

posted on Friday, 07 December 2007 12:50:12 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Monday, 03 December 2007

A while back a friend of mine posted a comment here asking me to describe what it's like developing with Commerce Server 2007. Initially I wanted to reply to him in the comments, but thinking more on it I really wanted to provide a different and real perspective on what Commerce Server is like to work with as a product, a perspective in which I want to dig a bit deeper than the usual how-tos and tutorials you see on the various Commerce Server blogs; mine included.

Check out part 1 Secure By Default and part 2 Three-way Data Access.

Testability

A topic which is very near and dear to my heart is testing. For a number of years I've been thinking about and trying to implement unit testing and test-driven development at Vertica, but only recently have we been making progress on that front. Why then has it taken so long to get going with test-driven development and unit tests in a shop where a lot of dedicated and enthusiastic people work?

The reason is simple: We do most of our work on Microsoft server products like Commerce Server, Office SharePoint Server, and BizTalk, and those products are not very well designed with regards to enabling unit testing scenarios. Now this post is entitled Developing with Microsoft Commerce Server, so naturally that's what I'm going to focus on.

Test-driven development and unit testing require a lot from your architecture to make them work. In order to do effective unit tests in a system we need to be able to divide the system up into just that: units. I'm not going to delve into the aspects of unit testing techniques here, as that topic alone would require numerous posts to cover, but suffice it to say that interfaces play an important role in enabling the scenario, and class inheritance does as well.

Now how does this relate to Commerce Server? It turns out that you're going to have to look long and hard to actually find interfaces to enable mocking. Everything in Commerce Server is a concrete class; in some instances even classes which have a natural relationship don't share a common interface or even a base class, which hampers our attempts at creating structured tests.

When developing with Commerce Server the Adapter pattern will become your friend: simply create an adapter for a particular piece of functionality in Commerce Server and work off of that instead of the real API and you're set. Unfortunately this means extra work for you as the developer if you want to create proper tests for your solution.
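A minimal sketch of the idea; the interface and class names below are my own inventions, and the production implementation would delegate to the actual catalog API, which is omitted here:

```csharp
using System;

// Our own interface: this is what business logic and unit tests depend on.
public interface ICatalogAdapter
{
    decimal GetListPrice(string productId);
}

// Production adapter; would wrap the concrete Commerce Server catalog
// classes, which don't offer mockable interfaces themselves.
public class CommerceCatalogAdapter : ICatalogAdapter
{
    public decimal GetListPrice(string productId)
    {
        // Real code would query the catalog system here.
        throw new NotImplementedException("Requires a Commerce Server installation.");
    }
}

// Hand-rolled test double; no Commerce Server required to run the tests.
public class FakeCatalogAdapter : ICatalogAdapter
{
    public decimal GetListPrice(string productId)
    {
        return 99.95m;
    }
}
```

Any class that takes an ICatalogAdapter in its constructor can now be exercised in a plain unit test with the fake, while the real adapter is wired up only in the running site.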

A lot of the functionality of Commerce Server is accessed via a class called CommerceContext, which works in much the same way as the HttpContext we know and love. Unfortunately it's heavily reliant on HTTP modules to initialize it, thus making it tough to test. As a Commerce Server developer it's natural to go to the CommerceContext and access the various subsystems from there. Doing this however tightly couples you to HTTP, which is a bad thing if you need your logic in a different context, like say for unit testing. The remedy is simple, but you need to be aware of this fact, otherwise it will bite you in the ass at some point. I did a post on working with the profile system outside of the ASP.NET context back when I was working with Commerce Server 2002; it just so happens that this particular technique is still viable today.

The bottom line with respect to structured testing in Commerce Server is that we're faced with many of the challenges we see when working with legacy code which has not been designed specifically for unit testing. By no means does this mean that it's impossible to do unit testing with Commerce Server; it does however mean that you need to be aware of the fact and design your architecture accordingly, and that there will be areas which you won't be able to touch with your automated tests.

Developing with Microsoft Commerce Server 2007 Part 4: Magic Strings Galore

posted on Monday, 03 December 2007 07:40:26 (Romance Standard Time, UTC+01:00)  #    Comments [2] Trackback
# Friday, 30 November 2007

We know that the attendees at each meeting span vastly different levels of experience in .NET, and our past topics such as Pragmatic SOA, BizTalk and ESB, and LINQ and ORM have been pretty hardcore, so this time around we wanted to do something for the beginner.

I discussed the idea of doing a talk for beginners with Brian a couple of months back and fortunately he was keen to do so. What came from our talk wildly exceeded what I had had in mind at the time :)

Initially I was a bit worried whether we'd misjudged the interest in a meeting centered on the beginner because we had a pretty poor number of sign-ups, but that took off in the final week before the meeting and we ended up with around twenty people attending this one.

But before I get into the actual topic of our meeting let me first start out by summarizing what the core group has been doing since last time.

2007-11-28-Aarhus-NET-User-Group-1 2007-11-28-Aarhus-NET-User-Group-2 2007-11-28-Aarhus-NET-User-Group-3

Core Group One Person Down

Our core group, and thus the people responsible for running ANUG, consists of Brian Holmgård Kristensen, Lars Buch-Jepsen, Peter Loft Jensen, and of course myself. You might notice that the list is shorter than it used to be. Morten Vadstrup sadly had to leave the core group because of time constraints. We're debating whether to bring on a fifth person again, as we're pretty much covered with the people we have now.

Speakers and Meeting Places

As always we're looking for new speakers and places to hold the meetings. We've been very fortunate thus far to have very good support from the local companies, but we're definitely starting to put more work into finding new places to hold the meetings.

As always I encourage you to ping me if the company you work in would be interested in holding a meeting at their offices. Getting a visit from ANUG is a great opportunity to market your company to just the right people if you're looking for talent. As a rule the company holding the meeting gets 30 min. to talk about their culture, development cycle, etc. in the interest of both giving something back to the company gracious enough to provide for us and also to give the attendees a better idea of the kind of companies that exist in the area.

Should you be interested in giving a talk on a particular topic please don't hesitate to contact me. We're always looking for speakers. We're seeing tremendous interest in topics like Silverlight, PowerShell, dynamic languages in .NET, even ASP.NET; basically most .NET related topics have come up during our discussions of what to bring up next, so don't hold back :) Contact me today.

Schedule Meetings on Facebook

We looked into the various offerings out there to see what would support our needs when it comes to announcing the meetings in a structured fashion, as it quickly became clear that we need more than just the blog. With the blog it's simply not possible to schedule too far into the future, as the post itself would get lost in the noise from the other posts.

We took a look at Facebook which is a site that I hadn't tried myself, luckily Jacob Saaby Nielsen took the initiative to form a Facebook group for ANUG. It turns out that Facebook provides just what we need to schedule our meetings so we've decided to go ahead and use Facebook for scheduling meetings and for signing up.

I realize that you need a Facebook account to get going and that that presents additional effort to get people to attend but in the long run I firmly believe that this way of doing things offers the best options for us as we need to leverage all the help we can get.

If you wish to attend a meeting I encourage you to sign up on Facebook as the companies giving us shelter usually are kind enough to also provide food and drink so be courteous to them and let them know that you're coming so they can order the right amount for us.

To make it easy to get to the group we've created an alias which is easy to remember. Just use www.anug.dk/Facebook

Become a Member of ANUG: Join LinkedIn

LinkedIn is now the official way of becoming a member of ANUG, we decided to stick with LinkedIn because of the professional aspects of the site. We really want our members to benefit from participating in the meetings and one way of doing just that is to expand the network of the attendees which LinkedIn is perfect for. Also we'll use the member list on the LinkedIn site to send out newsletters to keep you abreast of new meetings, updates on the group, and so forth.

As with the Facebook group we've created an easy to remember alias to get to the invitation to the ANUG group on LinkedIn. Just use www.anug.dk/LinkedIn

Upcoming Meetings

Our meeting schedule is booked until March which makes me very happy as my job is much more relaxed that way :) Coming this December we've got the Christmas dinner although it's too late to sign up now. Sorry about that.

January will bring a talk about C# 3.0 and VB 9 to be given by Henrik Lykke Nielsen, MS Regional Director for Denmark. The meeting will be held at the Vertica offices.

February brings us a talk on Team System and CMMI from Systematic, the meeting is to be held there as well. Details are still not completely in place for this one but I'm looking forward to it nonetheless.

March will be very interesting as it will bring us our long-awaited Code Camp for .NET Beginners as a follow-up to this meeting. We haven't decided on a date yet but expect it to be sometime in the middle of March.

It's a two for one month in March so we'll give you a talk on Workflow Foundation by Henrik Kristensen from Scanvaegt International A/S as well.

More information will follow on Facebook.

New Concept for Meetings: Open Space

We've been playing around with the idea of expanding the social aspects of the user group for a while now, and the way we intend to do so is to employ the Open Space format, where the attendees themselves set the agenda and everybody participates in the discussion. What we see at the meetings is that people don't get nearly enough time to interact with each other; as a consequence we'll do entire meetings which are about that and only that. I have high hopes for the concept, and the attendees at the meeting did too, so that'll definitely be something to look forward to.


We got a suggestion at the meeting to snap more pictures at future meetings. We'll definitely try and do something about that. Everybody is welcome to bring a camera and fire away though :)

Professional .NET for Beginners, Brian Holmgård Kristensen

The main event of the evening was my colleague Brian, who gave a very nice talk on .NET for beginners. The premise of the talk was to create a blog web application and in the process give the attendees a look into some of the tools and techniques that go into creating such a thing.

Brian had a limited number of slides and instead chose to let the code speak for him. It's always an interesting proposition to do a lot of code on screen as a lot can go wrong when you choose to do so. We did get a first hand example of this as Visual Studio refused to play nice after only a couple of minutes of presentation. Ultimately it turned out that using beta versions of released software for a presentation is a bit too overconfident :) A break and a restart of Visual Studio made everything right and Brian could continue on with no further incidents to report.

One thing that is ultra important when doing a lot of code on screen is to run at a low resolution with nice big fonts, which unfortunately Brian didn't do. It's a technicality, but it's a shame that such a small thing detracts from an otherwise great presentation, and it was a great presentation. I was especially impressed by the level of interactivity; it was right up there with the LINQ presentation that Søren Skovsbøll did the last time around. People were very eager to know more about various ASP.NET 3.5 technologies such as master pages and even the details of coding in the .NET Framework itself.

Brian paid great attention to detail and didn't leave anything hanging; every time he introduced a new concept he thoroughly explained it so as not to confuse anybody. At times the attention to detail became almost too much, but keeping in mind that the presentation wasn't intended for people like me, I don't believe that it was an actual problem for anyone else but me :)

All in all Brian did a very good job in engaging the attendees and he covered nicely when Visual Studio started acting up. He'd made an entire story line to follow in which he started out with a static HTML page and gradually made it data driven; in truth a very compelling way of getting his points across.

The data driven web application is a demo that the Microsoft people are very fond of. You've probably seen it done numerous times, but as always we try to do things differently for the ANUG meetings. What sets Brian's presentation apart from the others is the fact that he actually did a properly n-tiered architecture; he even provided facilities to demonstrate the importance of encapsulation. No drag and drop of data sources to be found anywhere; everything was done by hand. This part of the presentation is my very favorite, because it not only sets what we do with ANUG apart from the Microsoft events, which is our stated goal, it also shows that doing a data driven demo app in the proper way is absolutely possible in the very short amount of time usually available to these kinds of presentations.

The guys present seemed very keen on getting their hands on the source code so we're providing it for download along with his slides.

Tour de Scanvaegt

Due to the many questions Brian's presentation did drag on a bit but luckily Henrik Kristensen was unfazed by this fact and gave some interesting insight into Scanvaegt which hosted the meeting. Scanvaegt is actually an old company which harkens back to 1932. I had no idea that they'd actually been around so long.

He went on to tell us about their way of developing the software supporting the huge and complicated machinery that they create. They've had a couple of bouts with agile methodologies and even tried to do a big bang implementation of XP as a process, which was unsuccessful. They're farther down the agile path now, having adopted a slower pace and worked on mindset instead of tools and techniques, something I wholeheartedly agree with as that's the way we're making it happen inside Vertica as well. We pick and choose the pieces which make sense to us and implement them one at a time; it looks like we're pretty much on the same page there.

The highlight to me was when Henrik showed a video of one of the sorting machines, which sorted chicken fillets into plastic trays; the fascinating part was that it placed the fillets facing the same way every single time it placed a new one in a tray.

He also told us about a cool sounding machine that's able to determine how large a particular piece of meat is by way of 3D photography. With a 3D model of the piece of meat built up, it'll then proceed to calculate how thick it must make each steak to get the desired number of steaks. Incredible. The videos really hammer home the coolness of this, so I'm trying to get Henrik to put a couple on YouTube if possible.

Open Forum

We never got around to doing open forum this time around.

posted on Friday, 30 November 2007 20:30:53 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Thursday, 29 November 2007

On my way home this evening I had a most shocking experience: I encountered a ghost driver driving the wrong way on the freeway on my side of the road. I was alerted to the situation by drivers in the opposite lane who were madly signaling with their headlights. Of course this got my attention, but I had no idea what they were trying to tell those of us driving in the opposite lane. Moments later it became horribly apparent when I saw the car coming at us, driving the wrong way on the freeway. It blew past us in a matter of seconds; at first I didn't even understand how lucky I'd just been or even what had happened.

Only this very moment, as I'm sitting at home watching the news, do I understand how lucky I actually was. It turns out that that very same ghost driver crashed into a car less than one minute after passing me, killing the ghost driver and seriously injuring the other driver, who fortunately is out of danger.

I can't help but think that that seriously injured person could've been me if I'd decided to try and overtake another car at the wrong moment, or whether I could've done something to prevent the terrible accident which happened only moments after the car passed me by...

Read more about the incident (Danish)

posted on Thursday, 29 November 2007 19:40:12 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback