# Wednesday, 12 December 2007

Did you know that each field you post using the HTML form element is limited to 100KB? I sure didn't, and it can cause trouble if you still have to deal with Commerce Server 2000 and 2002 because the Business Desk relies heavily on XML data islands on the client to create a rich client-side experience.

Specifically this can cause trouble when you add a large number of variants to a product all at once. You can work around it by creating a limited number of variants at a time.

PRB: Error "Request Object, ASP 0107 (0x80004005)" When You Post a Form

posted on Wednesday, 12 December 2007 08:45:44 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

A while back a friend of mine posted a comment here asking me to describe what it's like developing with Commerce Server 2007. Initially I wanted to reply to him in the comments, but thinking more on it I really wanted to provide a different and real perspective on how Commerce Server is to work with as a product, a perspective in which I want to dig a little deeper than the usual how-tos and tutorials you see on the various Commerce Server blogs; mine included.

Check out part 1 Secure By Default where I discuss the security aspects of Commerce Server, part 2 Three-way Data Access in which I write about the various ways of getting data into your applications, part 3 Testability which not surprisingly is all about how CS lends itself to unit testing, and part 4 Magic Strings Galore where I take on low level aspects of the APIs.


When I first encountered pipelines in Commerce Server 2000 they were a nice feature to have available, and they made sense because they could handle a much bigger load, being essentially COM objects executed in an ordered fashion. All this made a great deal of sense back in the day when we were dealing with plain old VBScript and ASP.

When Commerce Server 2002 came out it still made sense that they stuck around because the .NET support in Commerce Server 2002 came in the form of managed wrappers for the COM objects which came with the product.

Would you be surprised to learn that COM-based pipelines stuck around for Commerce Server 2007 too? Well they did, which means that you have to know a little something about COM to get going, especially when it comes to debugging problems with a server setup. Weird HRESULTs are something you still have to contend with, although the situation is vastly improved from the older versions.

Fortunately you can go ahead and build your pipeline components in .NET and expose them to COM, so all is not lost. It does however mean that you need to make sure that your pipeline components behave as expected at runtime in order to avoid cycling objects in and out of the GAC. The keyword is developer productivity: you don't want to spend too much time mucking about with getting everything good to go for every little change you make to your pipeline components.
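To make the interop plumbing concrete, here's a minimal sketch of what exposing a .NET class to COM looks like. The class name, ProgId, GUID, and the `Execute` signature are all invented for illustration; a real component would implement the actual Commerce Server pipeline interfaces rather than this placeholder:

```csharp
using System;
using System.Runtime.InteropServices;

// Hypothetical example of the attributes needed to make a managed class
// visible to a COM host such as the pipeline runtime. Names are illustrative.
[ComVisible(true)]
[Guid("5B2A3C1E-1111-4222-8333-444455556666")]
[ProgId("MyShop.DiscountComponent")]
[ClassInterface(ClassInterfaceType.None)]
public class DiscountComponent
{
    // A real pipeline component would implement the Commerce Server
    // pipeline interfaces and read/write values on the order dictionary
    // handed to it by the pipeline host.
    public int Execute(object orderForm)
    {
        // Hypothetical status code; real components follow the
        // pipeline's own error-level convention.
        return 0;
    }
}
```

During development, registering the assembly with `regasm /codebase` instead of installing it to the GAC can save a lot of the round-trips mentioned above.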

Traditionally pipelines are the area where people ask the most questions because it's a pretty opaque topic to dive into at first. Every time I create a new pipeline component it pains me to know that we have the nice System.Transactions namespace available to us in .NET, yet out of reach of the COM-based pipeline system.
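For reference, this is the kind of transactional plumbing I'd love to have at my disposal inside a pipeline stage. It's plain System.Transactions wishful thinking, not anything Commerce Server offers today:

```csharp
using System;
using System.Transactions;

// What a managed pipeline stage could look like if it were free to use
// System.Transactions: the scope only commits if Complete() is called,
// otherwise everything enlisted in it rolls back.
public static class TransactionalStep
{
    public static bool Run(Action work)
    {
        try
        {
            using (var scope = new TransactionScope())
            {
                work();
                scope.Complete(); // mark the scope for commit
            }
            return true;
        }
        catch (TransactionAbortedException)
        {
            return false; // the ambient transaction rolled back
        }
    }
}
```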

Luckily Cactus feels our pain and has a replacement on their roadmap for the next version of Commerce Server, but until then you'd better get those interop skills up to speed. Alternatively you can choose to forego the pipeline system altogether and do any custom business logic outside pipeline components, but that's not always an option.

Developing with Microsoft Commerce Server 2007 Part 6: Deployment

posted on Wednesday, 12 December 2007 07:00:23 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Monday, 10 December 2007

A while back a friend of mine posted a comment here asking me to describe what it's like developing with Commerce Server 2007. Initially I wanted to reply to him in the comments, but thinking more on it I really wanted to provide a different and real perspective on how Commerce Server is to work with as a product, a perspective in which I want to dig a little deeper than the usual how-tos and tutorials you see on the various Commerce Server blogs; mine included.

Check out part 1 Secure By Default where I discuss the security aspects of Commerce Server, part 2 Three-way Data Access in which I write about the various ways of getting data into your applications, and part 3 Testability which not surprisingly is all about how CS lends itself to unit testing.

Magic Strings Galore

One of the first things I noticed when I started working with Commerce Server a number of years back was the extensive use of "magic strings" to access various custom properties of data objects. In fact the use of magic strings was pervasive across the entire product, from the catalog system through the profile and orders systems to the marketing system. With the current version (2007) that changed for the better with regard to the orders system, but it still holds true for the other subsystems.

Magic strings, of course, is the term for using a string to identify a particular element in structures like arrays, dictionaries, and the like; an example would be myArray["myMagicString"].

The reason behind the use of magic strings is that each of the individual subsystems of Commerce Server offers a great deal of flexibility, including the ability to define your own schemas for almost everything. This means that the actual structure of, say, a profile is known only at runtime. Employing magic strings is a nice and easy way of getting the job done, but it does leave a lot to be desired when developing with the product.

One of my pet peeves is explorability; by explorability I mean the ability to use IntelliSense to get an idea of what you've got at your disposal in any given API. Commerce Server allows for this for the most part, but when it comes to catalog objects and profiles sadly this is not the case, which leaves you referencing other tools to look up the magic strings for accessing a particular property on an object. Not exactly a productivity booster. Of course the remedy is fairly straightforward: you simply build an abstraction layer on top of the weakly typed objects and work off of that instead. This does produce more code that we need to maintain ourselves, and with that an increased cost to our customers. Alternatively you could go low budget and maintain a library of magic strings in a central class, or even combine that with the abstraction layer.
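The low-budget remedy can be sketched in a few lines. Here a plain dictionary stands in for the weakly typed Commerce Server object, and the property names are invented for the example:

```csharp
using System.Collections.Generic;

// Central library of magic strings: one place to fix a typo.
public static class ProfileKeys
{
    public const string FirstName = "GeneralInfo.first_name";
    public const string Email = "GeneralInfo.email_address";
}

// Thin typed facade on top of the weakly typed object; misspelled
// property names now fail at compile time instead of at runtime.
public class UserProfile
{
    private readonly IDictionary<string, object> _inner;

    public UserProfile(IDictionary<string, object> weaklyTypedProfile)
    {
        _inner = weaklyTypedProfile;
    }

    public string FirstName
    {
        get { return (string)_inner[ProfileKeys.FirstName]; }
        set { _inner[ProfileKeys.FirstName] = value; }
    }

    public string Email
    {
        get { return (string)_inner[ProfileKeys.Email]; }
        set { _inner[ProfileKeys.Email] = value; }
    }
}
```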

Interestingly, the orders system allows for strongly typed properties. As I wrote in part 2, Three-way Data Access, the orders system is an ORM in its own right which provides strongly typed properties on the objects; all that is required is a little mapping via XML. With that in mind it seems strange that we have to create our own abstraction layers on top of the other subsystems.

The use of magic strings means that we end up with runtime errors instead of compile-time errors because we can't rely on the compiler to check the validity of the strings. Refactoring rapidly becomes more difficult at this point, leading me to a second pet peeve of mine: catching errors as early as possible. I really like to be the first one to know about any errors in the code I write, especially when they're as trivial as a misspelled magic string.

Now one could argue that ORM isn't practical on top of the catalog system due to its very extensible nature. It's intended to be extended by business users, and a tool is even provided to enable them to do so, making mapping of the catalog objects less feasible. The problem however is that with most Commerce Server solutions you'd need a developer to actually use the newly added product definitions and fields for something, e.g. displaying them in the UI, leveraging them in the pipeline system, etc. This leaves us with the original problem: developer productivity.

From my point of view there's a single solution available to us: code generation. To work with the subsystems in a consistent and effective manner, code generation could be employed, but it requires some heavy lifting as you'd have to create specialized generators for the individual subsystems. The good news is that metadata is indeed available to allow for the process.
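To illustrate the idea, here's a toy generator. A real one would read the property names from the subsystem's metadata APIs and run as a build step; here they arrive as a hard-coded list and the emitted properties are untyped:

```csharp
using System.Collections.Generic;
using System.Text;

// Minimal sketch of the code-generation idea: turn a list of schema
// property names into the source for a typed wrapper class.
public static class WrapperGenerator
{
    public static string Generate(string className, IEnumerable<string> propertyNames)
    {
        var sb = new StringBuilder();
        sb.AppendLine("public partial class " + className);
        sb.AppendLine("{");
        foreach (var name in propertyNames)
        {
            // A full generator would also emit the right CLR type
            // based on the metadata, not just object.
            sb.AppendLine("    public object " + name + " { get; set; }");
        }
        sb.AppendLine("}");
        return sb.ToString();
    }
}
```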

One might argue that a second option is available, namely to rewrite the data access schemes of the profile and catalog systems to more closely match that of the orders system in order to leverage ORM principles. That however remains closed to anyone but the developers at Cactus Commerce.

posted on Monday, 10 December 2007 15:33:23 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Friday, 07 December 2007

Friday means weird stuff getting passed around and this Friday is no different. Sit back, relax, and enjoy some nice shadow play:

posted on Friday, 07 December 2007 12:50:12 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Monday, 03 December 2007

A while back a friend of mine posted a comment here asking me to describe what it's like developing with Commerce Server 2007. Initially I wanted to reply to him in the comments, but thinking more on it I really wanted to provide a different and real perspective on how Commerce Server is to work with as a product, a perspective in which I want to dig a little deeper than the usual how-tos and tutorials you see on the various Commerce Server blogs; mine included.

Check out part 1 Secure By Default and part 2 Three-way Data Access.


A topic which is very near and dear to my heart is testing. For a number of years I've been thinking about and trying to implement unit testing and test-driven development at Vertica, but only recently have we been making progress on that front. Why then has it taken so long to get going with test-driven development and unit tests in a shop where a lot of dedicated and enthusiastic people work?

The reason is simple: we do most of our work on Microsoft server products like Commerce Server, Office SharePoint Server, and BizTalk, and those products are not very well designed with regard to enabling unit testing scenarios. Now, this post is entitled Developing with Microsoft Commerce Server, so naturally that's what I'm going to focus on.

Test-driven development and unit testing require a lot from your architecture to make them work. In order to do effective unit tests on a system we need to be able to divide the system up into just that: units. I'm not going to delve into the aspects of unit testing techniques here, as that topic alone would require numerous posts to cover, but suffice it to say that interfaces play an important role in enabling the scenario, as does class inheritance.

Now how does this relate to Commerce Server? It turns out that you're going to have to look long and hard to actually find interfaces to enable mocking. Everything in Commerce Server is a concrete class; in some instances even classes which have a natural relationship don't share a common interface or even a base class, which hampers our attempts at creating structured tests.

When developing with Commerce Server the Adapter pattern will become your friend: simply create an adapter for a particular piece of functionality in Commerce Server, work off of that instead of the real API, and you're set. Unfortunately this means extra work for you as the developer if you want to create proper tests for your solution.
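The pattern looks something like the following sketch, with all names invented for the example. Production code depends only on the interface, the adapter wraps the concrete (unmockable) Commerce Server API, and tests substitute a fake:

```csharp
// The seam: an interface our own code depends on.
public interface ICatalogGateway
{
    decimal GetListPrice(string productId);
}

// Production adapter; its body would delegate to the real, concrete
// Commerce Server catalog classes, so it stays out of the unit tests.
public class CatalogGateway : ICatalogGateway
{
    public decimal GetListPrice(string productId)
    {
        // ...call into the actual Commerce Server API here...
        throw new System.NotImplementedException();
    }
}

// Hand-rolled fake for unit tests; a mocking framework works too.
public class FakeCatalogGateway : ICatalogGateway
{
    public decimal GetListPrice(string productId) { return 99.95m; }
}

// Business logic under test, coupled only to the interface.
public class PricingService
{
    private readonly ICatalogGateway _catalog;
    public PricingService(ICatalogGateway catalog) { _catalog = catalog; }

    public decimal PriceWithVat(string productId)
    {
        return _catalog.GetListPrice(productId) * 1.25m; // 25% Danish VAT
    }
}
```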

A lot of the functionality of Commerce Server is accessed via a class called CommerceContext, which works in much the same way as the HttpContext we know and love. Unfortunately it's heavily reliant on HTTP modules to initialize it, thus making it tough to test. As a Commerce Server developer it's natural to go to the CommerceContext and access the various subsystems from there. Doing this, however, tightly couples you to HTTP, which is a bad thing if you need your logic in a different context, like say unit testing. The remedy is simple, but you need to be aware of this fact, otherwise it will bite you in the ass at some point. I did a post on working with the profile system outside of the ASP.NET context back when I was working with Commerce Server 2002; it just so happens that this particular technique is still viable today.

The bottom line with respect to structured testing in Commerce Server is that we're faced with many of the challenges we see when working with legacy code which was not designed specifically for unit testing. By no means does this mean that it's impossible to do unit testing with Commerce Server; it does however mean that you need to be aware of the fact and design your architecture accordingly, and that there will be areas which you won't be able to touch with your automated tests.

Developing with Microsoft Commerce Server 2007 Part 4: Magic Strings Galore

posted on Monday, 03 December 2007 07:40:26 (Romance Standard Time, UTC+01:00)  #    Comments [2] Trackback
# Friday, 30 November 2007

We know that the attendees at each meeting span vastly different levels of experience in .NET, and our past topics such as Pragmatic SOA, BizTalk and ESB, and LINQ and ORM have been pretty hardcore, so this time around we wanted to do something for the beginner.

I discussed the idea of doing a talk for beginners with Brian a couple of months back and fortunately he was keen to do so. What came from our talk wildly exceeded what I had had in mind at the time :)

Initially I was a bit worried whether we'd misjudged the interest in a meeting centered on the beginner because we had a pretty poor number of sign-ups, but that took off in the final week before the meeting and we ended up with around twenty people attending this one.

But before I get into the actual topic of our meeting let me first start out by summarizing what the core group has been doing since last time.


Core Group One Person Down

Our core group, and thus the people responsible for running ANUG, consists of Brian Holmgård Kristensen, Lars Buch-Jepsen, Peter Loft Jensen, and of course myself. You might notice that the list is shorter than it used to be: Morten Vadstrup sadly had to leave the core group because of time constraints. We're debating whether to bring on a fifth person again, as we're pretty much covered with the people we have now.

Speakers and Meeting Places

As always we're looking for new speakers and places to hold the meetings. We've been very fortunate thus far to have very good support from the local companies, but we're definitely starting to put more work into finding new places to hold the meetings.

As always I encourage you to ping me if the company you work for would be interested in hosting a meeting at their offices. Getting a visit from ANUG is a great opportunity to market your company to just the right people if you're looking for talent. As a rule the company hosting the meeting gets 30 min. to talk about their culture, development cycle, etc., in the interest of both giving something back to the company gracious enough to provide for us and giving the attendees a better idea of the kinds of companies that exist in the area.

Should you be interested in giving a talk on a particular topic, please don't hesitate to contact me. We're always looking for speakers. We're seeing tremendous interest in topics like Silverlight, PowerShell, dynamic languages in .NET, even ASP.NET; basically most .NET-related topics have come up during our discussions of what to bring up next, so don't hold back :) Contact me today.

Schedule Meetings on Facebook

We looked into the various offerings out there to see what would support our needs when it comes to announcing the meetings in a structured fashion, as it quickly became clear that we need more than just the blog. With the blog it's simply not possible to schedule too far into the future, as the post itself would get lost in the noise from the other posts.

We took a look at Facebook, which is a site that I hadn't tried myself; luckily Jacob Saaby Nielsen took the initiative to form a Facebook group for ANUG. It turns out that Facebook provides just what we need to schedule our meetings, so we've decided to go ahead and use Facebook for scheduling meetings and for signing up.

I realize that you need a Facebook account to get going and that that presents an additional hurdle to getting people to attend, but in the long run I firmly believe that this way of doing things offers the best options for us, as we need to leverage all the help we can get.

If you wish to attend a meeting I encourage you to sign up on Facebook as the companies giving us shelter usually are kind enough to also provide food and drink so be courteous to them and let them know that you're coming so they can order the right amount for us.

To make it easy to get to the group we've created an alias which is easy to remember. Just use www.anug.dk/Facebook

Become a Member of ANUG: Join LinkedIn

LinkedIn is now the official way of becoming a member of ANUG. We decided to stick with LinkedIn because of the professional aspects of the site. We really want our members to benefit from participating in the meetings, and one way of doing just that is to expand the network of the attendees, which LinkedIn is perfect for. Also, we'll use the member list on the LinkedIn site to send out newsletters to keep you abreast of new meetings, updates on the group, and so forth.

As with the Facebook group we've created an easy-to-remember alias to get to the invitation to the ANUG group on LinkedIn. Just use www.anug.dk/LinkedIn

Upcoming Meetings

Our meeting schedule is booked until March which makes me very happy as my job is much more relaxed that way :) Coming this December we've got the Christmas dinner although it's too late to sign up now. Sorry about that.

January will bring a talk about C# 3.0 and VB 9 to be given by Henrik Lykke Nielsen, MS Regional Director for Denmark. The meeting will be held at the Vertica offices.

February brings us a talk on Team System and CMMI from Systematic, the meeting is to be held there as well. Details are still not completely in place for this one but I'm looking forward to it nonetheless.

March will be very interesting as it brings us our long-awaited Code Camp for .NET Beginners as a follow-up to this meeting. We haven't decided on a date yet, but expect it to be sometime in the middle of March.

It's a two for one month in March so we'll give you a talk on Workflow Foundation by Henrik Kristensen from Scanvaegt International A/S as well.

More information will follow on Facebook.

New Concept for Meetings: Open Space

We've been playing around with the idea of expanding the social aspects of the user group for a while now, and the way we intend to do so is to employ the Open Space format, where the attendees themselves set the agenda and everybody participates in the discussion. What we see at the meetings is that people don't get nearly enough time to interact with each other, so as a consequence we'll do entire meetings which are about that and only that. I have high hopes for the concept, and the attendees at the meeting did too, so that'll definitely be something to look forward to.


We got a suggestion at the meeting to snap more pictures at future meetings. We'll definitely try and do something about that. Everybody is welcome to bring a camera and fire away though :)

Professional .NET for Beginners, Brian Holmgård Kristensen

The main event of the evening was my colleague Brian, who gave a very nice talk on .NET for beginners. The premise for the talk was to create a blog web application and in the process give the attendees a look into some of the tools and techniques that go into creating such a thing.

Brian had a limited number of slides and instead chose to let the code speak for him. It's always an interesting proposition to do a lot of code on screen, as a lot can go wrong when you choose to do so. We got a first-hand example of this as Visual Studio refused to play nice after only a couple of minutes of presenting. Ultimately it turned out that using beta software for a presentation is a bit too overconfident :) A break and a restart of Visual Studio made everything right, and Brian continued with no further incidents to report.

One thing that is ultra important when doing a lot of code on screen is to run at a low resolution with nice big fonts, which unfortunately Brian didn't do. It's a technicality, but it's a shame that such a small thing detracts from an otherwise great presentation, and it was a great presentation. I was especially impressed by the level of interactivity; it was right up there with the LINQ presentation that Søren Skovsbøll did the last time around. People were very eager to know more about various ASP.NET 3.5 technologies such as master pages and even the details of coding in the .NET Framework itself.

Brian paid great attention to detail and didn't leave anything hanging; every time he introduced a new concept he thoroughly explained it so as not to confuse anybody. At times the attention to detail became almost too much, but keeping in mind that the presentation wasn't intended for people like me, I don't believe that was an actual problem for anyone but me :)

All in all Brian did a very good job of engaging the attendees, and he covered nicely when Visual Studio started acting up. He'd made an entire story line to follow in which he started out with a static HTML page and gradually made it data driven; in truth a very compelling way of getting his points across.

The data-driven web application is a demo that the Microsoft people are very fond of. You've probably seen it done numerous times, but as always we try to do things differently for the ANUG meetings. What sets Brian's presentation apart from the others is the fact that he actually did a properly n-tiered architecture; he even provided facilities to demonstrate the importance of encapsulation. No drag-and-drop of data sources to be found anywhere; everything was done by hand. This part of the presentation was my very favorite because it not only sets what we do with ANUG apart from the Microsoft events, which is our stated goal, it also shows that doing a data-driven demo app the proper way is absolutely feasible in the very short amount of time usually available to these kinds of presentations.

The guys present seemed very keen on getting their hands on the source code so we're providing it for download along with his slides.

Tour de Scanvaegt

Due to the many questions, Brian's presentation did drag on a bit, but luckily Henrik Kristensen was unfazed by this fact and gave some interesting insight into Scanvaegt, which hosted the meeting. Scanvaegt is actually an old company which harkens back to 1932. I had no idea that they'd been around so long.

He went on to tell us about their way of developing the software supporting the huge and complicated machinery that they create. They've had a couple of bouts with agile methodologies and even tried to do a big-bang implementation of XP as a process, which was unsuccessful. They're farther down the agile path now, adopting a slower pace and working with mindset instead of tools and techniques, something I wholeheartedly agree with as that's the way we're making it happen inside Vertica as well. We pick and choose the pieces which make sense to us and implement them one at a time, so it looks like we're pretty much on the same page.

The highlight for me was when Henrik showed a video of one of the sorting machines, which sorted chicken fillets into plastic trays; the fascinating part of this was that it placed the fillets facing the same way every single time it put a new one in a tray.

He also told us about a cool-sounding machine that's able to decide how large a particular piece of meat is by way of 3D photography. With a 3D model of the piece of meat built up, it'll then proceed to calculate how thick it must make each steak to get the desired number of steaks. Incredible. The videos really hammer home the coolness of this, so I'm trying to get Henrik to put a couple on YouTube if possible.

Open Forum

We never got around to doing open forum this time around.

posted on Friday, 30 November 2007 20:30:53 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Thursday, 29 November 2007

On my way home this evening I had a most shocking experience: I encountered a ghost driver going the wrong way on the freeway on my side of the road. I was alerted to the situation by drivers in the opposite lane who were signaling madly with their headlights. Of course this got my attention, but I had no idea what they were trying to tell those of us driving in the opposite direction. Moments later it became horribly apparent when I saw the car coming at us, driving the wrong way on the freeway. It blew past us in a matter of seconds; at first I didn't even understand how lucky I'd just been or even what had happened.

Only this very moment, as I'm sitting at home watching the news, do I understand how lucky I actually was. It turns out that that very same ghost driver crashed into a car less than one minute after passing me, killing the ghost driver and seriously injuring the other driver, who fortunately is out of danger.

I can't help but think that that seriously injured person could've been me if I'd decided to try and overtake another car at the wrong moment, or wonder whether I could've done something to prevent the terrible accident which happened only moments after the car passed me by...

Read more about the incident (Danish)

posted on Thursday, 29 November 2007 19:40:12 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Wednesday, 28 November 2007

I was recently asked which podcasts I listen to, and I've actually been planning to write about that for a while. Commuting to and from work takes about 1½ hours of my time each day; my way of making an opportunity out of this time is to listen to a lot of podcasts. I learn something from the work-related podcasts, and I get to relax with the just-for-fun ones when my brain is fried after a day at the office :)

To mix it up I dabble with audiobooks from Audible.com, but I've not yet gotten that routine down so I'm going to hold off with my commentary on that for now. Just know that if you're looking for richer content you can go there and get real books to listen to.

Work Related

Dot Net Rocks! w. Carl Franklin and Richard Campbell - Everything .NET. The first and definitely the best. You get a nice high level insight into many different products from Microsoft.

Hanselminutes w. Scott Hanselman - Lots and lots of information about .NET and its extended family. If you listen to only a few podcasts be sure to make this one of them.

ARCast.TV w. Ron Jacobs - Architecture on the Microsoft platform, higher abstraction level than .NET rocks.

Polymorphic Podcast w. Craig Shoemaker - the content is there but Craig doesn't keep a regular schedule which I dislike so I ditched it again. Still well worth a listen if the missing regularity isn't an issue for you.

ASP.NET Podcast w. Wally and Paul - geared towards the Microsoft .NET Framework and ASP.NET.

RunAs Radio w. Richard Campbell and Greg Hughes - a podcast for IT professionals working with Microsoft products. I listen to this to get better at understanding the issues which the IT-pro side of the development cycle has to deal with. Also I'm a hardware nut, which is something they tend to talk about as well. Finally, they offer up some nice tools and techniques for debugging problems, which is something we as developers have to deal with due to the increasing complexity and number of moving parts in the systems we build.

For Fun

Windows Weekly w. Paul Thurrott - a news show that deals with stuff related to the Windows user; nice commentary on various news stories, and they even talk about Apple from time to time.

Diggnation w. Kevin Rose and Alex Albrecht - Kevin and Alex chat about the new stories on Digg.com, but mostly they are just fun to listen to. Caution! This one will make you laugh out loud. I usually start my Monday morning drive to work with this one just to ensure that I get started on a good note. It's a video podcast, but the audio version works very well.

Games for Windows - Talk about the newest games for Windows. Very funny. I actually don't get to play a lot of games due to lack of time so when I do I damn well make sure that they are the best. This is the place I go for that information.

1UP Yours - talk about games for consoles. I own a Nintendo DS and have a heck of a time finding good games for it. This is a great way to find them.

posted on Wednesday, 28 November 2007 13:11:33 (Romance Standard Time, UTC+01:00)  #    Comments [1] Trackback
# Friday, 23 November 2007

I had a problem today where a file exported from a system was showing up correctly in my text editor, but whenever I tried to re-import it elsewhere my special characters were messed up; ÆØÅ showed up as garbled characters.

Of course the reason for this is that the export applies an old code page to the file which isn't recognized by the receiving system, so I simply had to change the code page of the file.

My instinct was to go with UltraEdit, but I didn't have a license around, so I figured Visual Studio could probably get the job done for me, and it did; it just isn't too obvious how, not that UltraEdit is intuitive in this area either :)

  • Open the file in Visual Studio.
  • From the File menu, select Advanced Save Options.
  • Select the code page and save the file.
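If you'd rather script the conversion than click through the dialog, the same fix takes a few lines of .NET. The source code page depends on what the exporting system used; 1252 in the comment below is just an example:

```csharp
using System.IO;
using System.Text;

// Re-encode a text file from a legacy code page (e.g. 1252) to UTF-8.
// On newer .NET runtimes, Windows code pages may require registering
// the CodePagesEncodingProvider first.
public static class CodePageFixer
{
    public static void ConvertToUtf8(string path, int sourceCodePage)
    {
        // Decode the bytes with the original code page...
        string text = File.ReadAllText(path, Encoding.GetEncoding(sourceCodePage));
        // ...and write them back as UTF-8.
        File.WriteAllText(path, text, Encoding.UTF8);
    }
}
```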


posted on Friday, 23 November 2007 11:32:44 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

A while back a friend of mine posted a comment here asking me to describe what it's like developing with Commerce Server 2007. Initially I wanted to reply to him in the comments, but thinking more on it I really wanted to provide a different and real perspective on how Commerce Server is to work with as a product, a perspective in which I want to dig a little deeper than the usual how-tos and tutorials you see on the various Commerce Server blogs; mine included.

Check out part 1 entitled Secure By Default.

Three-Way Data Access

Commerce Server provides a lot of functionality out of the box; in fact it provides not one, not two, but three disparate data access schemes. Talk about getting your money's worth :) What this translates to is two things: blazingly fast performance and a high barrier to entry for developers.

Basically each subsystem in Commerce Server comes with its own data access system, which means that you as a developer will need to learn and master them all to be able to leverage the features provided by the product. There is one general trait across the various data access models: you won't have to do much SQL, if any.

The Profile system is the general-purpose data access system that we use whenever we have a custom piece of data we need to store: user profiles, addresses, cities, etc. You can think of the Profile system as half of an ORM; it will help you get your data into your application in a nice fashion without you having to write SQL, which is nice, but it doesn't provide the last part of an ORM, the O, the part that actually turns the data into objects. You could call the Profile system an RM, because what it provides is a general-purpose name-value collection that you can build on top of. When everything is set up the Profile system is quite nice to work with; sure, the API is a bit primitive compared to a full-fledged ORM, but it certainly gets the job done, and a couple of nice abstractions on top will get you a nice domain model to work with.

The Order and Catalog systems are a different story in that they are very specific in their purpose; they only deal with their own domain objects: orders and catalog items respectively.

I would point to the Catalog system as the very best part of Commerce Server; it feels complete and very well done, both feature-wise and data access-wise. There are no nasty XML files, no mapping multiple levels of abstraction before getting the job done; it all works, and works beautifully. The Schema Manager tool is to the Commerce Server Catalog system what Enterprise Manager was to SQL Server 2000: your one-stop place to set up the schema for your database. Everything is handled in a graphical manner which is translated into SQL statements, procs, and physical tables. The Catalog system is set up for performance, which is clear if you go ahead and poke around the database: tables are denormalized, physical objects are created for new catalogs, and so forth. Interestingly, you can actually learn a lot about performance-oriented database design by going through the various Commerce Server databases.

Finally we've got the Order system, home of the third and final data access scheme and also the newest one in our happy little family. CS 2002 introduced XML mapping files to map orders, lines, and payments to actual objects. In that respect we are actually dealing with a full ORM here, only it handles orders and various associated objects but nothing else. You'll have to get your hands dirty with multiple XML files and you'll have to do manual updates to the database to get everything going. You won't however have to do anything special to actually use the subsystem; all of that is nicely squirreled away in the API.

It's interesting to note that across all the data access schemes Commerce Server actually contains some very powerful APIs; if only they were combined and unified to provide the full flexibility to all the subsystems. As a developer you'll have to wrap your head around all the models; on the upside the individual models are not that complicated to work with, so it's not exactly a Herculean effort you need to put into learning the APIs: it's mostly a matter of knowing where to look. There are good reasons for the subsystems not being unified, but I'd really like a single architect at Cactus to sit down and consider what needs to be done to leverage either a preexisting ORM or to unify the functionality found in the disparate DALs.

Developing with Microsoft Commerce Server 2007 Part 3: Testability

posted on Friday, 23 November 2007 07:00:08 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback