# Tuesday, 25 September 2007

Klaus Marquardt started out the session by stating that complexity is inherent in any system we build and as such we need to deal with the issue of managing it. What was interesting about this session is that Klaus laid out a number of rules for managing complexity. How do we know that complexity exists in a system? Usually the way to tell is by way of anecdotal evidence: "I wouldn't touch that, it's way too hard", "we don't ever go there", and so forth. There's a saying that goes, "The art of programming is the art of organizing the complex". So what do we do to organize the complex? What constitutes good rules for managing or even reducing complexity in our solutions?

But first let us try and define complexity. Wikipedia defines complexity as entanglement and intricacy, or in other words spaghetti code referenced back and forth in no clear way. That was the way I understood it anyway :) Another way of looking at it is this: "Complexity is in the eye of the beholder", which basically means that if you possess a different skill set than another person you might see complexity where in fact there is none. This definition comes from cognitive psychology.

The first rule is to acknowledge that our managers don't want to know. If they do want to know they're probably not a very good manager because they're focusing on the wrong thing, i.e. not managing but developing. If you get a manager involved in your job, developing software, you can be sure that you're going to face complexities due to their lacking understanding of the solution.

The second rule is that you cannot get rid of intrinsic complexity in a problem area. Getting rid of intrinsic complexity would mean that you in essence don't handle the problem at hand, thus creating useless software. You basically need to deal with the core business to be successful in your project.

Third rule: Hype is an indication that unresolved complexity exists at a large scale. Just think about SOA :)

The fourth rule states that process and technology can increase the complexity found in your project. Process because it can lead to a lot of bureaucracy which will impede development and might lead to unnecessary compromises in your projects. Technology is the obvious one, and we're all better off if we pay more attention to it.

Thomas McCabe invented the cyclomatic complexity metric in order to describe complexity in a piece of software. Klaus believes that this particular metric, while useful, is at the wrong level for architectural needs as it mainly deals with the control flow of the application rather than more high level concerns. He went on to describe a number of metrics Robert Martin proposed for describing high level complexity. These include relational cohesion (are the right pieces in the right place), afferent coupling (dependencies on the module from the outside), efferent coupling (the module's own dependencies on the outside), instability (how concrete are our classes; more concrete classes mean a higher level of instability. It's important to remember that instability is not a bad thing), and abstractness (are we based on interfaces and abstract classes). What we're looking for in a module is either abstract and stable or concrete and instable. Interestingly this explained to me what NDepend is trying to do. Now that I understand the thoughts at the heart of NDepend I'm thinking that I might have to give NDepend another go :)
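To make the metrics concrete, here's a small sketch of how they combine into Robert Martin's "distance from the main sequence"; this is my own illustration of the well-known formulas, not code from the talk:

using System;

// My illustration of Robert Martin's package metrics, not code from the talk.
static class PackageMetrics
{
    // Ca = afferent coupling (who depends on the module),
    // Ce = efferent coupling (what the module itself depends on).
    public static double Instability(int ca, int ce)
    {
        return (double)ce / (ca + ce); // 0 = maximally stable, 1 = maximally instable
    }

    public static double Abstractness(int abstractTypes, int totalTypes)
    {
        return (double)abstractTypes / totalTypes; // 0 = fully concrete, 1 = fully abstract
    }

    // Distance from the "main sequence"; 0 means the module is either
    // abstract-and-stable or concrete-and-instable, the two good spots.
    public static double Distance(int ca, int ce, int abstractTypes, int totalTypes)
    {
        return Math.Abs(Abstractness(abstractTypes, totalTypes) + Instability(ca, ce) - 1);
    }
}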

With all that talk about metrics, how do they impact complexity? It turns out that metrics can be a bad thing for complexity as they have a tendency of forcing an organization to focus on the metric and nothing else, which means a lot of things will fall between the cracks as it's impossible to define all important metrics in an organization or for a project. In essence metrics define their own reality. By all means measure all you want, but keep the results to yourself to avoid having the metrics start setting the order of business.

Foundlings is a word I'd never heard before, but what Klaus means by it are classes found in a namespace which have little or no relation to the namespace they're in but lots of relations to other namespaces. The classes just kind of ended up in that particular namespace for no apparent reason. Which leads nicely to the next rule: Carelessness creates complexity.

The carelessness rule ties in nicely with what Robert Martin spoke about at the keynote: Leave the code base in better shape than you found it. What is in play here is the broken window theory, where a single unfixed broken window will lead to more and more until a slum is created. So fix that window and avoid a big mess. It's very interesting to see the parallels between all the talks; I'm definitely starting to see a trend emerge at this point.

Team dynamics is another area that creates complexity because you now have to deal with communication in the team, some kind of preparation before a project really starts is usually needed, and probably the most time consuming effort is the time you have to spend on reworking an application to make new features work. This of course is also tied to a lack of automated testing.

We ran out of time at this point in the presentation so he quickly fired off the remaining rules of complexity which are:

9: Where there is taboo there is complexity. If we don't talk about it, it probably indicates a problematic area.

10: Local optimization adds complexity. Think highly optimized assembler code sprinkled around your high level C# code.

11: If the previous project was complex for a team, chances are that the next one will be too. The thinking here is that a team will keep doing what they know how to do, i.e. if they've gone down a significantly complex route previously, chances are that they will do so again.

12: Indecisiveness yields complexity. Factors multiply, so if you have a lot of configurability, branches, versions, interconnected products, and options you'll have to deal with a lot of combinations of those factors, making for a very complex system. Just three versions times four configuration options times two branches already gives you 24 combinations to keep working.

13: You can rid yourself of chosen complexity. Say for example that you've chosen some kind of standard product to solve a problem; you can get rid of the product or outsource development and rid yourself of the problem. Of course this might cause other problems, but at least you got rid of the original one :)

posted on Tuesday, 25 September 2007 22:04:38 (Romance Daylight Time, UTC+02:00)  #    Comments [0] Trackback

Mantra: Not all of a large system will be well designed. People don't behave as if they believe the mantra and will try to refactor the whole system to no avail.

Model: A system of abstractions that describes selected aspects of a domain and can be used to solve problems related to that domain. It is not meant to be as realistic as possible; it is useful for a specific purpose. In essence this is the ubiquitous language which we can use to talk between developers, business persons, and the computer.

There are always multiple models in a system because they need to satisfy different needs, e.g. a customer will not look the same for CRM and shipping. Don't fight this fact by creating unified enterprise models across the business. Instead know the context you're operating in. We need to be clear about the boundaries we're operating within. Create a context map for your application which describes the various contexts and how they're interconnected, basically a number of contexts in which data is treated differently, in essence the number of different models available. Translation between the contexts will be necessary. Map what is there, don't map what you'd like to be there. Push translations to the borders and keep the interior unified and consistent with one model.
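As a tiny illustration of pushing translation to the border, consider the customer example above; this is my own sketch with made-up types, not code from the talk:

using System;

namespace Crm
{
    // The customer as the CRM context sees it.
    public class Customer
    {
        public int Id;
        public string Name;
        public string SalesSegment; // only meaningful inside CRM
    }
}

namespace Shipping
{
    // The "same" person modeled for the shipping context.
    public class Recipient
    {
        public string Name;
        public string DeliveryAddress;
    }

    // The translation lives at the context boundary so the interior of
    // each context stays consistent with its own model.
    public static class CrmTranslator
    {
        public static Recipient ToRecipient(Crm.Customer customer, string deliveryAddress)
        {
            Recipient recipient = new Recipient();
            recipient.Name = customer.Name;
            recipient.DeliveryAddress = deliveryAddress;
            return recipient;
        }
    }
}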

Benefits of context mapping: Align expectations between teams. Create an environment in which models can be effective.

When a system is complex and encompasses a lot of features you need to distill the system into the core domain, which is the interesting part of the system that makes it special. Everything else is generic subdomains, think the subsystems in an ERP system like accounting, purchasing, and so forth. Additionally, supporting subdomains are there exclusively to support other domains of the system. The core domain is the part of the system that helps the business succeed. It is usually the smallest part of the system. The core domain is defined by the business strategy; ask the questions: What makes your system worth writing? Why not buy it off the shelf? Why not outsource it? From the answers you'll be able to identify the core domain of the system and thus your main focus.

It's important to focus on the core domain, and what I see as an enabler for this is the use of standard software like we do at Vertica. Standard software will get you covered with regards to the generic subdomains and the supporting subdomains, enabling you to work on the interesting parts of the system, or the core domain if you will.

Benefits of distillation: Focuses effort and helps you see the forest for the trees.

Join strategy with implementation; escape the top-down/bottom-up dichotomy.

posted on Tuesday, 25 September 2007 18:01:42 (Romance Daylight Time, UTC+02:00)  #    Comments [0] Trackback

The last session of the day is actually the one I'm looking forward to the most: How to brandish LINQ to SQL in a domain-driven design environment? I have some ideas of my own of course but I'm looking forward to stealing some from a couple of industry experts on the subject :) A little about Kim Harding: he's sold his part of Trifork and has founded a new company called Lystfisker.TV and does consultancy on the side. Jimmy Nilsson is of course the author of a couple of DDD books.

Kim and Jimmy have two different ways of working with the product: Kim works more with the graphical designer where Jimmy starts with the code.

Avoid Anemic Domain Models, which are data centered and very light on behavior. With this kind of model the behavior is usually found in transaction scripts instead of in the model itself. Queries often contain domain knowledge - in LINQ they are just words without meaning. The same query might be sprinkled across many different places in the code. They have a VideoStore example they've implemented which includes a number of design patterns, associations, and aggregations.

The main focus of DDD is to work with the core of the application and forget about distractions; don't worry about implementation details. Keep business rules separate from plumbing, e.g. keep data access code apart from business objects. One of the main problems for a DDD process is that we have to deal with a relational database at some point; we need a way to bridge the gap between OO and ER and that is where LINQ to SQL comes into play. We like to start by creating the domain model and have the database take the backseat.

DDD is all about the ubiquitous language that we as developers share with the domain experts; it is developed with the domain experts in order to gain a shared idea of what we're trying to do. Many Entities are not fundamentally defined by their attributes, but rather by a thread of continuity and identity, e.g. the unique id. ValueObjects have no conceptual identity. These objects describe some characteristics of a thing, e.g. Address and Price. We can create a language from ValueObjects. An Aggregate is a cluster of objects that go together for data changes, e.g. Order-OrderLine. Repositories work as a starting point for a traversal to an entity in the middle of its life cycle; they have a collection-like interface.

Specification is used to express a business rule as a first class concept, i.e. a class. We can use it for validation (does a given object satisfy the specification?) and for querying (retrieve all objects that satisfy it).

ISpecification<T>.IsSatisfiedBy(T anObject)

UnitOfWork encapsulates the work we're going to do and describes the context we're working in.

The basic flow is (a sketch follows below):

  • Create a repository from the unit of work: UnitOfWork.Create<Repository>
  • Add your objects to the repository
  • Complete the UnitOfWork
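My guess at what that flow looks like in code, using hypothetical interface shapes along the lines of the ones they listed below; a sketch, not their actual API:

using System;

// Hypothetical shapes for the unit-of-work flow above.
public interface IRepository<T>
{
    void Add(T entity);
}

public interface IUnitOfWork : IDisposable
{
    IRepository<T> CreateRepository<T>();
    void Complete(); // flush all tracked changes to the database in one go
}

// Usage:
// using (IUnitOfWork unitOfWork = ...)
// {
//     IRepository<Order> orders = unitOfWork.CreateRepository<Order>();
//     orders.Add(new Order());
//     unitOfWork.Complete();
// }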

IUnitOfWork, IRepository, ISpecification<T>, LambdaSpecification<T>, IEntity<T> (T is used for key), IEntitySet<T>

Use specification as a query mechanism as well.
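Here's a sketch of how a lambda-based specification can serve both purposes; my own implementation, which may well differ from the one they showed:

using System;
using System.Collections.Generic;
using System.Linq;

public interface ISpecification<T>
{
    bool IsSatisfiedBy(T candidate);
}

// A specification built from a lambda, usable both for validating a single
// object and for filtering a collection (querying).
public class LambdaSpecification<T> : ISpecification<T>
{
    private readonly Func<T, bool> predicate;

    public LambdaSpecification(Func<T, bool> predicate)
    {
        this.predicate = predicate;
    }

    public bool IsSatisfiedBy(T candidate)
    {
        return predicate(candidate);
    }

    public IEnumerable<T> SatisfyingElementsFrom(IEnumerable<T> candidates)
    {
        return candidates.Where(IsSatisfiedBy);
    }
}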

Good and bad about LINQ to SQL when doing DDD: It's very powerful, for example with regard to specifications. It makes development simple. It generates very nice SQL.

On the downside: Lazy loading requires extra attention in the domain model classes (EntitySet). A default constructor is required. The readonly keyword can't be used. Value objects are not supported in LINQ to SQL, which is a problem because they are extremely important in DDD. A workaround exists for value objects: you can add the value object to another partial class which will be merged with the class generated by LINQ.

We are able to work in several different ways: We can start with a tables first approach, a diagram approach, or a classes first approach. However, working classes first doesn't seem to be the main focus of the LINQ product. The diagram approach generates a lot of code. Classes don't follow convention over configuration; everything is opt-in. The counter argument is that with the designer everything is generated for you. You can have LINQ generate the schema automatically, but you can only create and recreate the entire database. DataLoadOptions is used to describe what should be eager loaded, but it can only be defined once per entity and context. LINQ to SQL was not designed for DDD but more for when you have an existing database. It leads to an Anemic Domain Model because you get in the mode of scattering queries all over the place.

In conclusion, LINQ to SQL looks like a good product and a step in the right direction, and it will be an eye opener for many developers doing DDD.

 

The book Streamlined Object Modeling was recommended.

posted on Tuesday, 25 September 2007 18:01:25 (Romance Daylight Time, UTC+02:00)  #    Comments [1] Trackback

Developers != plumbers is a saying Anders Hejlsberg has. We are spending too much time translating between multiple paradigms, be it XML, SQL, or objects. LINQ is the answer to this problem. Two parts of LINQ talk to the database: LINQ to SQL and LINQ to Entities, with LINQ to SQL being the simplest way of going to the database. LINQ to Entities is intended for the enterprise and provides more mapping features. LINQ to SQL works with both SQL Server 2000 and 2005.

So how does LINQ to SQL work? You basically write your query in the application, and when you enumerate the query it is converted to SQL and executed against the database. Conversely, when you update, everything is stored in the DataContext and is sent to the database on a call to the method SubmitChanges.

How do you map a class to a table? You create your class like you normally would and decorate it with attributes. For a class you specify [Table(Name = "TableName")]. For properties you specify [Column] and additionally mark which property is the primary key like so: [Column(IsPrimaryKey = true)]. Optionally you can store the mapping information in an XML file, though for simplicity and fewer errors keep the mapping info in attributes. No special base class is needed for the magic to work.
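Here's roughly what such a mapped class looks like; the table and member names are made up for illustration:

using System.Data.Linq.Mapping;

// A minimal mapped entity; names are made up.
[Table(Name = "Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true)]
    public int CustomerId { get; set; }

    [Column]
    public string Name { get; set; }
}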

With the mapping done we create a strongly typed database context by deriving a class from DataContext in which you specify your tables. Running a query against the DataContext is where the mapping magic happens. You simply do a

var query = from c in db.Customers
            select c;

foreach (Customer c in query) // mapping magic happens here: we iterate Customer objects, not the Table<Customer> type
{
    // do stuff
}

Adding a where clause will cause all the values in that clause to be generated as SQL parameters in order to avoid SQL injection. Generated queries are kept as simple as possible so as to not confuse developers. Performing a select you can represent relationships, which you cannot do in SQL; SQL can only give you a flat resultset, but with LINQ you can get back more natural OO hierarchies as results, which are much easier to work with. The LINQ query language is more expressive than SQL and as such you are able to express every SQL query with LINQ. Only the properties you actually ask for are returned by the generated SQL query, which is nice.
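For example, continuing with the hypothetical Customer class from above and assuming it also has a City column and an Orders relationship:

var query = from c in db.Customers
            where c.City == "Aarhus"                  // becomes a SQL parameter
            select new { c.Name, Orders = c.Orders }; // hierarchical result; only the requested data is fetched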

As an alternative to the attribute mapping you can use a graphical designer which allows you to drag and drop tables from the server explorer. It then creates the classes for you. The tool allows for overriding the generated values for the class name and various other things. It also allows for mapping to a stored procedure. Relationships are inferred from the database schema, so get those foreign key constraints in place today :) Usually relationships are expressed by the object graph, but in some cases the link is missing, so you'll have to crack open the JOIN functionality of LINQ to get your job done.

One piece of LINQ to SQL that I'm not particularly fond of is the fact that interface-based separation is not supported right out of the box, so you need to enforce something like the Repository pattern and not allow direct access to the DataContext if you need that kind of separation.

Updates and deletions are very easy to work with. You basically update and create new instances of your objects like you normally would. When creating a new instance you also need to add it to either the DataContext or to the parent object to which it belongs. The only difference is that you must remember to call SubmitChanges on your DataContext. Concurrency is handled via optimistic concurrency for updates; errors are raised if you encounter changed data. Autogenerated values like identities are automatically populated from the database into the object. If you want to use stored procedures for selects, updates, or inserts you can configure a stored procedure to do so by modifying the behavior of the object in the designer. You simply drag the stored procedure you want to use into the designer and then configure it for the particular object. With the stored procedure dragged into the designer you can also optionally call it directly on the DataContext.
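A sketch of that flow, continuing the hypothetical model from before; note that InsertOnSubmit is the method name in the shipping .NET 3.5 bits, while earlier betas used Add:

Customer customer = db.Customers.Single(c => c.CustomerId == 42);
customer.Name = "New name"; // plain property change, tracked by the DataContext

Customer newCustomer = new Customer { Name = "Fresh customer" };
db.Customers.InsertOnSubmit(newCustomer); // or add it to a parent object instead

db.SubmitChanges(); // persists both the update and the insert in one go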

For stored procedures returning a resultset you will by default get a new type generated for the resultset of the proc. What's very cool is that you can configure the proc to not use the generated type and use a domain type instead. Additionally you can perform a query on top of the resultset returned by a stored procedure, something that is impossible with straight SQL.

An interesting feature I noticed in Visual Studio 2008 during the demo is a document preview when you're switching between open documents in the environment. Should make it much easier to find the particular code file you want.

Funny guy.

posted on Tuesday, 25 September 2007 16:50:48 (Romance Daylight Time, UTC+02:00)  #    Comments [0] Trackback

Castle Project is an open source project created to make developers more productive and their work more fun and enjoyable - and in the end to create more maintainable apps. Castle Project consists of ActiveRecord, MonoRail, MicroKernel & Windsor (IoC containers like Spring.NET), and Castle Contrib (where people contribute).

Common persistence approaches include spaghetti persistence with databinding in the UI; good for demoing, architecturally it's probably the worst way to do it. The second approach is ad hoc persistence, where you write a stored procedure or SQL statement and use those directly, with no fixed way of doing the DAL. The next step is to have the DBA handle persistence, which leads to a lot of overhead in getting things done. After that we move to a hand coded data mapping layer; the problem with the hand coded layer is that it's boring plumbing code. The answer to all these problems is an ORM, where everything happens via a generalized framework.

Castle ActiveRecord is a framework which makes ORM really simple by abstracting away the complex stuff. ActiveRecord is built on top of NHibernate and leverages the work done there in the past. An ActiveRecord object maps to a row in the database. Attributes are used to map objects to tables.

The installer will create template projects in Visual Studio which will trigger a wizard upon creating a new AR project; it will generate a main project and a test project to pull you in the right direction right off the bat. You then create your class like you normally would with properties and fields. By decorating the properties you map the class to the database with the [Property], [PrimaryKey], and [ActiveRecord] attributes. With the mapping done you need to set up the database context which will provide access to the actual database backend by initializing the ActiveRecordStarter class. If the table you mapped to is not available you can have AR create the table for you automatically. This makes refactoring of the code and the database very simple because everything happens in Visual Studio instead of having you go to both the code and the database.
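A minimal mapping might look like this; my own sketch with made-up class and table names:

using Castle.ActiveRecord;
using Castle.ActiveRecord.Framework.Config;

// My sketch of a minimal ActiveRecord class; names are made up.
[ActiveRecord("Posts")]
public class Post : ActiveRecordBase<Post>
{
    private int id;
    private string title;

    [PrimaryKey]
    public int Id
    {
        get { return id; }
        set { id = value; }
    }

    [Property]
    public string Title
    {
        get { return title; }
        set { title = value; }
    }
}

// Somewhere at application startup:
// ActiveRecordStarter.Initialize(new XmlConfigurationSource("ar.config"), typeof(Post));
// ActiveRecordStarter.CreateSchema(); // have AR create the table if it isn't there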

Relationships are where it starts getting complicated, but AR has a way of dealing with this in an easy fashion as well. By adding a single attribute ([BelongsTo] / [HasMany]) to the link property, AR will be able to figure out the relationship between two classes. AR can again generate the schema, just like in the single-table example before.

Querying is handled by the NHibernate query language (HQL), which is an OO query language. Unfortunately the queries are done in strings in the code. For an even better approach you can use a fluent interface to express your queries in an IntelliSense-supported manner. The fluent interface approach is a very nice way of doing querying because it produces clean and readable code. Oren spoke about this very thing yesterday in his DSL talk.
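To illustrate the difference, a sketch assuming the Post class from above; the exact API may differ from what was shown:

using Castle.ActiveRecord.Queries;
using NHibernate.Expression;

// HQL as a plain string: domain knowledge hiding in untyped text.
SimpleQuery<Post> hql = new SimpleQuery<Post>("from Post p where p.Title = ?", "JAOO");
Post[] byHql = hql.Execute();

// Criteria-style querying: composable and checked much earlier.
Post[] byCriteria = Post.FindAll(Expression.Eq("Title", "JAOO"));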

Integration with Lucene.NET is provided out of the box, which means that you can take any entity, index it, and do a full-text search on the object. Inheritance is another area which traditionally creates some interesting problems for us when doing ORM. AR supports this by doing join tables. AR provides validation a la what is found in Enterprise Library. You simply decorate your properties with attributes like you do when doing the mapping. Basically you can go nuts with this and do all your validation in code rather than specifying the rules in the database.

Domain driven design is supported: you don't need to inherit from the special AR classes to get things going; instead you can use the Repository pattern. ActiveRecordMediator facilitates this approach. Lazy loading is supported, which is good news if you have rich object graphs in your application. Multi-level caching ensures that objects are only loaded once, basically the same concept as the DataContext in LINQ to SQL. You can even do distributed caching, which I'm not sure is available in LINQ.

NHibernate Query Analyzer is a tool like the Query Analyzer from SQL Server...

There is no stored procedure support for CRUD. LINQ for NHibernate is under development and will go live within the next five to six months. Active development will proceed after this time as well, so that's definitely good news.

posted on Tuesday, 25 September 2007 12:02:47 (Romance Daylight Time, UTC+02:00)  #    Comments [0] Trackback

The keynote this morning was quite interesting as Erik Meijer detailed to us how he sees the future of software development from a technology perspective. Erik Meijer is doing research which, according to him, may allow us to defer important architectural decisions to a much later point than is possible today. His vision is to allow us to create applications without thought for the deployment scenario; they should just work based on the capabilities of the client. What this means is that in the future we might not have to select a web project over a winforms project. In essence this choice is deferred to either runtime or deployment. Enabling this is technology which will take compiled IL and convert it to whichever platform you wish to deploy to, adapting your code automatically to run on the platform you choose. He showed a couple of examples of this: UI and 3D. The UI part is not what I'd call trivial, but he really blew my mind when he started talking about 3D. 3D is basically not a problem with technologies like Flash and WPF, but what do you do when you need to support DHTML? He showed an example of doing just that; they'd taken a model of a teapot and basically recreated it in pure DHTML using only CSS and DIVs, very impressive. Of course Javascript is nowhere near powerful enough to pull off real time rendering, but as we all know coding for the present is not the way to go; we need to code for the future and that's what they're placing their bets on.

Erik also mentioned parallel computing, a topic Anders Hejlsberg alluded to as being his next project last year at TechEd 2006 in Barcelona. LINQ is basically a DSL for working with sets and as such is highly parallelizable. Interesting to hear that that particular project is moving forward.

I surely hope that Erik will be able to deliver on his project with regards to deferring the platform decision to deployment, but I fear the kind of autogenerated UIs we're going to see. No doubt a UI can be generated, but the design paradigms for the web and the desktop are very different and as such I fear that it just won't work to simply port the UI to the various target platforms.

posted on Tuesday, 25 September 2007 10:37:50 (Romance Daylight Time, UTC+02:00)  #    Comments [0] Trackback
# Monday, 24 September 2007

Domain specific languages (DSLs) are gaining in popularity, thus I wanted to know more about them and how I can go about creating them myself. Oren gave an interesting talk on that very topic. So what do we need a DSL for anyway? Oren's main point here was that a DSL will ease communication with the business because you get concise code that you can show to the business users, who can then verify that the code actually does what they want. The DSL is basically syntactic sugar which makes everything clean and easy to read.

While I certainly get what Oren is saying I've yet to meet a business person who can actually relate to IT in a deep way much less to actual code however simple it may be. In my experience business persons shut down whenever things start to get hairy, seeing actual code is definitely the hairiest of disciplines if you ask me.

Two types of DSLs exist: external and internal. An external DSL is created from scratch; a couple of examples of external DSLs are SQL and Regular Expressions. Of course a DSL created from scratch needs a lot of work to get going, thus we arrive at internal DSLs. An internal DSL piggybacks on an existing language, which makes it easier to get going. Boo is an example of one such language well suited to creating internal DSLs, though usually one would use a dynamic language for these purposes.

So what is Boo? Boo is a CLR language like C# and VB; it compiles to IL. What Boo provides over C# and VB is an open compiler architecture which gives you access to the compiler at compile time, allowing you to change the output. Interestingly Boo is whitespace-significant; I didn't get the concept before Oren's talk, but all it means is that indentation decides which code blocks belong together. For developing Boo code we have the open source IDE SharpDevelop, which actually supports C# as well. I heard about SharpDevelop on DotNetRocks! but didn't think too much of it as a product. Who needs an IDE which does less than Visual Studio? Well, Oren showed us why SharpDevelop is a big deal for DSLs. Basically you take the code for SharpDevelop and create a version for your code-savvy business persons and have them develop their code in an environment tailored to their needs.

All in all an informative presentation given by Oren Eini. I wasn't convinced about the value of DSLs when I left the session, but then I talked with Søren Skovsbøll, who had a nice example of where you can use a DSL very effectively: namely for creating questionnaires. Questionnaires can be notoriously complex to put together with various rules and sub-questions depending on previous answers. It would simply be much easier to express your intent with code in, say, a DSL :)

As a little addendum I can mention that Oren actually started out by talking about fluent interfaces, which are the simplest way of doing DSL-like stuff. What's interesting here is that a fluent interface doesn't require you to learn a new language to implement. You can do them in your language of choice and gain a lot of expressiveness. Unfortunately the plumbing of fluent interfaces requires a lot of work, which is why you usually go for a full-blown DSL instead.
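To tie it to the questionnaire example above, here's a tiny fluent interface sketch in C#; entirely my own hypothetical illustration, not something from the talk:

using System.Collections.Generic;

public class Question
{
    public string Text;
    public string YesFollowUp;
    public string NoFollowUp;
}

public class Questionnaire
{
    private readonly List<Question> questions = new List<Question>();
    private Question last;

    public static Questionnaire Create()
    {
        return new Questionnaire();
    }

    public Questionnaire Ask(string text)
    {
        last = new Question { Text = text };
        questions.Add(last);
        return this; // returning 'this' is what makes the chaining read fluently
    }

    public Questionnaire WhenYes(string followUp)
    {
        last.YesFollowUp = followUp;
        return this;
    }

    public Questionnaire WhenNo(string followUp)
    {
        last.NoFollowUp = followUp;
        return this;
    }
}

// The rule reads almost like the business would state it:
// Questionnaire.Create()
//     .Ask("Did you like the product?")
//         .WhenYes("What did you like the most?")
//         .WhenNo("What should we improve?");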

posted on Monday, 24 September 2007 21:35:33 (Romance Daylight Time, UTC+02:00)  #    Comments [2] Trackback

Robert Martin gave a very entertaining keynote this morning. He basically spoke about defining our profession and what it means to be a software professional. As an industry we've traditionally been all over the map, with each developer doing what he thought best in a given situation, and he tried to address some of the aspects we as an industry need to work on to reach a higher maturity level.

So how do we define professionalism? Robert Martin's idea of professionalism boils down to a shared mindset and a set of techniques. The mindset part of it is basically what the agile manifesto sets forth, with the individual being responsible and generally behaving like "the good citizen". He elaborated on this by giving a couple of examples: If you write code, try and leave the code base just a bit prettier than you found it. This resonated especially well with me because I inherited a large code base a couple of years back and actually adopted that very mindset. The result is that the code base is in much better shape now than when I got it. Furthermore he mentioned that the mindset you want developers to adopt is that of "zero defects". Of course this is a utopian goal, but if you strive for perfection you're sure to create a better product than you would have if you started out with a "shit happens" mindset from the get go :)

Another of his points that resonated with me is that of short iterations and no big redesigns. If you have a big mess in your system don't go down the road of a total redesign; chances are that the specs for the new system will change too rapidly, making the new system obsolete before it's done. Instead try to tackle the mess one step at a time, much like the "leave the code base in better shape"-rule. This fits quite well with the idea of refactoring and as such is something I not only think is a good idea but something I've seen work in the trenches. Again this is not set in stone, as you will experience situations where a complete system redesign is not only appropriate but necessary; for example when you're moving from one platform to another which is fundamentally incompatible with the old one. Even in this case you could argue that technologies like web services can indeed enable an iterative approach to porting the system. But that's another story entirely :)

Test driven development is another topic he dwelled on for quite some time, and with good reason too. I'm coming to believe that TDD is going to be an essential part of the modern developer's skill set, and as such we need to start thinking about architectural support and guidance for it. With TDD as a natural part of development we simply put ourselves in a much better position to support our customers in the future. How many times have you experienced that development ground to a halt due to complexity, not necessarily business complexity, but complexity imposed by the fact that the system has become too much of a mess for you to be able to gauge the risks of adding a new feature, much less comfortably test it for release? With TDD this becomes a non-issue because you know that the system is passing your tests at all times. Thus you can actually achieve the "refactor mercilessly" approach to software development that Martin Fowler et al. advocate.

Finally, apprenticeship is a topic which rang true with me. The premise of this topic was the fact that newly educated people simply don't possess the skill set to participate in a software development process right out of school. Therefore Robert Martin proposed a notion of apprenticeship for newly educated people getting hired into a software development organization. It's an area I've done a little bit of dabbling in, but I've never actually gotten a completely fresh guy right out of college; my work focused more on getting people with experience up to speed in areas where they were lacking.

Robert Martin is a very skilled presenter who managed to keep the crowd entertained. His style of presenting is very active which makes it even more engaging. The keynote certainly bodes well for the rest of the conference.

posted on Monday, 24 September 2007 14:29:50 (Romance Daylight Time, UTC+02:00)  #    Comments [0] Trackback
# Sunday, 23 September 2007

Next week marks the beginning of the JAOO 2007 conference and this year I'm going. JAOO is organized by the Danish company Trifork (formerly East Object Space, or EOS for short). JAOO is traditionally a Java conference with well known speakers such as Ted Neward, Martin Fowler, and more.

So why, might you ask, am I going to attend a Java conference? The answer is threefold: First, I'd like to get a broader perspective on the business of software development, and I'm confident that attending a conference not following the Microsoft sanctioned line will provide me with that.

Secondly, a conference like TechEd is geared towards actual products coming out of Microsoft rather than the ubiquitous ideas behind software development. Don't get me wrong, it's great fun to learn about all the new toys coming out of Microsoft, but when you get right down to it the ideas behind them are what is really interesting, as they tend to stick around much longer. So what this boils down to is really that I hope to gain architectural insights for use on future projects.

Thirdly, the .NET community gets a lot of inspiration from open source frameworks, ideas, and techniques. To me it seems that a lot of innovation is happening within the open source space, a lot of which we'll see at some point in .NET. Tools and frameworks like JUnit, Log4J, and Spring have been around for a long time in the Java space and they all have successful .NET ports; for NUnit so much so that it was included in Team System. With this I'm looking to learn more about the various frameworks and tools out there.

With all that said, JAOO is turning away from the concept of a pure Java conference; this year two tracks are actually dedicated to .NET: The .NET Road and LINQ, both of which I'll be attending. They cover Monday and Tuesday for me. Wednesday will be Professional Developer. I'm looking forward to cementing my ideas on LINQ and I'm positive that Tuesday will help me do so.

Thursday is going to bring the part I'm looking forward to the most: The Test Driven Development tutorial. Basically an entire day of hands-on TDD. Although it'll be with a Java focus I'm sure that I can port the ideas directly to .NET. My only concern is that I'm going to be the only Visual Studio guy in there but I'll deal with that once I get there :)

posted on Sunday, 23 September 2007 20:48:55 (Romance Daylight Time, UTC+02:00)  #    Comments [0] Trackback

dasBlog 2.0 was released a little over a month ago and I've been wanting to update to it for a while; yesterday I finally got around to doing it. If you're in the same situation and need to update an existing dasBlog install, here are the steps for a 1.9 to 2.x update:

  • Copy bin directory
  • Copy root directory files, aspx, ascx, everything found in the root directory of dasBlog
  • Copy web.config
  • Copy DatePicker and ftb (this is just in case)

When you're done updating the code remember to reconfigure your IIS AppPool to run ASP.NET 2.0 as dasBlog 2.x is now a framework 2.0 application. Please note that if you have other framework 1.1 apps running in the same AppPool you'll need a separate AppPool for 2.0 as a single AppPool will, not surprisingly, run one framework version only.

With the updated version a couple of new features are available on this blog: Paging on the main page, i.e. you can now move backwards through posts. Scroll to the bottom of the main page if you want to see how it works.


Paging in the categories: instead of just displaying everything, only five posts are displayed when you're looking at a particular category.


If you're running your own blog on dasBlog a nice little addition is found in the admin module. It's now very easy to switch back and forth between dates when you're viewing your referral stats. Very handy.


posted on Sunday, 23 September 2007 14:18:21 (Romance Daylight Time, UTC+02:00)  #    Comments [0] Trackback
# Thursday, 20 September 2007

Let me take you back to a time before Windows Vista, before a DOS-free Windows OS even. Let me take you back to 2001, with Windows XP just ready for release. Back in 2001 I wrote a review of Windows XP for a Danish online site, which I found the other day while digging through my documents folder.

With Windows Vista released and the landscape basically very different from what it was in 2001, I thought it would be fun for you to see what I had to say about Windows XP back then. The original article was done in Danish so I've translated it for you here. If Danish is not an issue for you, you can check out the original article as well.

It was fun for me to read my reactions from back then. Some of the stuff just wasn't researched all that well on my part, but I got other stuff right. My comments about Messenger are particularly funny to me because I'm an avid user of the program today; I do blame the integrated version of Windows Messenger for this though, as I quite liked my ICQ experience back then. Also, if you look closely at one of the screenshots you'll see evidence of my foray into Java. Like my Linux experience it's not something I speak too loudly about today :)

Finally, a lot of the doom and gloom I wrote about never came to pass. Microsoft really transformed themselves between then and now. Product activation never turned out to be a problem and of course neither did Error Reporting. I do believe that Windows XP is one of the best releases of Windows ever. I like my Vista, but we're looking at some of the same little things today that we saw back then too: little bugs, unexplained Explorer crashes, stuff that's ironed out of Windows XP by now.

Windows XP: From DOS to Windows

Introduction

A long time has passed since the world was given the first version of Windows as we know it today. Many people describe the step from Windows 9x to XP as comparable to the step we took forward with the release of Windows 95 which replaced DOS. I have to admit that I don't see the release of Windows XP as quite that big a deal. Having used Windows XP for a while I just don't buy that the step from a command prompt to a graphical user interface should be comparable to the step from one graphical user interface to a slightly updated, easier to use version of the same interface :)

Windows XP ships in two versions: Home and Professional, where Home is the cheap, trimmed-down version. The Pro version pretty much matches the functionality which we know and love from Windows 2000 Professional; the Pro version is the one to get if you're serious about your computing experience.

Installation

I won't comment too much on the installation process besides mentioning that it's very easy to deal with. Windows XP even recognizes a lot of RAID controllers, which are becoming increasingly mainstream as they come integrated on many motherboards these days. A general observation about the new OS is that a lot of drivers come integrated into the OS. I was able to get Windows XP up and running with all my hardware configured without adding a single driver to the system. A very impressive feat.

The installation is reminiscent of the one found in Windows 2000 and Windows Me; it's basically a hybrid of the two.

Having completed the installation you notice how quickly the system boots. As a Windows 2000 user I'm used to turning on the computer and waiting at least a couple of minutes before the system is up and running; even with 2000 running you have to wait additional time for the various startup programs to launch. Windows XP boots in half the time and seems more zippy than Windows 2000.

The New Stuff

What constitutes this seemingly "great" leap forward for Windows XP? This is what I'm going to find out in this section by visiting some of the new features found in Windows XP.

Look and Feel

The most obvious new thing about Windows XP is the user interface, which has had a huge overhaul. From being gray, boring, and all business it's gone to exciting, colorful, and playground-y. It really is an interface you just want to explore and click your way through. As we all know a picture says more than a thousand words, so take a peek at the screenshot of the new desktop I've included. Notice the gorgeous, modern feel of the desktop.

What lies at the heart of this user interface are skins. You can go out to the Internet and download new skins for Windows XP which change the look and feel of the OS. You only get a single skin out of the box but you do have the opportunity to go back to the classic look of Windows 2000 if you want to.

Besides the very prominent new user interface there are a couple of nice little features which give the OS a feeling of completeness - when it works; more about that later.

The control panel has gotten an overhaul as well and as a result we've gotten a different approach to managing the computer. Microsoft decided to go with a more task-based approach which to my mind works well for newbies but starts to break down for people who know what they're looking for. Luckily it's possible to revert back to the original view of the control panel just as was the case with the desktop. Very nice!

Windows Explorer is another area of Windows which has gotten a face lift, which means that the tips you get at the right side of the screen are actually useful. The area is now used to display relevant operations for the selected object, help, and much more. New users especially will find the new interface appealing because of the readily available help. For experienced users this can be turned off as well. Notice the highlighted blue area on the screenshot of Windows Explorer: it shows a selected drive. It's little things like that which make Windows XP feel so complete.

All these skins and flashy features come with a price of course. The price is a performance hit. The animated menus, the shadows on icons, etc. all cost. It becomes even more apparent should you move to an old machine which doesn't completely live up to the recommended Windows XP specs. Luckily Microsoft did foresee this scenario and included options to turn off the flashy stuff, conserving system resources for the important stuff like getting your work done. You get a nice granular option list from which you can turn off individual visual features.

Most people are familiar with Windows Update at this point. The service is integrated right into Windows XP making it very easy to get those updates installed. Windows XP will now automatically notify you if there are available updates which by itself is nothing new as you've been able to do this with an add-in for Windows 2000 from Microsoft. What's new is the fact that Windows itself will download and install the updates for you without any intervention from you at all. Again something new users of Windows will find particularly useful with more and more viruses and other nasty stuff floating around the Internet.

Speaking of the Internet, Windows XP now comes with a built-in firewall which is turned on with a single mouse click. It's important to remember that the built-in firewall is nowhere near as powerful as a standalone solution like ZoneAlarm from ZoneLabs. You're still free to install a third party firewall if you want to. On a positive note the built-in firewall is better than no firewall at all, though some people feel that it can provide a false sense of security. For me personally it's a nice addition because it helps protect the less savvy people, thus giving the users a better online experience. I did try to install ZoneAlarm to see if it would run at all. The only problem I encountered was the fact that ZoneAlarm wouldn't start with Windows on every boot even though I configured it to do so. A small error - whether it was a problem with Windows blocking other firewall applications or just an incompatibility between the products I can't say. For Microsoft's sake I hope it's the latter, as the former would cause quite the outcry.

In the security department I need to mention driver signing, a process which has been around for a while. Users were worried for a while that Windows XP would be unable to use drivers not signed by Microsoft. Luckily I can attest that this is not the case; Windows XP has no problems installing unsigned drivers at all. XP simply made me aware of the fact that I was installing an unsigned driver and proceeded from there. Although unsigned drivers are accepted by Windows XP, a number of warnings appear which might confuse the user; only time will tell whether this will become an actual problem.

Worth mentioning is the fact that many drivers from Windows 2000 are directly compatible with Windows XP which will ease the transition somewhat though not completely. As with any new OS release drivers will be an issue so watch out for hardware compatibility with Windows XP before you buy.

A lot was done to ease the transition and I've not experienced any major issues other than ZoneAlarm not starting with every boot. Windows XP itself has a trick up its sleeve which should make the most stubborn programs run: compatibility mode. Basically the compatibility feature will make a program believe that it's running under a different version of Windows than XP, thus allowing it to run.

When errors do occur Windows XP will provide Microsoft with information about the error, which they can use to create a better OS in the future or release fixes more quickly. A general tendency in Windows XP is that features which provide Microsoft with information are opt-out. Privacy concerns aside, I believe that you should at least be given the choice up front instead of being required to go digging in preferences to turn it off if it bothers you.

The Bad

Error Reporting is very telling as to how Microsoft decided to implement features in Windows XP. Personally I'm not very keen on the "I know what's best for you" mentality which permeates Microsoft these days. A number of programs are installed out of the box and integrated right into Windows. The first example of this was Internet Explorer, which came with Windows 98. Officially the explanation was that it created the platform for a lot of the new features in Windows 98 - many didn't buy this explanation and saw it as a way to compete unfairly with the then top browser company: Netscape. We all know how that particular piece of history turned out.

Now it seems the time has come for Microsoft to attack the instant messaging market head on. Microsoft has had a presence in the IM market for a while but hasn't made any particular inroads into it. This is about to change with Windows XP as Windows Messenger comes bundled with the OS; Messenger of course is Microsoft's idea of what an IM client should be. I'd never used Messenger myself but thought I'd give it a shot with it already installed and good to go. After a couple of days the verdict was in: It has to go!

Luckily for me every program and feature installed in Windows is still controlled in the add/remove programs portion of the control panel, so I went there and started looking for Windows Messenger. Interestingly enough, Windows Messenger is nowhere to be found in add/remove programs, not even under the OS features where it belongs. Further digging revealed that you can't even prevent Messenger from starting with the OS. Fine, so I tried to close Messenger, but Microsoft apparently finds Messenger to be such a useful tool that it automatically starts with Outlook Express as well, an application I use all the time.

Maybe it's just me, but I'm starting to see an uncanny parallel with the Netscape saga emerge here. I need to set the story straight and mention that I did find a way to remove Messenger some days later, but the "fix" involved editing a text file hidden in the Windows folder itself. A lot of people don't want to or are not capable of doing this, which effectively means that Windows Messenger is here to stay. I'm no expert on good marketing behavior but I'm pretty sure that this isn't it.

As previously mentioned, a couple of bugs crept into the RTM version. One such bug is the system tray, which doesn't seem to remember the settings you give it or reliably launch the programs it should on every boot. I also tried associating a file type with an external program from inside the program itself, but that didn't work and Windows XP still had control of the file types in question. To remedy this I needed to right-click the file and select "always open with". Not the most intuitive way to do it if you ask me.

The last area of critique is probably the most notable one: product activation. We all know the key principle from games like Quake and Half-Life, where the game simply refuses to launch if you don't supply it the correct key. Now imagine a world where you're only able to do a thousand changes to your game config; after that you'd need to call up id Software or Valve to get a new key for your game. This is the reality we're facing with Windows XP. Microsoft added product activation in an attempt to stop piracy of Windows. Unfortunately it's the consumer who pays the price in the form of a less flexible OS.

Microsoft did slacken their security a little bit by allowing you to do a thousand minor changes to your config before requiring a reactivation. What constitutes a minor change then? Well, Microsoft is very tight-lipped about that, leaving the customers hanging. That Microsoft released another version of Windows XP which doesn't require product activation is a different story. How Microsoft can believe that hackers, crackers, and pirates won't go ahead and use this version instead of the one protected by product activation is beyond me, but they must know something that I don't.

All in all you end up with a strange feeling having shelled out the big bucks for a Windows license. You don't really own the product and you can't really do with it what you want. All the while the pirates are having their way with the enterprise version of the products.

In Conclusion

When everything is said and done, all arguments weighed, I still end up with a pretty good feeling. Of course some things could be done better or differently. That's the reality of creating a standard product used by millions; you simply can't hit the mark for everyone.

Technically, Windows XP is the long awaited combination of the Windows gaming OS Windows 98 and the more business-minded Windows 2000, and Microsoft has pulled off a product which will appeal to the masses.

You will pay a premium for a license if you want the latest and greatest from Microsoft, but in return you'll get a nice environment for your work and gaming needs. There's no doubt in my mind that a couple of hundred bucks are better spent on Windows XP than on a piece of hardware which will be obsolete in six months anyway. An OS simply stays around for much longer.

Windows XP is expected to hit the streets October 25th 2001.

posted on Thursday, 20 September 2007 22:10:34 (Romance Daylight Time, UTC+02:00)  #    Comments [0] Trackback
# Monday, 17 September 2007

Announcing the next meeting in Aarhus .NET Usergroup. Be sure to mark your calendar for Wednesday 26/09 18:00. Please note the new time for the meeting which is 18:00. The last meeting ran pretty late with the discussion going strong when I left at 23:00 so to accommodate that we decided to move the time forward a bit.

Leave a comment to sign up for this meeting.

Practical Information

The meeting will be held:

Wednesday 26/09 18:00

at:

Kristelig Fagbevægelse

Sintrupvej 71

8220 Brabrand

Map

Agenda

Usergroup News

To keep everybody informed on the various stuff going on we'll begin with a short update on planned sessions, new initiatives, and so forth. This will also be your chance to give us some feedback on what you would like to see at future meetings or voice your interest in presenting a subject matter yourself.

BizTalk and Enterprise Service Bus, Troels Riisbrich Underlien

The term Enterprise Service Bus (ESB) has as many definitions as SOA itself. Depending on the company and the person, you'll get a different explanation of what an ESB is. We'll take a look at what an ESB can be, how it can help us, and how it fits into a service oriented architecture. Furthermore we'll dive into how Microsoft recommends putting together an ESB and how we at Vertica have gone about implementing the ideas in concrete projects.

Tour de Krifa

The CTO of Krifa will give us some history on Krifa and how they go about developing their internal solutions.

The Nutcracker

Open mic. This is your chance to get the discussion going on a topic interesting to you. Last time we discussed how to go about getting certified on .NET, techniques for reading and retaining the information, and books to get. So feel free...

posted on Monday, 17 September 2007 14:59:29 (Romance Daylight Time, UTC+02:00)  #    Comments [13] Trackback
# Tuesday, 11 September 2007

To my delight the next beta of Windows Live Writer has been released. It's going to be the last beta before a final release too so go check it out. I switched from BlogJet to Windows Live Writer when the very first version was released and I haven't looked back since.

I was disappointed to find that the dictionary didn't work for me in beta 2 due to the fact that Live Writer reads the current locale of the computer and enables or disables the dictionary accordingly; of course the workaround did remedy this but it's still annoying to have to apply a third party app to every single Live Writer installation you do when you move from computer to computer as I do. Luckily beta 3 fixes this problem and no workaround is needed anymore. Yay!

Interestingly, a new installer has been added to the product which I'm not too sure I like. Basically it's a Windows Live Installer which pimps the other Live products such as Messenger, Mail, etc. Although I don't like it, I do like the fact that it advertised Photo Gallery - a product I've been looking to get my hands on for some time. Photo Gallery doesn't really add anything on top of what you get in Windows Vista other than the fact that it runs on Windows XP as well. I had hoped that Flickr integration would be in there but that's sadly not the case. It does provide a publishing feature but only Live Spaces is supported for pictures and Soapbox for videos. Bummer.

Other stuff includes:

  • Insert videos using our new 'Insert Video' dialog
  • Upload images to Picasaweb when publishing to your Blogger blog
  • Publish XHTML-style markup
  • Use Writer in 28 additional languages
  • Print your posts
  • Justify-align post text
  • Better image handling (fewer blurry images)
  • Resolved installation issues from last release
  • Many other bug fixes and enhancements 

Download Windows Live Writer Beta 3

posted on Tuesday, 11 September 2007 12:52:29 (Romance Daylight Time, UTC+02:00)  #    Comments [0] Trackback
# Monday, 10 September 2007

When Max announced that he was leaving Microsoft a while back it was a good/bad news kind of thing. Bad news because Max had been my access to the Commerce Server team for a while, and very good news because he announced that he would be providing Commerce Server training for the masses; a market that has left a lot to be desired over the years.

A short while after the announcement Microsoft let us know that they would be running a training course on Commerce Server with none other than Max doing the training. Needless to say, I was sorely tempted to go, but in the end we decided against it due to the traveling involved and the lack of information regarding the tech level of the course. It all worked out quite nicely, though: we enquired as to whether he'd be interested in coming to Denmark and doing some training for the entire e-commerce team at Vertica, and he accepted.

Let me start out by saying that I'm extremely impressed with the material and the way he handled himself the entire time, both before he got here and when doing the actual training. There's no doubt in my mind that Max provides the single best source of training on Commerce Server today, bar none. Our team consists of people with varying degrees of Commerce Server experience, and he managed to organize the training in a way which kept both the proficient and less proficient interested. Even before he got here he asked us to come up with very specific areas on which to focus the training, and he kept tweaking and tuning it on the fly based on our feedback. Very nicely done indeed.

So what gives Max Akbar the edge over the competition? Well, first off he's worked with Commerce Server on actual projects for customers such as CostCo, Costco.ca, GAP, and Banana Republic, which definitely gives him a unique perspective on CS solutions. Additionally he was part of the CS team as a program manager. Both of these facts color his outlook, which means that he's got a definite enterprise-y take on things. Not surprisingly, enterprise in the US definitely doesn't equal enterprise in Denmark, and getting a perspective on that part of the story was very interesting to say the least. It also helped keep the training relevant and interesting because he was able to relate most of the material to real-world scenarios, albeit on a much larger scale than we're used to. Finally we got some interesting insights into the inner workings of the Commerce Server team, something that helps us understand why a particular feature in the product is done the way it is. It almost felt like getting into the psychology of the product :)

The training consisted of three days' worth of tightly packed information. Rather than regurgitating every note I took, I'd rather focus on the highlights; there were more than a few too :)

Cactus

The whole Cactus affair left me a bit confused, mainly because Ryan Donovan posted that Microsoft is committed to Commerce Server as a product, only to finish off that particular announcement with the fact that they're effectively outsourcing development of the product to Cactus Commerce. Now what's interesting here is that Microsoft does this with other products too; it's just never clear which ones they are. In the case of Commerce Server, Cactus has actually been involved in the development of the product, even in version 2007, so what's happening here is the logical extension of that. Whether it's a good or a bad thing remains to be seen, but the fact of the matter is that when Microsoft announces that they're committed to a product like they did with Commerce Server, they're committed for years to come, so I'm not too worried here. Now the story might be very different if Vertica were based in Canada, which Cactus calls home, because we'd be competing with the company which effectively controls the Commerce Server source code. Not exactly what I'd call equal footing and definitely not something that works very well with the Microsoft partner strategy.

Management API of Commerce Server

Shifting gears completely, we learned that the old Commerce Server 2002 APIs are still available in 2007; the Commerce Server team just doesn't advertise this fact very loudly. Basically it's possible to use many of the well-known management samples from the 2002 installation, so be sure to take a look at those if you need to automate deployment of sites and stuff like that. You'll find what you need in the SDK\Site Management directory under the Commerce Server 2002 installation directory.
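Just to make the idea concrete, here's a little sketch of driving those old COM-based management objects from C# via late binding. Be warned: the ProgID and member names below are my assumptions based on the 2002-era samples, so double-check them against the actual scripts in SDK\Site Management before relying on anything here.

```csharp
using System;
using System.Reflection;

// A minimal sketch, assuming the 2002-era "Commerce.SiteConfigReadOnly"
// COM object and its Initialize/Close members are still registered on the
// box. Names are taken from memory of the 2002 samples - verify against
// the scripts in SDK\Site Management.
class SiteConfigDump
{
    static void Main(string[] args)
    {
        string siteName = args.Length > 0 ? args[0] : "StarterSite";

        // Assumed ProgID from the Commerce Server 2002 samples.
        Type configType = Type.GetTypeFromProgID("Commerce.SiteConfigReadOnly");
        object config = Activator.CreateInstance(configType);

        try
        {
            // Assumed method: load the configuration for a named site.
            configType.InvokeMember("Initialize",
                BindingFlags.InvokeMethod, null, config,
                new object[] { siteName });

            Console.WriteLine("Loaded configuration for site: " + siteName);
        }
        finally
        {
            // Assumed cleanup call, mirrored from the VBScript samples.
            configType.InvokeMember("Close",
                BindingFlags.InvokeMethod, null, config, null);
        }
    }
}
```

The nice thing about late binding is that you can port the VBScript samples almost line for line without hunting down an interop assembly first.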

Tools, Tools, Tools

Max has taken the time to write a lot of useful tools and utilities for Commerce Server 2007, many of which he's already mentioned on his own blog, like PackageThis for creating stand-alone versions of the documentation from the MSDN web site.

More interestingly he's created a tool called Secure Commerce Server 2007 Tool which automates the entire security configuration process, setting role membership on everything from the database and file system to Authorization Manager stores. Unfortunately the GotDotNet page is down, but hopefully he'll get around to creating a Codeplex site for it soon. It takes my own idea of simply scripting the database security permissions to a different level for sure.

How many times have you needed to extract a file from a PUP archive and had to do a custom unpack just to get at that single file? Whenever I've gotten into that particular situation in the past it's been a pain, so I was very glad to learn about PUPViewer, which, not surprisingly, lets you view the contents of a PUP file and, better yet, extract that annoying little file you were missing.

Secrets of Commerce Server

OK, so not so much a secret as a good tip: take a look at the contents of the installation directory of Commerce Server 2007. Chances are that you'll find some interesting stuff which is not listed in the start menu. I'd not even thought about doing so myself, but in truth I've been missing out because of that. Among other things, the \tools folder contains tools for automating import and export and for resynchronizing scopes in AzMan stores. You've probably taken a look at the \sdk folder, but if you haven't you need to. Interesting stuff in there for sure.

Staging Service

The most underappreciated feature of Commerce Server 2007 is the staging service. What you can do with this thing is move data from one environment to another, basically automating a task which has typically been quite complex in the past. An example would be to allow business users to edit catalogs in a staging environment and then push the catalog into production once they're happy with their work. Not only does this alleviate some of the complexities of deploying business data, but it also allows for some interesting deployment scenarios, e.g. having the staging environment on the LAN and the customer store front at a hosting provider, allowing for a very smooth experience for both the business users AND the actual customers. I'll definitely look into the various uses of this one some more. Unfortunately it's only part of the enterprise edition, and it doesn't support a truly flexible deployment model because you need an enterprise edition on each of the servers you deploy it to.

Interestingly, the staging service is useful for solutions other than Commerce Server ones because it also allows you to move files from one server to another; add to this the fact that you can run tasks before and after moving the files, and you've got yourself a very powerful system for doing scheduled deployments. Basically you can keep your hands off of your production environment if you get this right.

Scopes in Authorization Manager

Role-based security is a well-known technique, but AzMan introduces another layer on top of it (it actually introduces two, but the other one isn't interesting here). I call this additional layer business data security. It's my own self-invented term, so bear with me if the meaning isn't clear. Basically what scopes allow you to do is define security on the data itself instead of on the functions of your application. This is hugely useful in scenarios where you want very tight control over your users and your business data. I've already got a couple of instances where this will be useful, so I'm definitely glad I got it cleared up. There's no magic involved in the process: if we take the catalog system, new scopes are created whenever you create a new catalog, properties, etc. The secret sauce is a naming convention which means that the catalog subsystem knows whether a user has access to view a particular catalog, e.g. a user would have to be assigned to the CatalogScope_<CatalogName> scope. Easy, isn't it :)
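To illustrate the naming convention (and this is a toy illustration, not the actual Commerce Server code), the check boils down to composing the scope name from the catalog name and testing membership. The scope prefix comes from what we were shown; the rest of the plumbing below is hypothetical, with the AzMan lookup stubbed out to keep the sketch self-contained.

```csharp
using System;
using System.Collections.Generic;

// A toy sketch of the CatalogScope_<CatalogName> convention described above.
// In a real solution the user's scopes would be read from the AzMan store;
// here they're hardcoded so the sketch actually runs.
class CatalogScopeCheck
{
    const string ScopePrefix = "CatalogScope_";

    // Stand-in for the scope assignments an AzMan lookup would return.
    static readonly List<string> userScopes = new List<string>
    {
        ScopePrefix + "SummerCatalog"
    };

    static bool CanViewCatalog(string catalogName)
    {
        // The whole trick: security is keyed on the data (the catalog name),
        // not on a function of the application.
        return userScopes.Contains(ScopePrefix + catalogName);
    }

    static void Main()
    {
        Console.WriteLine(CanViewCatalog("SummerCatalog")); // True
        Console.WriteLine(CanViewCatalog("WinterCatalog")); // False
    }
}
```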

Data Warehouse Demystified

The last day took us into the data warehousing capabilities of Commerce Server. It's an area we aren't too familiar with, so it was great to get some insight into what makes this feature tick. What DW boils down to is a PUP package with existing cube and DTS definitions; that's pretty much it. Once those are in place, you need to run a little tool to get Reporting Services going by deploying the report definitions to the server. That's it. Having successfully done that you'll have access to the data warehouse capabilities. Do keep in mind that they're only available in the enterprise edition.

Debugging

Nothing new was revealed for us here, but I still think it's valuable to know, so I've included it in this post anyway. Max had a couple of pointers on how to debug problems with Commerce Server. Two tools came up: the tried and true Fiddler and the reliable Reflector. These two have helped us more times than I wish to count.

If you don't know it already, Reflector allows you to peek inside compiled .NET assemblies by decompiling the IL to readable C# or VB. The only thing lost in the translation is the actual variable names, but you still get the idea behind the code. What we use Reflector for is basically finding the right places to plug into Commerce Server when we're doing generic extensions for the product.

Fiddler comes in handy because Commerce Server 2007 introduces a web service API. Fiddler is extremely good for figuring out what goes wrong in a request or simply for understanding how a particular feature works. Take for example the business user applications, which provide access to almost every single part of the CS API. The interesting thing here is that if you can do an operation from the business tools, you can do it programmatically; very useful for figuring out how to accomplish some specific task.

If you're doing any kind of integration with Commerce Server you need Fiddler installed on your machine. Period.
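One small tip in that vein: calls made from .NET code don't always show up in Fiddler on their own, but you can force them through by pointing the request at Fiddler's default proxy endpoint (127.0.0.1:8888). The ASMX-generated proxy classes expose a Proxy property, so the same trick should apply to the Commerce Server web service agents. The URL below is a placeholder, not a real Commerce Server endpoint.

```csharp
using System;
using System.IO;
using System.Net;

// A minimal sketch: route a web request through Fiddler so it appears
// in the session list. 8888 is Fiddler's default listening port.
class FiddlerTrace
{
    static void Main()
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(
            "http://localhost/CatalogWebService/CatalogWebService.asmx"); // placeholder URL

        // Send the call via Fiddler instead of going direct.
        request.Proxy = new WebProxy("127.0.0.1", 8888);

        using (WebResponse response = request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}
```

Once the traffic is flowing through Fiddler you can compare your own requests side by side with what the business tools send, which is usually all it takes to spot the difference.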

In Conclusion

Having Max come to Vertica and do his training has been a very good experience, both for the guys who've been working with Commerce Server for a long time and for the less experienced guys. For me personally it means that I now feel very comfortable with the product because I was affirmed in my knowledge of it at every turn. What Max provided me was insight into why some of the features were done the way they were, along with some tips and tricks which I'd probably never have thought of on my own.

Not only is Max very solidly grounded in Commerce Server, he's also a great guy who's very easy to be around. The casual tone of the training sessions certainly attests to that fact, and I'm sure that we all learned a great deal more because of it. I'm certain that we'll have him back when our team grows even bigger.

So thank you Max and we'll be seeing you :)

posted on Monday, 10 September 2007 16:48:51 (Romance Daylight Time, UTC+02:00)  #    Comments [4] Trackback