# Friday, 24 November 2006

First a little background: iTunes has a limit of five authorized computers, that is, computers which are able to play the tunes you've bought in the store. Apple provides a handy dandy function to deauthorize every one of your computers all at once. This is very useful for people like me who go reinstalling Windows every other day and forget to deauthorize the computer before reinstalling. Sadly the Apple people have decided in all their wisdom to only allow deauth of all computers once a year. WTF?

Tonight I wanted to authorize my new Vista install but lo and behold every single auth was used... by deleted installs. Yay me!

I wrote to Apple support less than two hours ago about this. For the sake of fun I tried authorizing again just now and guess what, they'd actually already reset my authorizations.

Mad props to Apple support for dealing with this in such a timely manner.

posted on Friday, 24 November 2006 22:07:48 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

Remember the post I did about Ryan Donovan's Commerce Server 2007 presentation at TechEd? Well here's a chance to read about how he experienced the whole thing in his post aptly named How NOT to Demo CS2007 (and gain TechEd Infamy). I actually think he did OK considering that everything was against him that day.

posted on Friday, 24 November 2006 15:05:05 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

I posted a picture of Ron Jacobs doing an ARCast at TechEd Barcelona and today I stumbled across a video of that very same thing. In it Ron discusses the Loosy Goosy SOA anti-pattern with Christian Weyer, a pattern which is near and dear to my heart as this anti-pattern above all seems to rear its ugly head the most.

If you want to get a feel for the event check it out. Check out the original post as well while you're at it.

posted on Friday, 24 November 2006 15:00:42 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Wednesday, 22 November 2006

So my colleague Sune, who might eventually get around to updating SQLJoint, finally coaxed me into taking the exam even though I never did get around to actually preparing for it. The only help I had was a Transcender from the COM days. Turns out, not that much has changed in the exam format. Lucky me.

With this one down I'm only a single exam short of achieving MCSD.NET so I may just have to take a look at 70-316 Developing and Implementing Windows-based Applications with Microsoft Visual C# .NET and Microsoft Visual Studio .NET in the very near future. Who knows I might be able to do this one with minimal effort too; one can only hope. This MCSD project of mine has been going on forever so I just want it over with at this point. I really can't stand having unfinished projects running for years and years :)

In case you were wondering about my score it was a clean 700. Work you do to get beyond 700 is wasted right? Right??? :) The score is actually quite a bit lower than my usual ones which normally run in the high 800 to low 900 area but I guess the amount of work I put into this one shows. It also shows that my standard operating procedure of doing a test exam, reading material, doing another test exam really does work.

posted on Wednesday, 22 November 2006 11:58:30 (Romance Standard Time, UTC+01:00)  #    Comments [1] Trackback
# Friday, 10 November 2006

I've done my fair share of posts from TechEd where I've found a technology interesting and wanted to learn more about it, and so I did :) Workflow is something I see a need for in 99% of the projects I've done up until this point and in coming projects I have on the horizon. I won't bore you with the details about WF here; you can however go to the site for WF if you want to get some background information on the technology.

So this was a whiteboard discussion and lots of great questions were asked. The first one:


Why did Microsoft create WF?

They did it because they identified a need for such a thing internally at Microsoft. Lots of their products have some sort of workflow built in, like Exchange (Assign Task to User), SQL Server, and of course BizTalk. A lightweight framework for doing workflow was needed in order to allow it to be used in smaller scale applications such as web or Windows applications.

WF is already in use in some Microsoft products today. Examples are SharePoint Server, nothing too surprising there; however it may surprise you to learn that Speech Server utilizes WF as well. There are other examples but those are the ones he could remember off the top of his head. Also, BizTalk will use WF for orchestrations in the next version.

When should we use BizTalk instead of WF?

It's basically a build vs. buy decision. You can do everything in WF that you can do in BizTalk, it just requires some custom programming. I liken the situation to that of SharePoint Server and SharePoint Services: with Services you get a lot of standard functionality to build on, and with Server you get a lot of extra bits on top of that.

BAM can monitor WF workflows in BizTalk R2.

How do you expose WF to the end user?

There's an MMC snap-in called the Workflow Manager which can display and alter workflows. WF provides a design surface for workflows which can be hosted in any application. It can even be used in a web scenario, as it can save itself as an image and be displayed to the user in a browser that way. You cannot edit in that case, but there are some third party clients which use AJAX to provide a similar experience to the design surface.

How do we create workflows?

You use the design surface or create it in code. A workflow is stored as XAML; the workflow engine loads this file and processes it. As we are only dealing with XAML, workflows can be stored anywhere.
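As a rough sketch, hosting the .NET 3.0 workflow runtime and starting an instance looks something like this (MyWorkflow is a made-up compiled workflow type; treat the details as an assumption, not gospel):

```csharp
using System;
using System.Threading;
using System.Workflow.Runtime;

class Host
{
    static void Main()
    {
        // The host owns the runtime; here a console app, but it could
        // just as well be a Windows service or IIS.
        using (WorkflowRuntime runtime = new WorkflowRuntime())
        {
            AutoResetEvent done = new AutoResetEvent(false);
            runtime.WorkflowCompleted += delegate { done.Set(); };

            // Create an instance from a compiled workflow type; an
            // overload taking an XmlReader loads a pure XAML definition
            // from whatever store you keep it in.
            WorkflowInstance instance =
                runtime.CreateWorkflow(typeof(MyWorkflow));
            instance.Start();
            done.WaitOne();
        }
    }
}
```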


When you start a new process the workflow definition is stored alongside the process. This means that a running process will be unaffected when you update a workflow definition. You can override this behavior and have the definition updated for running processes as well.

How do we create a page flow with WF?

Page flow is a UI driven by WF. You create a page for each state and utilize a state workflow. Rules will determine which step is the next in the flow. A demo was shown of an internal prototype for creating navigation workflows. We might see it in the next version.

Exception Management?

Exception management as in errors in a business process, not as in System.Exception. It's a good way to introduce workflow into legacy applications. When something happens in the existing process you can have it trigger the workflow to handle the exception. From here on the workflow will do the heavy lifting until the issue is resolved. At that point the workflow will pass a message back into the legacy process to tell it that the issue is resolved.


WF supports rules with its rules engine. Rules are a way of externalizing the conditions on which a workflow acts. Complicated rules are more easily handled this way. They are stored in a separate file from the workflow definition, and Visual Studio includes a rules editor when WF is installed. The rules engine can be employed outside the WF framework and run against plain objects if the rules engine is all you need. It supports chaining, meaning that if a rule passed in step 1 but step 3 causes the rule to no longer validate, the rules engine will detect this and fail the process; this behavior can be turned off if you so desire. Rules should be employed if they change a lot. There is some deadlock detection present but this is an area which will be expanded in the future, and a rules API exists if you need to do advanced deadlock detection today.


A tracking service is provided which writes results to the SQL Server datastore. On top of this you can build reports which can be used to display workflows which might be having issues.

How do we create business processes and long-running processes?

I actually asked this question because it's something which has been on my mind as this is the area where I see an immediate need for WF.

Some kind of host is needed, typically either IIS or a Windows service. Choosing one over the other is a question of how much processing the workflow will do: IIS may be busy serving up pages and it would be unfortunate to burden it further with heavy workflows. You would use IIS if the workflow is operating in a request/response environment.

Event driven workflows are the way to go for long-running processes. Basically a no-brainer there.

Restarts are handled gracefully if the reboot is done properly. Turning off the machine without allowing it to shut down will of course not work; the workflow will restart from the last save point in this case. A proper shutdown will cause the workflow to persist itself using the persistence service (which you can create yourself). Power outages and other situations which can't be handled by WF can be dealt with by using System.Transactions.

Multiple consumers can work on a single store of processes to allow for scale-out scenarios. Processes will be logically locked to prevent them from being pulled into multiple consumers.

posted on Friday, 10 November 2006 23:18:40 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

Jeff Prosise of Wintellect and Andres Sanabria, Lead Program Manager for ASP.NET AJAX, did a great presentation on AJAX patterns and what we need to look out for in the brave new world of ASP.NET AJAX. As I've mentioned earlier, Nikhil Kothari called in sick, which was unfortunate.

It's great to see that Microsoft doesn't just throw cool new stuff out there and see what happens. They also take responsibility for the new technology and provide guidance on how to use it and how not to use it. This presentation is a great example of that.

UpdatePanel is a great way to get started with AJAX and it gets the job done in 95% of all cases. That's my guess anyway. What we need to be aware of is the fact that we are only doing partial rendering with UpdatePanel, not partial postbacks as you could be led to believe when seeing the control in action. What actually happens is that the entire ViewState is sent back to the server for processing in order to allow for an experience very similar to that of a normal postback, i.e. the full control tree is instantiated and the entire page life cycle is processed. The only difference shows up when we get to the rendering of the page. Here ASP.NET is smart about the fact that it's dealing with an asynchronous postback (a header is sent along stating the fact) and only renders the parts of the page which are needed. What this translates to is lots and lots of traffic going across the wire, and we need to be very aware of this fact.

Another thing to be aware of when working with UpdatePanel is that it has a property called UpdateMode. This property determines what triggers an update of the control. The default setting is to trigger an update whenever an asynchronous callback happens. Suppose you have four UpdatePanels on a single page: the default would cause each of these to get updated whenever an update is triggered within any one of them. To fix this you set UpdateMode to Conditional, allowing for more granular control of the panel. You can then either specify a number of triggers in the control definition on the page or call the Update() method of the panel you want to update on the server side.
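As a sketch, the markup for two independently updating panels might look like this (all control IDs are made up; NewsTimer would be an asp:Timer defined elsewhere on the page):

```aspx
<asp:ScriptManager ID="ScriptManager1" runat="server" />

<asp:UpdatePanel ID="PricePanel" runat="server" UpdateMode="Conditional">
  <ContentTemplate>
    <%-- Re-renders only for postbacks raised inside this panel --%>
    <asp:Label ID="PriceLabel" runat="server" />
    <asp:Button ID="RefreshPrice" runat="server" Text="Refresh" />
  </ContentTemplate>
</asp:UpdatePanel>

<asp:UpdatePanel ID="NewsPanel" runat="server" UpdateMode="Conditional">
  <ContentTemplate>
    <asp:Label ID="NewsLabel" runat="server" />
  </ContentTemplate>
  <Triggers>
    <%-- Updated only when the timer ticks, via an explicit trigger --%>
    <asp:AsyncPostBackTrigger ControlID="NewsTimer" EventName="Tick" />
  </Triggers>
</asp:UpdatePanel>
```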

Now what can we do about the ViewState being sent back and forth? The answer is webservices. When invoking webservices in ASP.NET AJAX, data is sent across the wire serialized using JavaScript Object Notation (JSON). So you either expose an ASMX endpoint on the server or create a page method and then call that from the client. The framework will provide a proxy you can use to interact with the service but you'll have to act on the data coming back from the server yourself. Not too much work, and it's a very valuable technique if you need to minimize traffic. In case you were wondering, standard ASMX endpoints are able to serialize their output using JSON because of an extension installed with the ASP.NET AJAX Extensions. You can try this out if you have the extensions installed by adding /js to the end of the endpoint address, e.g. endpoint.asmx/js.
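A minimal sketch of such a JSON-speaking endpoint, using the attribute names from the released ASP.NET AJAX Extensions (the service and method are made up):

```csharp
using System.Web.Services;
using System.Web.Script.Services;

// [ScriptService] is what makes the extensions generate a JSON-speaking
// JavaScript proxy for this ASMX endpoint.
[ScriptService]
public class Products : WebService
{
    [WebMethod]
    public string GetProductName(int id)
    {
        // Hypothetical lookup; only this return value crosses the wire,
        // serialized as JSON, with no ViewState round trip.
        return "Product " + id;
    }
}
```

On the client the generated proxy is then called as Products.GetProductName(id, onSuccess, onError).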

A thing that has been on my mind regarding all this AJAX goodness is how useful it really is if we cannot maintain browser history to make the back and forward buttons work. Also, not having URLs which the user can copy and paste from the address bar is a pretty critical issue. Have no fear: Nikhil Kothari to the rescue with his HistoryControl which tracks events in our AJAX apps. It'll even keep the browser URL updated with state information so that the user can use the URL in a portable manner like she is used to. You can read more in Nikhil's post Back Button Support for Atlas UpdatePanels. This addresses my biggest concern with AJAX as we just can't break the conventions people have gotten used to over the past years. Especially not in a customer facing web frontend such as a retail site. An internal business site is a bit more forgiving due to the fact that we can perform user education, but it would still be better not to break anything.

They also introduced a new concept called structured scripting, which is basically a fancy term for wrapping the AJAX code in a custom control or behavior. A control is a full blown client/server control, while a behavior is some wrapped JavaScript you can attach to another element, basically an extender control as I understand it.

One interesting thing is that the AJAX library introduces a client-side page life cycle kind of like what we know from standard ASP.NET. It's complete with events we can hook into and do all sorts of interesting stuff.

posted on Friday, 10 November 2006 22:28:19 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

OK, so we are at our final day of TechEd and I thought it pertinent to go listen to a session on security, with web app security being so much in focus these days. Keith Brown of Pluralsight did the presentation, which amounted to a good talk about well-known attacks we need to be aware of like SQL injection, cross site scripting, and finally SQL truncation attacks, which is a new thing. Really not much new there, but he did provide some techniques for dealing with them.

Of course we all know that the way to go with SQL injection is to always use parameterized queries and that's the end of it. Basically what we need to be aware of is that string concatenation will cause problems at some point, so try to avoid it. We don't have many instances in our applications where we absolutely need dynamic queries, and I think the same goes for many others.
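For illustration, the parameterized-query advice boils down to something like this (the table and column names are made up):

```csharp
using System.Data.SqlClient;

static int GetUserId(string connectionString, string userName)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(
        "SELECT UserId FROM Users WHERE UserName = @name", conn))
    {
        // The value travels as data and is never spliced into the SQL
        // text, so input like "'; DROP TABLE Users; --" stays harmless.
        cmd.Parameters.AddWithValue("@name", userName);
        conn.Open();
        return (int)cmd.ExecuteScalar();
    }
}
```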

Cross site scripting is actually an area which I haven't given much thought, probably because of the inherent nature of the systems I do. Cross site scripting is what happens when we allow a user to input unsafe values, such as HTML and script code, into an application. If we persist this data in, say, a forum application, we'll end up in a situation where the unsafe input may be presented to other users of the site, and that will cause problems. The answer of course is to filter unsafe data. Microsoft patterns & practices has a tool which will help you do so, called the Microsoft Anti-Cross Site Scripting Library; quite the mouthful too :) It's a basic class with two methods for filtering and replacing in a string before you store it or send it to the user.

Also, Microsoft has a tool for doing threat modeling. I think we all do threat modeling at some level but seldom get around to putting the information into structured form. Microsoft Threat Analysis & Modeling v2.0 allows us to store the information in a structured manner, and it will even help us by analyzing and comparing the information to a threat database of known exploits in order to visualize the application and its weak spots.

Main points of the talk:

  • Consider user input dangerous
  • Place input values into strongly typed variables, i.e. parse into an int age instead of working with the raw Request.Form["Age"] string
  • String concatenation in concert with any form of SQL is dangerous
  • Consider user input dangerous; it really comes down to users wanting to exploit our applications
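The strongly typed variables point can be sketched like this (the field name and range check are made up for the example):

```csharp
// Parse user input into a typed variable up front instead of passing
// the raw string around. int.TryParse rejects anything that isn't a
// plain integer, so script or SQL fragments never make it past here.
static bool TryReadAge(string raw, out int age)
{
    return int.TryParse(raw, out age) && age >= 0 && age < 150;
}

// Usage (raw would come from Request.Form["Age"] in a real page):
// if (!TryReadAge(input, out age)) { /* reject the request */ }
```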
posted on Friday, 10 November 2006 21:51:40 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Thursday, 09 November 2006

I attended another whiteboard discussion on the subject but elected not to post anything about it because it really was a waste of time. I chose to try my luck one more time as the moderator of the session would be Nikhil Kothari. Unfortunately it turned out that he had fallen sick, and so we had a couple of other guys from the team take his place. Now this wasn't a bad thing by any means. Their names elude me right now, but we had a Product Manager for AJAX and the developer who did UpdatePanel, which you will know about if you've done anything with ASP.NET AJAX. Both were very knowledgeable on the subject and we had a great discussion. We were kind of all over the map, with a lot of focus on both ASP.NET 2.0 in general and of course ASP.NET AJAX.

They set the record straight about the composition of the AJAX package, something I've already done a post about. Other than the general discussion I took away three things:

ControlState is a new construct in ASP.NET 2.0 which I'd forgotten everything about. If you are a control developer you need to keep this one in mind. We all know that ViewState can be rather bloated, so we turn it off in places we don't need it. Now I don't know about you, but I've done my fair share of controls which will explode if someone turns off ViewState on them, simply because that was the only option we had in the past for saving temporary data. With ControlState, however, the story is different: you simply cannot turn off ControlState. This means two things for us: 1) we can happily go on creating our state bound controls, 2) we should take care to only store what is absolutely critical for the control to work in ControlState. Think about the DataGrid: what if someone decided to place its state data exclusively in ControlState? Not a pretty picture, I'm sure.
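A sketch of what opting into ControlState looks like for a control developer (the control and its one critical field are made up):

```csharp
using System;
using System.Web.UI;

public class PagerControl : Control
{
    private int currentPage;

    protected override void OnInit(EventArgs e)
    {
        base.OnInit(e);
        // Opt in; unlike ViewState, this cannot be switched off by the page.
        Page.RegisterRequiresControlState(this);
    }

    protected override object SaveControlState()
    {
        // Store only what the control cannot function without.
        return new Pair(base.SaveControlState(), currentPage);
    }

    protected override void LoadControlState(object state)
    {
        Pair p = (Pair)state;
        base.LoadControlState(p.First);
        currentPage = (int)p.Second;
    }
}
```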

== is not ==

What do I mean by that? Well it turns out that JavaScript has a very funky implementation of our dear == and != operators: they try to do some magic, coercing the operands to create maybe a better comparison? Whatever the case, the end result is that you get some very weird results when using == and != in JavaScript. The rule of thumb is to use === (yes, triple equals) and !== because they behave much more like what we are used to in C#. That's certainly something to keep in mind.
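A quick sketch of the coercion weirdness, runnable in any JavaScript engine:

```javascript
// == coerces types before comparing, with surprising results:
console.log(0 == "");           // true: "" coerces to the number 0
console.log(0 == "0");          // true
console.log("" == "0");         // false, so == isn't even transitive
console.log(null == undefined); // true

// === compares without coercion, much closer to C#'s ==:
console.log(0 === "");          // false
console.log(0 === 0);           // true
```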

Finally, Nikhil has some very cool tools, and I'd like to point your attention in the direction of his tools page and Script# in particular. Script# is a tool written in frustration with JavaScript: Nikhil wanted a way to write JavaScript without writing JavaScript. The solution? Create a tool where you write C# code and let it convert that to JavaScript. For a person like me who's done very little clientside development, and certainly never any system with an actual JavaScript strategy, it looks like a great tool. I will certainly be doing some experimentation in the very near future.

Someone asked a question about how you deal with multiple concurrent HTTP connections from an AJAX frontend. Browsers today pretty much limit you to two connections per domain by default. So what do you do if you need more than that? A suggested trick is to create multiple subdomains and have your website respond to requests on additional host headers, tricking the browser into believing that the requests go to different servers. A technique used by local.live.com, for example.

Regarding performance there's one caveat that you should know about when dealing with AJAX and even plain ASP.NET: never deploy a site with debug set to true. It's always been true for ASP.NET, and now even more so. The reason is that AJAX comes with two sets of scripts: one for debug mode and a more efficient one for release mode. They are equivalent in functionality, but the debug scripts include lots more code and, as you can imagine, they're in human readable form. What AJAX does to optimize performance in release mode is replace variable names with shorthand names in order to speed up parsing of the scripts. It will also remove some type checking which, under debug mode, verifies that the correct types are passed to the AJAX library functions. Finally, whitespace is removed to keep the scripts lightweight and, again, to speed up parsing.

posted on Thursday, 09 November 2006 20:02:25 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

Next up was Fritz Onion with a presentation on the different tooling we can use for debugging AJAX apps while we wait for the Orcas release of Visual Studio. The session amounted to an overview of the different extensions for IE and Firefox available in this space. What was great about it is the fact that he gave a very good presentation; he speaks in an engaging manner and gets his points across easily. Not much to write home about content-wise, we were basically given a demo of the stuff I list further down, but it did give me a great starting point as to which tools to try out, thus getting me started relatively easily with creating our AJAX apps. It's also a good thing to have a couple more tools on your radar should you end up in a situation where you need them.

The tools available today include:

Visual Studio Javascript Debugger

Microsoft Script Debugging

Firefox Javascript Debugging

Firefox DOM Inspector


  • You will need this one

JavaScript Shell

  • A shell for exploring Javascript code as well as executing little snippets


Fiddler

  • HTTP proxy which intercepts the traffic on port 80 and displays it. You can even modify the request to do some testing or hack proofing if you so fancy.

ASP.NET Development Helper

  • Main function is to decode ViewState and show its contents. You also have a DOM explorer in there.


TcpTrace

  • Does the same thing as Fiddler but it doesn't install itself as an HTTP proxy, which is both a good thing and a bad thing: a good thing if you don't want some debugging tool to take over your system, a bad thing because you will have to do a couple more steps to get it working. I'd probably stick with Fiddler.

IE Developer Toolbar

  • Probably doesn't need further introduction

Web Developer Toolbar

  • Same for this guy. Though one thing to note is that it has so much stuff that it gets hard to find what you're looking for in the menus.

Tamper Data

  • Surprisingly enough it allows you to tamper with the request, like Fiddler and TcpTrace do.
posted on Thursday, 09 November 2006 19:31:38 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

First session of the day was Anders Hejlsberg on LINQ to SQL. As the conference has proceeded my interest in LINQ has increased and this session was the culmination of that. LINQ to SQL is basically an OR mapper and a very impressive one at that. As I wrote in my post regarding the LINQ project in general you get the ability to do integrated queries in the .NET languages.

What LINQ to SQL provides you with is the following:


DataContext

You'll derive from this class in order to create a strongly typed representation of your own database. This effectively becomes the manager of your database, handling stuff like object references and primary keys. More on this later.

Entity classes

For each table you get a class representing that table. You can choose a number of ways to attack the mapping of tables to objects: you can use the visual designer in Visual Studio to drag and drop tables from your SQL Server datasource onto the design surface, which creates the actual code for the class; you can go class-first, where you create your classes first and then map them to your datastore either by adding attributes to the members of the class or by externalizing the mapping in an XML file; or finally you can opt for the full blown mapping provided by a tool called SQLMetal, which connects to your datastore and creates classes for every single table in there.
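A sketch of the class-first, attribute-based approach (all names are made up, and the namespaces are those of the eventually released LINQ to SQL bits, which differ slightly from the 2006 preview):

```csharp
using System.Data.Linq;
using System.Data.Linq.Mapping;

[Table(Name = "Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true)]
    public int CustomerId { get; set; }

    [Column]
    public string Name { get; set; }
}

// The strongly typed DataContext you derive yourself:
public class ShopDataContext : DataContext
{
    public ShopDataContext(string connection) : base(connection) { }

    public Table<Customer> Customers
    {
        get { return GetTable<Customer>(); }
    }
}
```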


Relationships

Of course relationships are observed and properties are created to map those as well. For example you have the classic customer and orders mapping where a customer has many orders. The orders would be translated into a collection available on the customer class. Very nice indeed.


Querying

This wouldn't be LINQ if you weren't able to query your data. LINQ provides a very rich model for querying data, be it SQL, XML, objects, DataSets, and more. You can do all kinds of crazy stuff not available in SQL, e.g. you can use the output of a stored procedure to further query that data; it's all handled by LINQ, which does some additional manipulation in memory. Ad hoc joins are available like you would expect. Finally, LINQ uses a notion of deferred execution for executing the queries. What this basically means is that a query won't get executed until you actually need the output. So defining the query doesn't execute it; it won't get executed until you actually use the results it produces.
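Deferred execution is easiest to see with LINQ to Objects (the data here is made up):

```csharp
using System;
using System.Linq;

class DeferredDemo
{
    static void Main()
    {
        int[] numbers = { 5, 8, 13, 2 };

        // Defining the query runs nothing yet...
        var big = from n in numbers
                  where n > 4
                  select n;

        numbers[3] = 99; // ...so later changes are picked up...

        // ...because execution happens here, when we enumerate:
        foreach (int n in big)
            Console.WriteLine(n); // 5, 8, 13, 99
    }
}
```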

Of course the language constructs you use to create your queries are the same regardless of the data you are querying, so you only need to learn a single set of operations in order to work with all the supported data stores.


Change tracking

For updating and inserting data, LINQ provides a mechanism for keeping track of the changes. You can go nuts and do a whole bunch of updates and nothing gets written until you call the SubmitChanges() method. Pretty nice to be able to do batches in this manner.
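Roughly, using the final LINQ to SQL method names (ShopDataContext and Customer are made-up names for a mapped database):

```csharp
using (ShopDataContext db = new ShopDataContext(connectionString))
{
    Customer c = db.Customers.Single(x => x.CustomerId == 42);
    c.Name = "New name"; // change is tracked, nothing written yet

    // Nothing hits the database until this call, which sends the
    // pending updates across in one batch:
    db.SubmitChanges();
}
```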

Stored Procedures

Procs are supported by creating methods representing each stored procedure on the DataContext class. Each method returns a strongly typed result set which contains the output of the proc. Say for example that you have a proc which returns CustomerName and CustomerAddress; your result would then have properties with the same names. No more looking at a method which returns a DataSet, then having to go to the database and look at the output of the stored procedure in order to figure out which columns you have to work with.

I was very curious as to how LINQ to SQL goes about getting data across, and I'm very pleased to report that all my concerns were put to rest. For example, I was wondering whether data projection would yield lots and lots of different anonymous types flying all over the place, thus lessening the value of our business objects. This turns out not to be a problem as LINQ to SQL is strongly typed, which ensures that you only get your business objects returned. Also, we use lots and lots of lazy loading to allow our web apps to scale well. This fortunately is also supported in LINQ, it's the default behavior in fact, but you can override it if you want. You can be very explicit about what you want LINQ to bring back to you.

If you haven't picked up on it yet, I can tell you that I'm very excited about LINQ at this point. Now all I need is for someone to create a Query Analyzer for LINQ for me and I'm good to go. What's even more interesting about such a tool is the fact that you would be able to target not just SQL but all the supported data stores, although the queries wouldn't be interchangeable.

posted on Thursday, 09 November 2006 19:24:23 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

Ron Jacobs did the last session of the day and while I did get around to finding my pad to take notes, I never did get around to actually taking any. Ron spent the idle time before the session asking the audience what they think about ARCast, the podcast he hosts. While I appreciate the technique of engaging the audience, I really didn't care much for it. It's basically my equivalent of the dream where you walk down a crowded street only to find that you are completely naked. Yes I know, it's all very friendly, or should I say fiendly? :) Maybe I'll be more susceptible next time around.

Ron is a very engaging speaker and he did a very good job to describe the various anti-patterns of SOA. He also mentioned a couple of real patterns for us to use in our architectures. It's nice to see a very authoritative figure on the subject give his opinion. Particularly because it gives me the ammunition in future meetings to pick people off when they suggest some sort of half-assed integration mechanism.

Anti-patterns include:

  • CRUDy interface: Database CRUD methods instead of business processes.
  • Enumeration: stateful service calls that muddy the datasource responsibilities.
  • Chatty interface: You call multiple interfaces to accomplish a single operation. I'd like to add that this is a bad idea under normal circumstances as well. You basically delegate the responsibility to do the right thing to the user of your interface even though you might as well do the work yourself with a single interface. Always try to keep your interfaces as simple as possible.
  • Loosy Goosy AKA the Über Service AKA XmlSerializer Execute(XmlSerializer request): in the quest to create a very flexible interface you end up creating a very tightly coupled interface instead. Because the consumer doesn't know what to send to the interface, he's afraid to change his implementation on the client. This is my personal favorite by the way, as I've seen it once too many times on solutions I've done consulting on.

Patterns include:

  • Well, basically you should think about how transactions were done in the old days before there was such a thing as computers. People transferred documents (messages) between departments (explicit boundaries), and the processing was done in an asynchronous manner (people didn't stand around waiting for the person responsible to process the document). He calls this the Document Processor pattern.

What he failed to mention was how to keep up developer productivity while doing all this kind of stuff. What I've found is that we don't really have a problem doing stuff "the-right-way(tm)", however doing so is very costly. We effectively spend a lot of time writing what I would categorize as infrastructure code. How do we deal with this added complexity? Now, one could argue that we should use the WS-* functionality provided by WSE or WCF. My argument against this is that we don't yet have an RTM version of WCF, WSE is really just a stopgap until WCF ships, and really this is only a partial solution to address the added cost of implementing a service oriented architecture.

posted on Thursday, 09 November 2006 01:35:32 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

My reason for going to this session is that I am looking at a rather large project which will need both Commerce Server 2007 and some sort of Content Management System. I've been looking at SiteCore, Synkron, and of course SharePoint 2007. Now, we've been trying to set up a meeting with Microsoft for... well, it feels like forever, since I needed the info like 1½ months ago. Fate would have us wait until TechEd to get at the info, but boy was the wait worth it.

I basically got all the information I needed and the answers I wanted. Patrick Tisseghem really knows his stuff and it's obvious why he was given a speaker slot at TechEd. I wouldn't want to go up against him in the market, that's for sure :)

The session revolved around showing us the "out-of-the-box" experience of the CMS features of SharePoint 2007. I've done some work on Microsoft Content Management Server 2002 and, as far as the developer experience goes, it pretty much delivers. However, it leaves a bit to be desired on the business user side: you basically need to do a whole lot of Visual Studio development in order to get the business users going.

SharePoint 2007 delivers a very different story. Visual Studio is only required for adding new functionality to SharePoint. You will not need it to do the basic CMS tasks like you did in the past. I was very impressed with what you get out of the box. And I'm proud to say that we as company did "the right thing(tm)" when we went along with SP07 before we had all the facts.

Patrick demoed how we'd go about doing various common tasks in the SharePoint CMS. While everything was available to customize, I do have second thoughts about the usability of the product. You have all kinds of switches to flip, and my concern is that it will be too much for the business users. You need a large number of clicks just to enable the CMS functionality of SharePoint, and once you have enabled it, you need to perform a number of additional tasks in order to display custom information on your site.

With that said, I think that we as developers will be faced with the task of simplifying the job for business users. SharePoint adds everything you'll need to create a CMS site (yes, the kitchen sink is included), so we'll need to do everything in our power to present the available features in a manner that stimulates the users rather than scares them and ultimately alienates them.

posted on Thursday, 09 November 2006 01:04:22 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

First session this morning was C# 3.0 with Anders Hejlsberg. Basically the talk revolved around the features they've added to the language in order to enable LINQ, which I talked about yesterday.

Type inference

  • A new keyword called var
  • Type is inferred from the right side of the expression: var customer = new Customer(); var i = 10;

Anonymous Types

  • Creates a new type based on the data projected to it
  • Unspeakable types; they only work with the var keyword

Extension methods

  • A way of extending existing types with static methods
  • You could for example add a new method to object and have it available on every single object in your application

Lambda expressions

  • A nice way of avoiding having to create delegates and pass them to a method

Object initializers

  • Customer c = new Customer() { Name = "Søren", Age = 28 };

Automatic Properties

  • Not yet included in CTP
  • string Name { get; set; }
  • Need to have both getter and setter.
  • Readonly possible by using { get; private set; }

Expression Tree

  • Data representation of a query instead of IL
  • Better at showing intent than IL
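
To pull the features above together, here's a minimal sketch of the new syntax; the Customer type and the values are my own invention, but the constructs are the ones Anders demoed (assuming the Orcas CTP compiler):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical type for illustration
class Customer
{
    // Automatic properties: the compiler generates the backing fields
    public string Name { get; set; }
    public int Age { get; set; }
}

// Extension methods live in a static class; the 'this' modifier on the
// first parameter makes IsAdult callable as if it were a Customer method
static class CustomerExtensions
{
    public static bool IsAdult(this Customer c) { return c.Age >= 18; }
}

class Program
{
    static void Main()
    {
        // Type inference plus an object initializer
        var customer = new Customer { Name = "Søren", Age = 28 };
        var customers = new List<Customer>
        {
            customer,
            new Customer { Name = "Junior", Age = 10 }
        };

        // Lambda expression instead of an explicit delegate, projecting
        // into an anonymous type (only referable through var)
        var adults = customers.Where(c => c.IsAdult())
                              .Select(c => new { c.Name, c.Age });

        foreach (var a in adults)
            Console.WriteLine(a.Name + " is " + a.Age);
    }
}
```
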


We also got a few tidbits about where the language will go after the Orcas release. Of course he couldn't commit to anything, but he did mention a couple of areas they're looking at: dynamic languages and multi-threaded programming. They are keenly aware of the fact that people have regained interest in dynamic languages, so they are thinking about how to go about incorporating dynamic-like features into the language. Anders mentioned that he would try to adhere to a statically typed environment where you don't need to specify types, but the types are still there for the compiler to use while checking code at compile time and to make various refactorings more effective. It's good to hear that Anders is sticking up for the strongly typed environment.

My two cents on the matter: while dynamic languages are great and very RADy, I just don't buy the argument that they are more viable today because more unit testing occurs. Yes, unit testing has become more prevalent, but the fact of the matter is that you simply cannot do 100% code coverage when you need to get a product out the door. You'd basically need 100% coverage to assure the same level of quality when you don't have the compiler to catch stupid errors like type mismatches. We as developers are lazy; we don't always get around to writing all the necessary unit tests. It's sad, but it's a fact, and so stupid errors will start to occur in our code.

I couldn't get an answer to my Orcas timeframe question yesterday, but today Anders did mention that we are looking at "maybe a year, maybe more, maybe a bit less", to use his own words. Nice to have some idea of when we can expect all the new stuff in final form. Of course, much of what is being demoed here is available in CTP form.

posted on Thursday, 09 November 2006 00:37:05 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

I was looking forward to this session due to some confusion caused by the fact that Microsoft has chosen to create multiple packages for the ASP.NET AJAX offering. The reason for this, as I learned, was to enable a more agile and transparent development process for the team creating the actual AJAX controls. So my understanding of the situation is this: one team creates the ASP.NET AJAX core, and a second team, the agile one, creates controls based on the work the core team performs. The latter is the ASP.NET AJAX Toolkit. They release the toolkit as shared source on CodePlex and the community participates in the development; three Microsoft developers are on the team along with ten community guys, so definitely an interesting effort for sure.

The talk was mostly about explaining the ASP.NET AJAX Toolkit and showing how to go about creating new AJAX controls and behaviors (client-side behavior). Shawn Burke did a good job of remedying my confusion about the various downloads available under the ASP.NET AJAX name. The toolkit basically sits on top of the ASP.NET AJAX Server Extensions and a set of its own base classes, which include both JavaScript and .NET classes. Of course, all of this runs on ASP.NET.

Cross-browser support is provided for Internet Explorer 6 and 7, Firefox, and Safari, with partial support for Opera.

They have a very cool notion of extender controls which, as you may have guessed, extend existing controls. Take for example the AutoComplete extender: think Google Suggest and you have the gist of it. More importantly, you can employ the control without writing a single line of JavaScript. This is the case with many of the controls: you simply add some markup to the definition of the control and you are good to go. This will allow for more rapid deployment of the AJAX technology. I already have a couple of places where I could improve usability dramatically by employing only the server-side controls.
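
As a sketch of what "just add markup" looks like, here's roughly how the Toolkit's AutoCompleteExtender attaches to a plain TextBox; the service path and method name below are placeholders of my own:

```aspx
<asp:TextBox ID="SearchBox" runat="server" />

<%-- The extender adds suggest-as-you-type behavior to SearchBox
     without any hand-written JavaScript --%>
<ajaxToolkit:AutoCompleteExtender ID="SearchSuggest" runat="server"
    TargetControlID="SearchBox"
    ServicePath="~/SuggestService.asmx"
    ServiceMethod="GetSuggestions"
    MinimumPrefixLength="2" />
```

The ajaxToolkit prefix has to be registered (via an @Register directive or web.config), and the web service method simply returns the suggestion strings.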

posted on Thursday, 09 November 2006 00:35:51 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Wednesday, 08 November 2006

... aka Free Beer and Tapas night :) OK, so with the day's sessions over with, we proceeded to the exhibition hall where Microsoft had kindly provided us with beer and tapas along with lots and lots of great technical content from the various exhibitors. I got a nice demo of CodeRush and had them answer some critical questions about the product. I also got a chance to see the JetBrains guys showing off ReSharper and dotTrace, two of my favorite products.

Finally, Carl and Richard hosted Speaker Idol, a contest amongst the attendees in which you can win a speaker slot at next year's TechEd. Afterwards they hosted the 64-bit Question Game Show where some great prizes were given away. Oh, and did I mention the free beer? :)

I also got to see another Microsoft podcaster in the flesh: Ron Jacobs was there doing a live ARCast from the floor.


posted on Wednesday, 08 November 2006 00:22:49 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

There's not much I can add to the slides presented by Bradley Millington other than the fact that I was unable to get him to commit to a time frame for Orcas; I did try though :) There is also some very cool stuff coming in Visual Studio to aid us in developing AJAX-enabled web applications, such as IntelliSense for JScript along with type inference, which lets Visual Studio glean the type of a variable or the return value of a method by analyzing the code. New "hacks" are also added to allow for XML comments in JScript and for referencing .js files from other .js files. I call them hacks because they really are comments formed in a special way which Visual Studio looks for; in essence they only work in VS. It's still nice to have though :)
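
The "hacks" look something like this; the add function, the parameter types, and the referenced file name are examples of my own, but the triple-slash comment shape is what Visual Studio scans for:

```javascript
/// <reference path="Helpers.js" />
// The reference comment above tells Visual Studio to pull in Helpers.js
// when computing IntelliSense for this file; at runtime it's just a comment.

function add(a, b) {
    /// <summary>Adds two numbers.</summary>
    /// <param name="a" type="Number">First operand.</param>
    /// <param name="b" type="Number">Second operand.</param>
    /// <returns type="Number">The sum of a and b.</returns>
    return a + b;
}
```

Note that in JScript the XML doc comments go inside the function body rather than above the declaration.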

Interestingly, the CSS designer from Expression Web Designer will be available in Visual Studio, which basically renders the entire Expression Web Designer SKU moot in my opinion. In any case I'm glad that we'll have the designer in VS, as I was actually thinking that Web Designer would be a product I'd need to install just to get at it. The design surface is codenamed Sapphire by the way; you can always count on Microsoft to come up with some really cool codenames only to have them screwed up by marketing later on :)

Finally we have Blinq, which to me is the single biggest innovation in VS Orcas. If you are unfamiliar with Blinq, it's basically a tool for generating a data-entry web frontend based on the schema of your database. In the future Blinq will be able to generate a partial frontend based on your selection, but for now it generates one for the entire database schema. It will also be able to target objects, to leverage our existing business objects. As you might have guessed from the name, Blinq utilizes LINQ to get at the data. Blinq allows for very rapid prototyping of a web site, kind of like what Ruby on Rails does. It's very nice to see Microsoft taking cues from communities other than the .NET one. Sadly this might mean the end for SubSonic before it even gets off the ground for real. Pity.

All in all a solid presentation. As it was the last presentation of the day, a measure of quality would be my yawn rate, which was way down in spite of the time of day :D

posted on Wednesday, 08 November 2006 00:07:06 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Tuesday, 07 November 2006

Eric Rudder gave a pretty standard pep talk about how we should start using Windows Vista and Office 2007 the moment they are released. Actually, Office should have gone RTM by this time. The keynote did have its effect though; I was very excited to get to the actual sessions afterwards, mostly because all the attendees were gathered for the keynote and I really got a feeling of just how large an event TechEd Europe really is.

I leave you with some pictures of the keynote to enjoy.


posted on Tuesday, 07 November 2006 23:30:10 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

I was very much looking forward to this session as it involves my main area of expertise. Ryan Donovan was presenting, and he pretty much gave a presentation which reminded me very much of the one I do for our customers. With that being said, he did provide some cool insights into the future of the product as well as some clarification on stuff I'd missed previously. Let's face it, Commerce Server 2007 isn't the hottest development platform yet; it certainly doesn't compare well to stuff like SQL Server 2005 or .NET Framework 3.0. So I was left attending a pretty intimate session, which was actually very nice: while the conference center was packed full compared to the pre-conference the day before, attending a session with around 50 people provided an environment where people were engaged in the talk.

Now, we can expect to see a service pack for Commerce Server 2007 in early 2007; you figure out how that correlates to the 2007 version number of the product :) It will focus on providing Windows Vista support for the business tools, it will contain hotfixes, and, quite interestingly, possibly some new features, although we couldn't get him to divulge any more information on that.

I learned a little something about the StarterSite. It would seem that Microsoft is backing it as a production-ready site which should be used as a basis for further development. I guess I'll have to do some reviews of the code to decide for myself; needless to say I'm wary, based on my experiences with the Retail and Retail2002 sites of the previous versions. Also, quite unexpectedly, I learned that the final version of the StarterSite actually includes unit tests. Very cool.

I knew wish lists were a feature new to Commerce Server 2007, but I didn't know what the actual feature behind them was called. Well, now I know: Named Baskets.

We all know that the predictor feature was removed from the product. Interestingly enough, SQL Server 2005 BI contains a similar feature; what happened is that the feature was moved to the SQL Server team, where it evolved into what we have today in SQL Server. A great little tidbit.

Probably the most exciting piece of information for me personally is that Microsoft will be releasing a whitepaper on how to integrate Commerce Server with Office SharePoint Server. What I've come to realize is that Commerce Server is only half the story: in order to provide customers with the complete experience, a CMS is needed to edit and present the "soft" data such as layouts, images, and so forth. This is what we'll have a whitepaper on in early 2007.

I was very glad to hear the answer to the question, "what about pipelines?". To that he simply replied that this will be the last version which includes pipelines, as they are working on a new architecture for that part of the product.

Microsoft does not have any plans to release web-based business tools for, say, SharePoint.

Another very interesting piece of information is that they are doing work on enabling Commerce Server as a platform in much smaller-scale scenarios, which is great for Denmark where not too many large-scale opportunities exist. It'll probably revolve around some licensing magic to get that done, but that's just me speculating at this point. No further information on it was given.

Some interesting tidbits of information, but I wish that Ryan would have gone into a bit more detail in his presentation. More real-world examples would also have been very nice. Unfortunately, he was plagued with technical difficulties during the presentation, which didn't help at all, although he was very good at handling them. It was hilarious when his demo PC actually rebooted because Windows Update had downloaded updates and was installing them automatically :)

posted on Tuesday, 07 November 2006 23:19:43 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

This session seemed like a good idea at the time when I was looking over the slots, but in actuality it turned out to be quite the disappointment. It would seem that the AJAX part was included only to lure people in. Brad Adams did the session, which was on cooperation between designers and developers, a topic relevant to most of us doing web applications. Unfortunately, the session turned out to mostly be a demo of Expression Web Designer, where Brad Adams did a draggy-droppy demo of the product. People who know me know how I abhor drag and drop :)

Expression Web Designer is what I would call a light version of Visual Web Developer Express targeted at the design crowd out there, and my humble opinion is that it fails miserably as such. What it's got going for it is a very good design experience for CSS, but that's it. The premise of the product is to cut out the step in the design process where designers create image mockups of a web site and hand them over to the developers, who then create the actual CSS and HTML layouts. The problem is that you lose fidelity in the process, and while Microsoft is right about that, I don't believe they are going about solving the problem in the right fashion. They want the designers to deliver CSS and HTML layouts complete with ASP.NET controls embedded, but I just don't see that happening. As far as I can tell, graphical designers need to target as wide an audience as possible, and they just don't do that by learning to use a tool which creates a product that basically only caters to ASP.NET developers. That's one problem; the second is that the designers didn't become designers to muck around with ASP.NET controls and code. In my eyes they want to be able to set up color palettes, images, and other graphical elements. They simply don't care for the nitty gritty of a framework like ASP.NET ... and why should they?

Now back to the demo. I mentioned that we were given a demo of Expression Web Designer. To that I just want to add that Brad Adams should know his audience; the conference is called TechEd Developers for Christ's sake, not TechEd Designers. People don't want to see him futzing around with margins, padding, divs, and so forth. We're developers, give us some code! As a result people started to ooze away around the middle of the presentation, leaving a jam-packed room only two-thirds full.

He did actually get around to showing some code, demonstrating what the developer does with the product the designer delivers. A very cool thing he showed was how to AJAX-enable an ASP.NET site in a very simple manner by using the UpdatePanel control, which is part of the ASP.NET AJAX framework. While that was very cool, it just took up too little of the session. We also got to see a little bit of the ASP.NET AJAX Toolkit, which contains some very cool controls, both for extending existing controls and all-new ones.
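
For reference, AJAX-enabling a page with UpdatePanel is about this much markup; the label, button, and handler name are placeholders of my own:

```aspx
<asp:ScriptManager ID="ScriptManager1" runat="server" />

<%-- Postbacks originating inside the panel become async partial updates --%>
<asp:UpdatePanel ID="UpdatePanel1" runat="server">
    <ContentTemplate>
        <asp:Label ID="TimeLabel" runat="server" />
        <asp:Button ID="RefreshButton" runat="server"
                    Text="Refresh" OnClick="RefreshButton_Click" />
    </ContentTemplate>
</asp:UpdatePanel>
```

In the code-behind, RefreshButton_Click just sets TimeLabel.Text; no JavaScript is written by hand.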

So to Brad Adams I give two thumbs down for the session, and a friendly reminder: Know thy audience.

posted on Tuesday, 07 November 2006 22:56:35 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

I just had to see Anders Hejlsberg do a presentation; the fact that it was on LINQ wasn't too shabby either. I was excited about the technology in the first place, but having actually seen Anders do LINQ queries against various data sources really has me firing on all synapses :)

LINQ, or Language INtegrated Query, is basically a strongly typed way of querying data directly from a .NET language like C# or VB. LINQ consists of three providers: LINQ to Objects, LINQ to SQL, and finally LINQ to XML. If you have ever had to write a program which needs to extract data from an XML file, you know how much of a pain that is, which paves the way for LINQ to XML. LINQ to SQL simplifies access to relational data and also provides mechanisms to do O/R mapping. One question on my mind with regard to this feature was support for stored procedures, and LINQ to SQL does indeed support them, which to me is very good news. LINQ to Objects is the general-purpose tool of the bunch: it allows you to run queries against in-memory objects on any collection which implements IEnumerable, which basically means every single collection in the .NET Framework. Anders did a couple of demos on this, first querying diagnostic data on his machine listing every running process, later querying reflection info for the String class and doing a couple of group bys. Pretty neat to see how many overloads of the CompareTo() method exist. Think of the possibilities.
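
A sketch of the reflection demo as I understood it — my own reconstruction, not Anders' actual code:

```csharp
using System;
using System.Linq;
using System.Reflection;

class Program
{
    static void Main()
    {
        // Group the public methods of String by name and count the
        // overloads, largest groups first
        var overloads =
            from m in typeof(string).GetMethods()
            group m by m.Name into g
            orderby g.Count() descending
            select new { Method = g.Key, Count = g.Count() };

        foreach (var o in overloads.Take(5))
            Console.WriteLine(o.Method + ": " + o.Count + " overload(s)");
    }
}
```
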

Now what is even more exciting is that the technology is domain agnostic, basically enabling you to do a single query across multiple disparate data sources; e.g. you need to cross-reference some data from Active Directory with employee data from a database. Today it takes a pretty significant amount of code to do so; tomorrow, with LINQ, you will actually be able to do the correlation with a single query, producing a strongly typed result set which you can then perform further queries against if you so desire.
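
To make the cross-source idea concrete, here's a LINQ to Objects stand-in where two in-memory lists play the role of the directory and the database; all the names and data are invented for illustration:

```csharp
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        // Stand-ins for two disparate sources: imagine these came from
        // Active Directory and a database respectively
        var directoryUsers = new[]
        {
            new { Login = "sk", FullName = "Søren K" },
            new { Login = "ab", FullName = "Anne B" }
        };
        var employees = new[]
        {
            new { Login = "sk", Department = "Development" }
        };

        // One query correlates both sources into a strongly typed result
        var result =
            from u in directoryUsers
            join e in employees on u.Login equals e.Login
            select new { u.FullName, e.Department };

        foreach (var r in result)
            Console.WriteLine(r.FullName + " works in " + r.Department);
    }
}
```
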

People might ask, "Why do we need to reinvent SQL?". The answer is that LINQ is so much more than SQL, as I've described above. You cannot query across disparate data sources with SQL, you cannot query objects, you cannot do O/R mapping. It's pretty obvious that LINQ is here to stay and that we need to invest in the technology, simply because it makes development more intuitive and more effective. Combine LINQ with Blinq and we're in serious business, but that's a story for another post :)

posted on Tuesday, 07 November 2006 22:33:07 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

I don't know about you but this is just uncanny to me:

posted on Tuesday, 07 November 2006 22:31:52 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

When eating out in Barcelona, be sure NOT to go to a place called EL GLOP.

or you just might end up like this.

posted on Tuesday, 07 November 2006 08:00:53 (Romance Standard Time, UTC+01:00)  #    Comments [2] Trackback
# Monday, 06 November 2006

So ... first day at my first TechEd. It's been quite the ride already. As you've probably already seen, I've gushed about seeing Mr. Franklin and Mr. Hanselman of DotNetRocks and Hanselminutes fame.

More importantly I actually decided on a session, albeit 30 seconds before it started :) I decided to watch Roy Osherove's Agile Development with or without Team System as methodology is near and dear to my heart these days.

As I said, this is my first TechEd, and as such I had no idea what to expect. I've been to my fair share of MSDN Events, and while they were decent, Roy blows them all out of the water. He knows his stuff and it shows. He gave lots of pointers on generic stuff surrounding agile development and, more importantly, backed it up with actual personal experience. Instant authority. Mixing in some actual tools to get people started is great, especially for a person like myself who sees a tool for every problem :D

I could tell that he's very passionate about the subject. Sadly, his real enthusiasm only shone through a couple of times. Don't get me wrong, the session was great, but I can't help but feel that he was holding back a little for some reason. Maybe it's just his style.

Anyway, great presentation; I'm certainly hungry for more. The hours went by without me noticing, a good sign :) Tomorrow will for my part bring sessions on Commerce Server 2007, AJAX, a DotNetRocks panel discussion, and finally C# Language Innovations (with Anders Hejlsberg, no less). I may not be able to sleep tonight :)

Finally, here are some pictures of Sune (yes, Microsoft actually stole his name and made an MP3 player out of it; I guess Sune will be hearing from Microsoft Legal in the near future) and me lounging outside the conference center. Notice the ridiculously large badge around Sune's neck. Also included are a couple of pictures from the session.

posted on Monday, 06 November 2006 23:02:53 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

OK, it's official. I'm in nerd nirvana here at TechEd. I've just attended a session on Continuous Integration by Roy Osherove, and right now I'm sitting across from Carl Franklin and Scott Hanselman doing a Hanselminutes. I tell you, it doesn't get better than this :)

posted on Monday, 06 November 2006 19:06:00 (Romance Standard Time, UTC+01:00)  #    Comments [1] Trackback

I arrived in Barcelona yesterday and have been without net access for almost 24 hours. Yikes! :) Now, however, I've arrived at the conference center with my colleague, where Microsoft kindly provides net access for the pre-conference attendees. We're currently waiting for the pre-conference sessions to start, and I'm still undecided whether to see the AJAX session or the Agile Development with Team System session.

Stay tuned for "Tough Choices in Barcelona". Coming up next :)

posted on Monday, 06 November 2006 11:35:31 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback
# Friday, 03 November 2006

So I've wanted to try out Windows Live Writer for a while now, but of course DasBlog 1.8 doesn't work with it. The other night I finally got around to updating my blog to version 1.9, which should have support for Windows Live Writer. If you are reading this, it works :)

Hey, it actually does work, and it includes a thumbnail feature as well. It even configures itself with DasBlog so I'm able to upload images. I've used BlogJet up to this point, but I actually think Live Writer is turning out to be a better, more polished product. And it's free too :)

posted on Friday, 03 November 2006 13:41:47 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback

A colleague (thanks, Troels) sent me this link this morning. What a great way of starting out the day! It gets even more interesting when I consider the fact that we share a building with a telemarketing company. Should I go downstairs and hand them the link?

Without further ado here’s How to Prank a Telemarketer. As an added bonus here’s Bush singing Bloody Sunday.

And here are some more prank calls for you: http://www.eugenemirman.com/showandtell.html


posted on Friday, 03 November 2006 08:39:19 (Romance Standard Time, UTC+01:00)  #    Comments [0] Trackback