Thursday, 24 December 2009

(!) All conflicts resolved but no files checked in due to initial conflicts

A minor annoyance with VS2008 SP1 and TFS 2008 SP1 today when attempting a check in. Here’s the stack trace:

  • Delete a folder and a bunch of images from VS Solution Explorer yesterday (right-click, Delete)
  • Check in the deletion and explicitly check for the delete action for the relevant files during the check in process
  • Arrive this morning to do a final cleanup and make sure everything in the project was checked in before Christmas… hey, all those images are still wanting to be checked in!
  • Attempt check in from Solution Explorer (specify comment, work item as per our policy configuration); Resolve Conflicts dialog pops up with the message “(!) All conflicts resolved but no files checked in due to initial conflicts”. Try again… believing, from memory, I’d seen this problem before and a second check in attempt had solved it. Same deal.
  • Resume Conflict Resolution from File –> Source Control. Nothing found, “All conflicts resolved”
  • Attempt check in from Source Control Explorer window in VS; same deal.
  • Drop into the VS 2008 Command Prompt and navigate to the solution; tf checkin and tf resolve produce the same results as VS. tf checkin tells me “Conflict: The latest version of item $/… is deleted.”
  • Try tf resolve . /r /auto:AcceptTheirs but the command tells me “There are no conflicts to resolve”
  • Confirm folder and images are definitely not present in either the project (via Solution Explorer) or the source control tree (via Source Control Explorer).
  • Get Specific Version with ‘Overwrite all files even if the local version matches the specified version’ option checked; no change to the project or file system; same error message on check in.
  • Undo Pending Changes… from Solution Explorer context menu. Success! Now I have nothing left to check in (any other files not involved in the deletion process were checked in separately) and TFS is tangle-free yet again.

Installing Live Mesh Beta on Windows Server 2008 R2

It’s never easy… the downloader doesn’t seem to work from the web or somesuch, or at the very least not with Windows Server 2008 R2; running locally spits out an error about local policy not allowing the installer to run.

To work around this, add a new key named Installer below


and then create a DWORD named DisableMSI. Set the value to 0.
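For reference, here’s the same change as a .reg file. This assumes the standard Windows Installer policy key (HKLM\SOFTWARE\Policies\Microsoft\Windows\Installer), which is where the DisableMSI policy value normally lives:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\Installer]
"DisableMSI"=dword:00000000
```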

In my case I’d already downloaded the installer so I ran it locally, no change to UAC, no reboot, no –force switch on the installer. I’m meshed.

Ps. I deleted the key after install but it’s probably not necessary to do so.

errorData during SharePoint 2010 Products and Technologies Configuration Wizard

When specifying database settings for my new SharePoint 2010 beta farm, I came up against an error popup I’ve never seen before under 2007:

An error has occurred while validating the configuration settings.  An exception of type System.ArgumentNullException was thrown.  Additional exception information:  The errorData argument cannot be null or zero length.

Parameter name: errorData

M’kay. What does that mean?!? I wonder to myself.

Checking the application log revealed a critical error that set me straight:

SQL database login for 'master' on instance 'SP2010BETA' failed. Additional error information from SQL Server is included below.

Login failed for user 'SPDEV\Administrator'.

I’m attempting a least-privilege install and was still logged in as the administrator instead of the setup user (criminal); I’d configured the setup account as an admin during the SQL Server 2008 install, but not the administrator account I was using.

Logging in as the setup account put me back on track. Of course, by the sound of it, I shouldn’t be doing least privilege in 2010 anyway.

Dirty Words (Michael Hanes)

Monday, 21 December 2009

SharePoint Saturday Perth Announced

The very first SharePoint Saturday event is coming up in the new year (early February). Check out the details here and get your speaker submissions in by 4 Jan:

See you there!

Wednesday, 16 December 2009

Flushing the Blob Cache

Sean McDonough has a really comprehensive post on several options for flushing the MOSS 2007 blob cache. Worth a read if you’re in need:

Inner exception details displayed instead of wrapper exception

Wrapping an exception using the Exception (string, Exception) constructor and throwing the new exception in place of the original isn’t behaving as expected for me today: ASP.NET 3.5 displays the inner exception message in its default exception display; I expected to see the wrapper exception message. Oddly enough, details pertaining to the wrapper are logged to the Application event log.

So this:

try
{
    // Set up to make something go wrong
    throw new Exception("Inner exception");
}
catch (Exception ex)
{
    // Wrap the inner exception and throw
    // a new exception instead
    throw new Exception("wrapper exception", ex);
}

spits out this (note both the message and source error clearly pointing at the inner exception line):

Inner exception

Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.

Exception Details: System.Exception: Inner exception

Source Error:

Line 18: try
Line 19: {
Line 20: throw new Exception ("Inner exception");
Line 21: }
Line 22: catch (Exception ex)

Despite what the message and highlighted code reveal, the stack trace does imply the wrapper exception was involved along the way.

[Exception: Inner exception]
_Default.Page_Load(Object sender, EventArgs e) in Default.aspx.cs:20

[Exception: wrapper exception]
_Default.Page_Load(Object sender, EventArgs e) in Default.aspx.cs:24 …

Adding a second try/catch to catch any exception of type Exception and inspecting the results locates the inner exception property where I’d expect it to be. Alternatively, “swallowing” the inner exception in the original catch block and throwing a new exception using the Exception (string) constructor does just that—no sign of the inner exception. I’ve additionally tried catching different types of exception (I was originally throwing ApplicationExceptions and catching Exceptions) and constructing my own exception classes, with the same effect.
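For what it’s worth, the behaviour looks consistent with the error page surfacing the innermost exception (as GetBaseException() would), rather than the wrapper being lost. A minimal console version of the same structure shows the wrapper stays fully intact:

```csharp
using System;

class WrapDemo
{
    static void Main()
    {
        try
        {
            try
            {
                // Simulate the original failure
                throw new Exception("Inner exception");
            }
            catch (Exception ex)
            {
                // Wrap and rethrow; the original rides along as InnerException
                throw new Exception("wrapper exception", ex);
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.Message);                    // wrapper exception
            Console.WriteLine(ex.InnerException.Message);     // Inner exception
            Console.WriteLine(ex.GetBaseException().Message); // Inner exception
        }
    }
}
```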

Maybe I’m missing something but this seems like pretty basic stuff and a few other vaguely related search results hint that I’m not alone with this one. Of course, I can always handle all exceptions myself to prevent anything killing the application but the current approach is convenient at the moment for debugging outside production where it’s less convenient to attach a debugger or check the event logs.

If you’ve got any ideas or have come across this yourself, I’d love to hear your thoughts!

Sunday, 6 December 2009

The Keywords Meta Tag Doesn’t Matter

Ever wondered if the major search engines still consider the keywords meta tag? Well most don’t and haven’t for a very long time. Here’s the latest from the search engine driving the majority of search-based traffic to your sites:

Manual keyword maintenance, automated keyword builders, and so on: goodbye and good riddance!

Wednesday, 2 December 2009

TFS Screen Captures with Snagit

If you’ve been working with TFS and TFS Web Access (Team Plain) you’re likely aware the product is graceless about screen captures—a critical part of describing many types of bug.

With products like JIRA you can simply PrtScn and paste the resulting screen grab directly into their web-based interface; TFS offers nothing of the sort (maybe in 2010?) and you must use a third party tool to save out your screen shot and upload it to the work item as a file attachment.

I’ve seen a few attempts to write controls to make the necessary magic happen within the Visual Studio environment but that doesn’t cut it where the majority of TFS users interact with the system through a browser. I have yet to find a solution for the browser end but in my travels today came across a free Snagit output from TechSmith that will attach a Snagit screen shot to a work item for users with Team System installed.

As an extra output, you simply Finish your capture to the Team System output and a new work item window appears so you can fill in the basics. It’s not web-based but it works quite well, dropping the screenshot under the File Attachments tab, and it’s free if you’ve already got Snagit installed (which isn’t).

Failure serving a file with a percentage character in the file name

An unexpected 404 error cropped up today while attempting to serve an image file with a percentage character in the file name. Windows has no issues with percentages but apparently IIS or something else in the pipeline refused to serve this file. Interestingly, the '%' character URL-encodes as '%25'... go figure.
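A quick way to see the encoding at work—nothing IIS-specific here, just the .NET encoder—is below. A raw '%' in a requested URL starts an escape sequence, so a file name containing a literal '%' has to go out as '%25', which is presumably what upsets something in the pipeline when it doesn’t:

```csharp
using System;

class PercentDemo
{
    static void Main()
    {
        // A '%' in a file name must itself be percent-encoded in a URL,
        // so "sale 50%.jpg" becomes "sale%2050%25.jpg" on the wire
        Console.WriteLine(Uri.EscapeDataString("sale 50%.jpg")); // sale%2050%25.jpg
    }
}
```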

Friday, 13 November 2009

CMYK .jpg images don’t render in IE and Firefox

For the second time in recent memory I was today faced with a "broken" image in IE 8 and Firefox 2.x due to the image being saved using the CMYK colour mode instead of RGB. Interestingly, Chrome was quite happy to display the image as it was; I had to open it in Photoshop, change the mode to RGB, and save it back for the other browsers to respond. Apparently saving for web does the same thing.

Here’s the image in CMYK:

And here it is again in RGB:


My Network Places in Windows Server 2008

Where has My Network Places gone in Windows Server 2008? I'm not quite sure but you can add a new network place by right-clicking on Computer in Windows Explorer and selecting Add a Network Location from the context menu. For example, connect to a picture library in a SharePoint content database using this address: http://my_site:30000/Lists/MyPicturesList

Add a Network Location in Windows Server 2008

After stepping through the wizard, the network location will appear alongside your other mapped drives, etc.


Wednesday, 11 November 2009

After Event Receiver Doesn't Fire

Here we go again... note to self: because "after" events like ItemAdded and ItemUpdated fire asynchronously, testing the receiver wiring by doing something with a side effect (like throwing an exception) won't have any visible result within the context of the site (do check the event log though!).

Throwing within a before event like ItemAdding will bring the operation to a screeching halt.
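A minimal sketch of both ends (the class name, list, and message strings are made up for illustration): cancelling in the synchronous "before" event is visible to the user, while anything thrown in the asynchronous "after" event dies quietly.

```csharp
using Microsoft.SharePoint;

// Hypothetical receiver; names are illustrative
public class MyListReceiver : SPItemEventReceiver
{
    // Synchronous "before" event: cancelling here halts the operation
    // and the error message is shown in the browser
    public override void ItemAdding(SPItemEventProperties properties)
    {
        properties.Cancel = true;
        properties.ErrorMessage = "Receiver is wired up.";
    }

    // Asynchronous "after" event: an exception here has no visible result
    // in the site, so log something instead to prove the receiver fired
    public override void ItemAdded(SPItemEventProperties properties)
    {
        // e.g. System.Diagnostics.EventLog.WriteEntry("MyListReceiver", "ItemAdded fired");
    }
}
```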

Tuesday, 10 November 2009

Adding the Edit Control Block (ECB) to an additional/different list column

This is how I add the ECB menu to a different column. All changes are made within the context of my list's Schema.xml file.

To start, I already have a custom text field defined within the Fields element:


To this I add a new computed field that references the existing field:

<!-- Edit Control Block (ECB) Context Menu -->
<Field Type="Computed"
       Name="CaptionContextMenu"
       DisplayName="Caption"
       DisplayNameSrcField="Caption"
       AuthoringInfo="(with menu)">
  <FieldRefs>
    <!-- First FieldRef points to parent field -->
    <FieldRef ID="{1ac83cea-5b25-4e2f-ae43-5116e005ca97}" Name="Caption" />
    <FieldRef ID="{3c6303be-e21f-4366-80d7-d6d0a3b22c7a}" Name="_EditMenuTableStart" />
    <FieldRef ID="{2ea78cef-1bf9-4019-960a-02c41636cb47}" Name="_EditMenuTableEnd" />
  </FieldRefs>
  <DisplayPattern>
    <Field Name="_EditMenuTableStart" />
    <HTML><![CDATA[<a onfocus="OnLink(this)" href="]]></HTML>
    <URL />
    <HTML><![CDATA[" ONCLICK="GoToLink(this);return false;" target="_self">]]></HTML>
    <!-- Points to parent field -->
    <Field Name="Caption" />
    <HTML><![CDATA[</a>]]></HTML>
    <Field Name="_EditMenuTableEnd" />
  </DisplayPattern>
</Field>

Note DisplayNameSrcField points to the original field, as does the <Field> element in the DisplayPattern section.

Finally in my view's ViewFields element, I modify the FieldRef pointing to my original field to point to my ECB column:

<FieldRef Name="CaptionContextMenu" />

A few things to note:

  • When I first tried this with the original field set as type HTML, it didn't work; I ended up changing the field to a text field. I haven't tried other field types.
  • /_layouts/sitemanager.aspx presents the view in a similar but slightly different way to that presented by the list itself (e.g. /Lists/MyList/AllItems.aspx); Site Manager tends to always add the ECB on the second column whereas the latter tends to do a better job of doing what you tell it to do. [Update: I use the DocIcon field in position #1 to work around this issue... it shows up as a little document icon with a column name of Type--which I haven't been able to hide. It's clickable in the AllItems.aspx-style view at least so not completely useless, and it obviously shoves the ECB to a potentially meaningful field in the second position (you can alternatively use ID but this may not make sense if you're ordering list items).]

Monday, 9 November 2009

ContentTypeRef vs ContentTypeBinding

There seems to be some uncertainty around the use of the ContentTypeRef element in Schema.xml and the ContentTypeBinding element in your elements file. The master—Andrew Connell—indicates both should be used but various discussions (see links below) suggest

you can use the ContentTypeRef to create a list with a content type, or you can use a ContentTypeBinding to add it later.

I personally found this discussion somewhat hard to follow and Andrew's post just says DO IT without any additional explanation; this post documents my own findings around the why.

While I always use ContentTypeRef in my Schema.xml files because my custom lists are backed by a custom content type, I hadn't come across ContentTypeBinding in a list context until recently—or if I had, I assumed it was unnecessary since everything just works without it. My understanding of content types leads me to believe each list has its own internal content type or at the very least, its own set of fields (the latter is certainly evident in the duplication of fields in Schema.xml and the file used to provision custom fields).

All's well, or so I thought until I had to programmatically enumerate a list item's fields: I noticed my custom fields were listed twice. Creating a new list from the Custom List template and assigning my custom content type did not result in this duplication so I knew something was up with my custom list definition. For some reason, the problem was not evident when inspected using my best friend SharePoint Manager 2007.

As expected, removing the fields included in Schema.xml that are duplicated in my custom content type/fields broke the list.

It seems telling Schema.xml about my custom content type in the ContentTypeRef doesn't cut it—the ContentTypeBinding element is required to effectively map the list fields against the fields referenced in the content type. After adding a ContentTypeBinding element and enumerating a new list instance, the duplicates are gone.

Like Andrew illustrates, I add the ContentTypeBinding element to my ListDefinition.xml file between the ListTemplate and ListInstance elements (your file may be named differently):

<ContentTypeBinding ContentTypeId="0x0100CB568E1363E18245810A4EF25B057CCE" ListUrl="Lists/MyList" />
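For context, the surrounding elements file then looks something like this. This is a sketch only: the template attributes are trimmed down, the list name is borrowed from the snippet above, and the template type is the one from my feature (yours will differ):

```xml
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <ListTemplate Name="MyList" DisplayName="My List"
                Type="13133" BaseType="0" />
  <ContentTypeBinding ContentTypeId="0x0100CB568E1363E18245810A4EF25B057CCE"
                      ListUrl="Lists/MyList" />
  <ListInstance Id="1" TemplateType="13133"
                Title="My List" Url="Lists/MyList" />
</Elements>
```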

All's well once again... until next time!

Ps. Andy Burns asks himself:

I wonder what happens if you try to create a list definition without a content type reference?

Andy, you can’t leave me hanging like that man!!! I dropped the ContentTypeRef element from my Schema.xml file and while no errors were reported and a list instance only reported using the custom content type specified in ContentTypeBinding, NewForm.aspx was a dog’s breakfast with some aspects of the content type ignored. For example, I’ll often hide the Title field in my list definitions and yet there it was (sure, ShowInNewForm/ShowInEditForm might address this but that’s not the point). Fields also weren’t ordered the way they were in the content type definition.


Thursday, 5 November 2009

Sorting and filtering doesn't work on custom HTML list field

After deploying a custom list today I noticed, much to my annoyance, list items were not respecting the OrderBy FieldRef specified in my view and sorting the list manually in either direction on the field in question had no effect. Worse, the Show Filter Choices menu item led to the display of a #Render Failed error message in the central pane and the Application event log and SharePoint logs reported the following:

Unknown SQL Exception 306 occured. Additional error information from SQL Server is included below.  The text, ntext, and image data types cannot be compared or sorted, except when using IS NULL or LIKE operator.  The ntext data type cannot be selected as DISTINCT because it is not comparable.

The #Render Failed issue seems to be a fairly well-known bug and has apparently been fixed in the June 2009 WSS Cumulative Update, although it's unclear whether that problem is related to my general sort/filter problem.

The problem field did have an edit control block attached in my case and while I expected that to be the culprit, changing the field type from HTML to text resolved the problem for me. Not my preferred approach but acceptable in this particular case.

Wednesday, 4 November 2009

ListTemplate Name attribute doesn't resolve $Resources

Further to my previous 0x81070215 post, I've since discovered this error code also crops up when specifying an invalid ListTemplate value for the Name attribute.

In my case I was attempting to supply the value from a .resx file, something like Name="$Resources:List_Name;". While resources are successfully resolved on other ListTemplate and List attributes, this doesn't seem to be the case with Name, whose value must be specified explicitly; incidentally, the Name must match the name of the folder containing the list definition. The OOB 12-hive examples I've cross-checked also specify a literal value.
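In other words, something like this works: Name as a literal matching the list definition folder, resources everywhere else (the folder name and resource keys here are illustrative):

```xml
<!-- Name must be a literal and must match the folder containing Schema.xml -->
<ListTemplate
    Name="CorporateBannerList"
    DisplayName="$Resources:List_DisplayName;"
    Description="$Resources:List_Description;"
    Type="13133"
    BaseType="0" />
```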

While this error code was presented in the UI, the SharePoint log files revealed how SharePoint was parsing the attribute value:

Cannot open "schema.xml": no such file or folder.
Failed to retrieve the list schema for feature {1B3FC94C-2FC6-4528-B968-4E91843C2005}, list template 13133; expected to find it at: "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\12\Template\Features\TeMCorporateBannerListFeature\$Resources:List_Name;".
Unknown SPRequest error occurred. More information: 0x81070215

No biggie since DisplayName does handle resources for localisation; however, in this case I'm attempting to use resource files for consistency and to simplify maintenance across my list definition.

For reference, 0x81070201 seems to mean about the same thing (a similar error message I received after reverting the Name value to a literal with a typo).

Friday, 30 October 2009

0x81070215 Means Dodgy List

After some more hacking around with a custom list SharePoint gave me this error code: 0x81070215

Basically I’d mangled a list definition while an earlier version of the list still existed. The easy solution in my case (dev) was to simply fire up SharePoint Manager 2007 and use the UI to delete the offending list (stsadm –o forcedeletelist –url http://spsite would have probably done the same thing). In a more serious environment, you’d probably need to redeploy the list bits as they were when the list was created.

Control template "ListForm" does not exist

I have no idea how I caused ListForm not to exist but after some clumsy tinkering with a custom schema.xml file, the main section of the NewForm.aspx page was coming up blank and the AllItems.aspx page for all lists in all sites across my dev farm refused to render the normal menu.

The SharePoint log file was telling me:

Control template "ListForm" does not exist

Examining a list’s NewForm.aspx file in SharePoint Designer (which of course makes me feel dirty all over) you’ll note it holds a ListFormWebpart. “Control template” would further point me to the _controltemplates virtual directory—mapped to \12\TEMPLATE\CONTROLTEMPLATES, but this directory doesn’t contain a ListForm.ascx or ListFormWebpart.ascx file.

Thankfully the Application event log entry was more precise:

Load control template file /_controltemplates/DefaultTemplates.ascx failed: The file '/_controltemplates/DefaultTemplates.ascx' does not exist.

A quick check of \12\TEMPLATE\CONTROLTEMPLATES and of course DefaultTemplates.ascx was nowhere to be found. Copy/paste from a functional server and we’re back in business. Now to figure out what caused it to be deleted in the first place…

Wednesday, 28 October 2009

Activating a feature containing a content type generated from Andrew Connell's stsadm commands generates 100% CPU load

I've used AC's WCM Custom Commands for STSADM successfully in the past and they're extremely handy. In short, I create my site columns and content types in the SharePoint UI and then run Andrew's GenSiteColumnsXml and GenContentTypesXml to dump the XML required for incorporation in a feature. Today I hit a wee snag when attempting to activate a feature containing fields and content types generated from these commands: the operation wouldn't complete until terminated forcefully and the CPU was meanwhile running at 100% (infinite loop anyone?).

Luckily JoeB came across the same problem in the past and was kind enough to post his fix in the comments on Andrew's blog (although I also noticed another commenter noted the issue in production after implementing JoeB's fix...). Andrew says these commands were built using the 80-20 rule: they'll get you 80% of the way there and you're on your own for the rest. Fair enough—the commands are extremely handy and I reckon they go well beyond the 80% mark. That said, these are some of the things I fix up after generating the output; I'll list JoeB's fix here too:


Site Columns

This makes updating the feature easier:

  • Add the DisplaceOnUpgrade="TRUE" attribute

I've read these aren't meant to be included explicitly but the documentation, useful as it is, indicates they're optional anyway:

  • Remove the SourceId attribute
  • Remove the StaticName attribute

These can go without any consequence I've noticed:

  • Remove the Version attribute
  • Remove the Required="FALSE" attribute

These seem to come out if you muck around with the fields too much in the UI:

  • Remove the PITarget attribute
  • Remove the PrimaryPITarget attribute
  • Remove the PIAttribute
  • Remove the PrimaryPIAttribute
  • Remove the Aggregation attribute
  • Remove the Node attribute
  • Remove the ContentType field altogether (SharePoint Manager 2007 indicates it's added automatically even if not specified)

Content Types

JoeB's fix to the 100% CPU issue:

  • Add NamespaceURI="" to the XmlDocument element

(Note you may also be able to remove the XmlDocuments node altogether).

Friday, 23 October 2009

No SharePoint MVPs in Western Australia

This is an unfortunate situation… there are so many top-notch SharePoint guys (and gals) working proactively in the community and doing great work in WA with major clients. That said, with only five SharePoint MVPs in Australia I’d say we’re under-represented as a country!

You can nominate yourself or someone else here:

Western Australia SharePoint MVPs

Tuesday, 20 October 2009

Running ISO Recorder on Windows Server 2008 x64

When trying to copy a DVD to an image using ISO Recorder v3.1 on a Windows Server 2008 x64 machine, the Next and Finish buttons were greyed out despite correctly selecting the From and To locations. As I have to run Nero Express Essentials as an admin (right-click, Run as administrator), I tried setting the ShellExec.exe file below C:\Program Files\Alex Feinman\ISO Recorder to run as admin on the Compatibility tab to accomplish the same thing. When that didn’t work, I found I had to click Show settings for all users and select Run this program as an administrator on the second window (Compatibility for all users).

This doesn't seem to work for all functions; for example, attempting to burn the created ISO to disk fails to populate the Recorder drop down with the local optical drive and I can't complete the operation.

In general I must say ISO Recorder looks like it's going to do the job I want it for quite nicely. I created an image from a DVD by right-clicking on the DVD drive icon in Windows Explorer and it popped a 1.7GB .iso out the other end in a few minutes. I've burned ISOs with Nero in the past but can't seem to find the right settings on the version I'm using currently (I'm lumped with .nrg files). ISO Recorder hits the nail on the head so I'm sticking with it ;-)

Tuesday, 13 October 2009

Approval workflows: approval by any approver

Configuring an approval workflow to request approval from a group of approvers and deem the workflow complete after any single person has completed their approval task is trivial but it's not that obvious. I certainly missed it on my first pass through today!

The Assign tasks to radio button options on the workflow customisation page focuses on the parallel versus serial distinction. If you've got multiple people configured as approvers, you're either assigning tasks to everyone in the list or you're assigning tasks sequentially to everyone in the list; in both cases every approver has to complete their task before the workflow is deemed complete.

In the past, we've noticed a behaviour where approving a page doesn't seem to do anything—the page status stubbornly remains as Waiting for approval no matter how many times or how many different ways you attempt to approve the page, and the fact that you're a site collection administrator, web application administrator, and farm administrator makes no difference! (Without email configured on the server, and perhaps some SSP bits in place, as was the situation in our case, no one will ever know additional input is required unless you're checking the workflow status.) In all likelihood, the root cause of this "problem" was an approval pending completion by virtue of being assigned to twenty people. Doh!

But I digress. Adding approvers to a dedicated Approvers group is the logical thing to do in most cases to simplify approver management. Instead of changing multiple workflows as employees come and go, it's much easier to manage a SharePoint group or an AD group. Selecting the approvers list is easy but how do you get the "any approver" bit happening?

The answer lies in the check box directly below the Approvers selection input field: Assign a single task to each group entered (Do not expand groups). Think about it long enough and it actually starts to make sense—only one approver needs to complete their task before the workflow is deemed complete, or to paraphrase: “assign the task to the Approvers group and no one person in particular.” This option isn't selected by default so select it and you're golden.

SharePoint Approval Workflow - Any Approver

Wednesday, 30 September 2009

Web site performance monitoring tools

Having gone through the pains of learning about web site performance and applying that understanding to a MOSS site, the latest post on the SharePoint Team Blog about optimisation caught my eye. I’ve been meaning to cover the performance nitty gritty from our experience for ages now but, for the moment, suffice it to say we fixed most of the problems faced by our global audience and now use a commercial performance monitoring service called Gomez to keep an eye on things. Gomez competes with the likes of Keynote if you’re in the market.

Commercial solutions like this cost a lot of money because they’re essentially distributing test agents all around the world and measuring full page load times from your site at a configurable interval. Response metrics are stored for trending analysis and comparison with other sites. Despite the cost, these tools are worth the money if you know your site is experiencing performance issues (if not the data gets boring really fast).

Firebug, YSlow, and Fiddler are great tools for analysing performance and will give you both page load time and page weight but they’re all executing from your desk; if your web server is down the hall or in the same city (or country) you may not have a clear picture of how latency is affecting your site users on the other side of the world. If you’re targeting a domestic audience that’s obviously not a problem, but if you’re targeting a global user base and you’re attempting to do so with SharePoint you need to ensure everything about your site is optimised—not just the server configuration. The SharePoint Team Blog post highlights the fact SharePoint (MOSS 2007) is not optimised for internet sites out of the box.

As the cost of these performance measurement services is prohibitive (especially in this tough economy), it’s interesting to see where the free services are going feature-wise. I mention the SharePoint Team Blog post specifically because the author cites a new tool I hadn’t yet come across. I’ve previously evaluated Pingdom but their offering was still developing a year ago (they offer both free and paid services).

The site is painful on the eyes but the data they provide at no cost is comprehensive. The site currently allows you to run tests from one of three nodes (the US, UK, and New Zealand), meaning adequate global coverage (we test from seven locations matching the site’s target markets).

The tool reports the results from a full page test, meaning the page and all of its supporting resources are requested, providing a realistic picture of how long it takes to load the page and all CSS/Javascript/images/XML files/Flash files/etc. Some of the freebie offerings I’ve seen in the past only reported the page HTML load time, which will often be negligible.

In addition to giving you a screenshot of the page, which is often useful to compare what the world sees versus what you think they’re seeing, you get full waterfalls of data for a first visit and what they call a repeat view (a subsequent request for the same page with a primed browser cache), and an item by item optimisation checklist.

webpagetest waterfall and optimisation

I’m impressed!

Thursday, 24 September 2009

Removing the shutdown details prompt in Windows Server

If you’re anything like me, chances are you’ve got dozens of virtualised dev servers hanging around and you shut them off at the end of the day; you also have no idea what to put in the Shutdown Event Tracker Comment box. The event and comment do get written to the event log for future reference so I usually comply and specify an exact reason where applicable or with my standard comment “a” for “all good” or “a-team” or something like that!

In practice, I’m the only one using my dev environments and I’ve never had the need to remind myself why I shut down the server. The specific event is likely buried among the shut down/start up events anyway and it’s probably safe to say this feature was intended for multi-admin production environments.

The shutdown event tracker is just an extra hassle as a developer but it can be turned off:

  • Run –> gpedit.msc
  • Browse to the Local Computer Policy / Computer Configuration / Administrative Templates / System node
  • Locate and open the Display Shutdown Event Tracker policy
  • Set it to Disabled
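If you’d rather script this across those dozens of VMs, the policy maps (as far as I can tell) to the Reliability key; here’s the equivalent as a .reg file, with the caveat that this is the documented policy location rather than something I’ve pulled from gpedit itself:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows NT\Reliability]
"ShutdownReasonOn"=dword:00000000
"ShutdownReasonUI"=dword:00000000
```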

Life = that little bit easier.

 site re-launch on MOSS 2007 today gets a facelift on the coattails of the corporate site relaunch a few weeks back. The site is Tourism Western Australia’s premier consumer-facing web site and targets visitors seeking quality, unbiased information about Western Australia.

This is the first major visual change to the site since TWA launched WA.COM on the MOSS 2007 platform in May 2007. The site is regularly cited as an early, shining example of how far you can take custom branding and SharePoint. Under the covers, the site employs all the SharePoint tricks you’re used to: master pages, page layouts, web parts, content types, lists—you name it. The only thing we don’t make use of that might otherwise factor into a WCM site is SharePoint search and the BDC (we’ve built a custom solution for that).

The English-language sites alone delivered more than eight million page views in the last twelve months to an international audience. We currently run the site on the back of a single Windows 2003 x86 web front end (which also hosts the site as a separate web app) and SQL Server 2005 database server shared by all of our MOSS-based web sites. Site traffic is additionally accelerated by Akamai and cached by regional Akamai nodes to ensure visitors can access the site quickly and reliably from all over the world.

The new changes are part of an interim visual shift as Marketing prepare for a new brand launch later this year. Apart from the look and feel, which was aimed at reducing clutter and softening existing brand elements, the site is moving towards dynamic (i.e. social) content (see the twitter feed!) and the home page in particular has been positioned to display fresh content like events and deals.

Want more? For some more examples of the heavily branded sites we’ve rolled out on the MOSS platform over the last year, check out the side bar on this page (“Some of the MOSS sites I’ve worked on”) and watch the videos (1, 2).

Disclaimer: I’m a contractor working at TWA. Mediawhole is not directly affiliated in any way with the agency or the web sites; when I say “we”, I mean the royal We.

Friday, 21 August 2009

Seeking an Experienced DBA with Strong T-SQL Skills

We’re looking for an experienced database administrator to start initially on a short term (three month) contract at Tourism WA’s Perth office. If you match this description, you do quality work, and you’re not a cocky git, please get in touch asap:

Required Personal Skills

  • The ability to work independently with initiative and self-drive
  • Excellent interpersonal skills
  • Excellent verbal and written communication skills
  • A sense of humour

Required Technical Skills

  • Practical experience writing Transact SQL scripts and applying relational database concepts
  • Practical experience installing, maintaining and migrating SQL Server 2000, 2005, and 2008 on 32 and 64-bit platforms
  • Practical experience configuring and maintaining SQL Server 2008 clusters
  • Knowledge of general performance testing and environment optimisation approaches
  • Practical experience performance tuning existing code (stored procedures, user defined functions, views, other queries)
  • Practical experience tuning indexes and configuring maintenance plans
  • Practical experience implementing a database server health monitoring and status alerting system
  • Practical experience with high availability techniques for SQL Server (mirroring, log shipping, etc.)
  • Knowledge of deploying C# CLR assemblies within SQL Server

Desirable Technical Skills

  • Experience maintaining product-specific database environments (specifically SharePoint/MOSS 2007)
  • A working understanding of network concepts as they relate to database administration (firewalls, TCP/IP, performance)
  • A working understanding of SAN and RAID technologies
  • Experience tuning storage subsystems
  • Experience installing and maintaining SQL Server Reporting Services (SSRS)
  • Experience creating and maintaining SSRS reports
  • Basic understanding of source control concepts

Tourism WA Corporate Site Relaunched on MOSS 2007

With the agency's first sites launched on SharePoint 2007 more than two years ago, it was recently time to give the flagship Tourism WA sites a facelift! The corporate site is the first to move across to an interim look and feel as Marketing prepare for the formal brand re-launch; the consumer site will be following next week.

Tourism WA Corporate Homepage on MOSS 2007

So how much effort was involved on the Corporate side? With no major content changes, it was really a question of modifying the existing CSS and some of the images and other resources; all up, about a week's work—including a stack of minor content tweaks. Consumer has been more involved but I'll save that story for another post ;-)

Tuesday, 18 August 2009

Open Command Window Here in Windows 2008

It used to be a Power Toy; now it’s baked into both Vista and Windows Server 2008:

Hold down shift and right-click a folder and behold, the Open Command Window Here menu item is revealed.

Monday, 17 August 2009

Zoom for presentations

Sysinternals: what’s not to love?!

I was setting up for a demo today and noticed the client’s projector was too close to the screen and the overall result was a hard-to-read display. The Windows screen magnifier (whatever it’s called) doesn’t really cut it so on a whim I hit Google and came up with ZoomIt.

129KB later, with no installer to run, I can hit CTRL+1 and this thing zooms in on my mouse; wiggle the mouse around and the zoom follows. Esc zooms out again.

Simple. Beautiful. Perfect.

Sunday, 16 August 2009

Invalid data has been used to update the list item. The field you are trying to update may be read only.

Apparently, in my specific case today, this error message actually meant the equivalent of a runtime type mismatch.

In the context of a list event receiver, I was attempting to update one of the list item’s properties. AfterProperties is an SPItemEventDataCollection, which seems to really be a collection of strings. In trying to cram a Guid in there, I forgot to call ToString () or Convert.ToString (); the compiler didn’t complain but at runtime my friendly Guid was jailed in the bowels of SharePoint’s exception handlers. SharePoint simply returned

Invalid data has been used to update the list item. The field you are trying to update may be read only.

with nothing but a menacing smile on its horrible, scarred face.
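For the record, the fix was trivial once diagnosed. Here’s a minimal sketch of the broken pattern and its correction inside an item event receiver (the field name and values are illustrative, not from the actual project):

```csharp
public override void ItemUpdating (SPItemEventProperties properties)
{
    Guid myGuid = Guid.NewGuid ();

    // Compiles fine but blows up at runtime with the error above:
    // AfterProperties effectively stores strings, not Guids.
    //properties.AfterProperties ["MyField"] = myGuid;

    // Works: convert explicitly before assigning
    properties.AfterProperties ["MyField"] = myGuid.ToString ();
}
```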

Sunday, 9 August 2009

Event receiver doesn’t fire

It’s been a long, busy weekend and the brain’s gone to mush—so I’m making silly mistakes.

Starting with an existing, functional SPItemEventReceiver attached to a content type, I needed to add an extra event handler. I had ItemAdding and ItemUpdating wired up and debuggable but wanted to add a new ItemAdded handler.

On my first attempt, I simply added the extra handler method to my existing class derived from SPItemEventReceiver. When that didn’t work, I recalled originally wiring the individual handler methods within the content type definition so I added an extra Receiver element.

I realised my mistake when it dawned on me I was testing my changes against an existing list instance: having been provisioned using the original content type definition with only the ItemAdding and ItemUpdating event handlers, SharePoint will never update that instance retroactively without some custom code.

After creating a new list from the modified template, all three event handlers fire correctly.
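For reference, an individual Receiver element follows this general shape (the name, assembly, and class here are placeholders, not the actual receiver from this project):

```xml
<Receiver>
  <Name>MyItemAddedReceiver</Name>
  <Type>ItemAdded</Type>
  <SequenceNumber>10000</SequenceNumber>
  <Assembly>My.EventReceivers, Version=, Culture=neutral, PublicKeyToken=1234567890abcdef</Assembly>
  <Class>My.EventReceivers.AnnouncementsReceiver</Class>
</Receiver>
```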

Retrieving a list item’s attachments

As you might expect, accessing a list item’s attachments in SharePoint isn’t straightforward. While SPListItem.Attachments returns a string-based list of attachment names, you’ll probably need to translate that into a URL at the very least, if not manipulate the attachment itself.

It helps to understand where SharePoint stores attachments after they’re uploaded. As usual, SharePoint Manager 2007 provides some insight:

List Item Attachments Location

Notice in the screenshot I’m inspecting a list named “Announcements” and that list has one item (Hello Attachment—highlighted at the bottom). You’ll also notice there’s an Attachments folder sitting among the list items; it’s been expanded to reveal another folder with the odd name of 2, and below that you can actually see the attachment itself (My First Attachment.txt).

Before I go any further, I’ll point out the magical “2” folder isn’t magical at all: it relates to the list item in this list with the Id of 2—in this case, the Hello Attachment item (the first item in the list was deleted but, if it had attachments, there would be a corresponding “1” folder). It’s sensible to assume this structure exists to accommodate multiple attachments.

If you were to now try to access the Attachments folder in code using SPListItem.Folder, you’d be faced with a null reference exception. Instead, you need to ignore what SPM tells you about getting to the Attachments folder and arrive there instead using the parent list’s RootFolder property.

Assume you’re working within the scope of a foreach or have retrieved a list item via some other means:

SPList announcements = …Lists ["Announcements"];

foreach (SPListItem currentItem in announcements)
{
    // we’re here
}

Before you do anything else, you’ll probably want to muck around and ensure you can actually access a list of attachments in some form. To identify the attachments by name, SPListItem.Attachments (an SPAttachmentCollection) will return a collection of strings:

foreach (string currentAttachmentUrl in currentItem.Attachments)

At this point you’ve retrieved, from the example discussed earlier, a string like “My First Attachment.txt”—not of much use on its own. As mentioned, you’re probably after the full URL at the very least.

To access the actual attachment and not just its name, retrieve the list item’s parent list (SPListItem.ParentList) and then the parent list’s RootFolder. From there you can use the SubFolders collection to retrieve the “Attachments” SPFolder. Beware the SubFolders indexer expects the URL of the folder, not its name.

Finally, you need to retrieve the numbered folder specific to the list item. Here’s the complete mess:

SPFolder attachments = currentItem.ParentList.RootFolder.SubFolders ["Attachments"].SubFolders [currentItem.ID.ToString ()];

The SPFolder object’s URL property can now be prefixed to the attachment’s name retrieved previously or you can enumerate the SPFiles within the SPFolder.
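Pulling the pieces together, here’s a rough end-to-end sketch enumerating full attachment URLs for each item (variable names follow the earlier example; consider this illustrative rather than production-ready):

```csharp
foreach (SPListItem currentItem in announcements)
{
    // Items without attachments have no numbered subfolder to visit
    if (currentItem.Attachments.Count == 0)
        continue;

    // Attachments live under RootFolder/Attachments/{item ID}
    SPFolder attachmentFolder = currentItem.ParentList.RootFolder
        .SubFolders ["Attachments"]
        .SubFolders [currentItem.ID.ToString ()];

    foreach (string attachmentName in currentItem.Attachments)
    {
        // Prefix the folder URL to get a server-relative attachment URL
        Console.WriteLine (attachmentFolder.Url + "/" + attachmentName);
    }
}
```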

Back in the day, Eric Shupps posted a way to do this by walking down the structure from the Lists folder. I think the above code is a slight improvement since the code’s a bit more manageable and you don’t need to retrieve the Lists folder or the parent list by name.

Ps. RootFolder is quite interesting: its Name property is actually the internal name of the parent SPList object.


Saturday, 8 August 2009

Tourism WA CIO Position Advertised

Having worked at Tourism WA in various capacities for three and a half years, I’m currently working for my third TWA CIO. As the current CIO is acting in that position, Corporate and Business Services has decided to advertise the position again (it was advertised earlier in the year until being withdrawn after the second round of interviews). If you’re interested in working at Tourism, have what it takes to fill the CIO role—and you want to work with moi—then be sure to apply:

Friday, 31 July 2009

Storing SharePoint List meta data

When I recently wanted to store some information about a list (i.e., meta data) I couldn’t find an obvious way using just SPList. SPList does expose the PropertiesXml property but doesn’t expose a setter. What I wanted was a property bag or hashtable of key/value pairs. I’ve seen suggestions to add an extra column for this purpose but that could lead to redundant data, extra maintenance, and wasted space and essentially becomes meta data about the list items. In my opinion, columns define the structure of the data and aren’t meta data (but feel free to debate that among yourselves!).

Alex Angas was kind enough to provide an answer to a question I posted on StackOverflow on this subject. His solution was to make use of the Properties hashtable hanging off the list’s RootFolder object (SPFolder) and here’s my code to do just that:

// Set the custom property
SPList myList = myWeb.Lists ["MyList"];

myList.RootFolder.Properties ["MyKey"] = "MyValue";
myList.RootFolder.Update ();

// Retrieve the custom property
myList = myWeb.Lists ["MyList"];

string myValue = myList.RootFolder.Properties ["MyKey"] as string;

Nice. It’s worth pointing out that using SharePoint Manager 2007 the RootFolder can be found within the site collection root; expand the RootFolder node to locate the folder named after the list in question.

SPFolder RootFolder Properties

Update: Although RootFolder.Properties is a System.Collections.Hashtable, you’ll likely receive an SPInvalidPropertyException if you attempt to store anything other than a string, int, or DateTime. The debugger told me so ;-)

In answer to the same question, Paul-Jan also suggested using a separate, hidden list to track meta data about a parent list (or lists, I suppose). I quite like this idea as a centralised “list meta list” and it could be as simple as a three-column list identifying a list by ID and storing a key/value pair. On the other hand, I prefer keeping stuff like meta data with the object in question so my preference would be SPList.Properties; the RootFolder solution seems like a passable alternative.

Saturday, 25 July 2009

Content Type ID Structure

Creating a custom content type in a feature requires you specify the ID attribute.

Content type IDs are constructed hierarchically and you can trace the lineage of a content type through its ID. A custom content type will often derive from the Item content type—the most basic content type you can use. Item is derived from the System content type and contains the ever-present Title field (Title can be hidden if necessary). Note the System content type is sealed, meaning it can’t be derived from in your code.

The ID of the System content type is 0x and the Item content type appends 01, giving you 0x01. A separator must be used between all subsequent IDs appended to the base string; 00 is used for that purpose, giving you 0x0100.

A custom content type will need its own ID appended to the base system-item-separator ID and the best way to do that is using a GUID without the braces and hyphens (use guidgen.exe, included with Visual Studio).

Here’s an example with the base system-item-separator ID combination bolded for clarity:


To create additional custom content types derived from an existing custom content type, add an addition separator and GUID:


Content type IDs are limited in length to 1024 characters so if you’re running out of room or just don’t like using GUIDs, a content type ID can optionally be extended by appending two digits other than 00. WSS itself uses this convention so be wary of clashes with your own custom content types.
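Putting the pieces together, a custom content type derived from Item might be declared like this (the GUID is purely illustrative; generate your own with guidgen.exe):

```xml
<!-- ID = 0x (System) + 01 (Item) + 00 (separator) + GUID -->
<ContentType ID="0x0100A54D6FC2B1584F22B1B0B6EA7482BC9E"
             Name="My Custom Content Type"
             Group="Custom Content Types">
  <FieldRefs />
</ContentType>
```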

If you’re working with pages, note the Page content type derives from the System Page content type, which in turn derives from Document, which in turn derives from Item. The Page content type ID is therefore



Tuesday, 21 July 2009

DHCP-assigned IP address and DNS host name clash - duh

Mental note #1: if using host headers with MOSS (or IIS) ensure the web server is configured with a static IP address or the NIC’s MAC address is reserved within DHCP.

Mental note #2: Just because it’s SharePoint, don’t forget the easy things! No, really!!!

I came across an unusual problem today after a MOSS server was imaged and restored to a new physical blade: Central Admin, the default SSP web app, and a specially-created sanity check web app were all working beautifully but the single site configured with a host header refused to load.

In addition, the Application event log was full of errors suggesting a database server connectivity issue:

The start address cannot be crawled. The item could not be crawled because the crawler could not connect to the repository.

The database connection wasn’t the problem but nevertheless, SharePoint couldn’t find what it was looking for. The WFE’s NIC changed as part of the migration, which meant a new MAC address, and therefore a new DHCP lease for a different IP address that DNS knew nothing about.

Friday, 17 July 2009

PDF iFilter not indexing content

Don’t forget: installing and configuring an iFilter to allow SharePoint to search PDF documents won’t index graphical PDF content. In other words, if search is returning PDF results based on title or other meta data but not PDF content, double-check whether that content can be highlighted and copied to notepad as text—and watch out for scanned documents!!!

Conditional Post-Build Event Command for x64 sgen

I’ve bitten the bullet and decided to run Windows Server 2008 Hyper-V on my workstation; in addition to virtualising my MOSS dev environment, I’ve also chosen to virtualise my workstation environment on Windows Server 2008 x64; work won’t let me get away with Windows 7 RC1 and there’s no “supported” upgrade path to the RTM anyway.

You may have read my previous post on the subject of x64 targeting with Visual Studio setup projects; today, however, is all about just getting VS to compile a project with a post-build event where Visual Studio 2008 is running on Windows Server 2008 x64 with SP2. In our case, the post-build event needs to run sgen.exe from the correct location since some of our devs are still on Vista x32. And yeah, I know, post-build events are old school but it’s an existing solution…

Windows Server 2008 with Visual Studio 2008 SP1 and .NET 3.5 SP1 installed does not include sgen.exe for some reason. Why not?!? Good question—you can install it by downloading the Windows SDK for Windows Server 2008 and .NET Framework 3.5. v6.0a does seem to be installed by VS2008 on 32-bit/W2k3 environments (more about this). Kicking off the SDK installer, I chose to install the .NET Development Tools only (Developer Tools\Windows Development Tools\.NET Development Tools).

This all seems good but sgen will be installed to a different location than you may be used to: %programfiles%\Microsoft SDKs\Windows\v6.1\Bin\x64 (the 64-bit Program Files folder, that is). To accommodate this difference from the 32-bit world, I need some conditional logic in my post-build event.

At first glance, using the %processor_architecture% environment variable makes sense—it returns “x86” in my current XP 32-bit environment, presumably the same in Vista x32, and “AMD64” in Hyper-V… not quite “x64” but enough to branch on. As we all know, however, Visual Studio is only available as a 32-bit application and seems to do some additional environment variable setting of its own: echo %processor_architecture% in a post-build event prints out “x86”, obviously the same as our 32-bit environment. No good.

To work around and take advantage of this, my post-build event compares the %ProgramFiles% variable against “C:\Program Files”; when queried from a post-build event in a 64-bit environment, it reliably returns “C:\Program Files (x86)” which I know is the 32-bit Program Files folder; in that case, my script uses the 64-bit sgen. Note the same variable will return “C:\Program Files” in a normal command window so be careful and test in VS directly.

In the end, here is my post-build event:

REM Use the 64-bit sgen from the Win 2008 and .NET 3.5 SDK in a 64-bit dev environment
REM ProgramFiles variable is set to 'Program Files (x86)' in an x64 environment
REM Processor_Architecture variable returns x86 in both an x86 and x64 environment within VS

if /I "%ProgramFiles%" == "C:\Program Files" (
set ToolPath="C:\Program Files\Microsoft
) else (
set ToolPath="C:\Program Files\Microsoft
)

%ToolPath% /compiler:"\"/keyfile:$(ProjectDir)MyKeyFile.snk"\" /force "$(TargetPath)"

Thursday, 9 July 2009

Joel Oleson at the Perth SharePoint User Group

After a great showing from Greg Low last month, the Perth SharePoint User Group boys have pulled in SharePoint superstar Joel Oleson on his way home from the NZ Community SharePoint Conference 2009.

Joel’s presentation is “Preparing for Upgrade to SharePoint 2010.” Here’s some pre-reading:

This promises to be the local SharePoint event of the year so get your RSVP in to SharePoint Sezai via the user group web site if you haven’t already and block out 12:30pm in your calendar!

Wednesday, 8 July 2009

YSlow for Firefox 3.5 Lives?

YSlow has been the tool for measuring page payload and providing the metrics you need to optimise web site performance beyond the web server. Shock horror when I recently upgraded Firefox to version 3.5 and couldn’t install YSlow.

Initial rumours confirmed the current version of YSlow doesn’t work with FF3.5 but also suggested development by Yahoo! had ceased. Meanwhile Google launched Page Speed—a likeable YSlow competitor (maybe?).

A recent posting from a Yahoo! dev would suggest YSlow will actually be updated but “the team has run into some integration issues with firebug 1.4.” If you want to install FF 3.x and 3.5 side-by-side, check out this guy’s article.

Monday, 6 July 2009

New Sites, New Widgets

Rottnest Island Authority Re-launch

The second-last site to be launched on Tourism WA's fully branded, MOSS-based site provisioning platform went live last Thursday; I reckon it’s one of the best-looking sites they've launched to date (the Rottnest Island photography helps, of course). Check it out:

Home - Rottnest Island

[Update 22 July: As mentioned in my profile to the right, I'd like to clarify the fact the new web site is provided by Tourism Western Australia under the Tourism eMarketplace program; although I was involved in the technical construction of the web site as a contractor working for Tourism Western Australia, Mediawhole and were NOT involved in the launch of this web site. Whereas I previously used the terms "we" and "our" in this post, I was referring to Tourism WA and teams working with the Rottnest Island Authority.]

Booking Exchange

On the subject of all things new, the new online booking capability has also launched. The system integrates with our existing search function and provides live availability information from V3’s Open Booking Exchange. The politics around this apparently simple change were massive but the technology side was relatively straightforward by comparison (web service calls from within SQL Server are as complex as this got from our end). If you’re a tourism operator, find out more.

Travel Planner

WACOM Travel Planner

The online travel planner was also finally launched after nearly a year of work with a Sydney-based agency. Luke was our man on the ground with this one and he did a great job integrating drop after drop of this widget.

The travel planner “helps visitors collect, organise and share their WA travel itineraries.” You can sign up from the homepage and add items from across the site to your travel wallet.

Sunday, 28 June 2009

SharePoint and SQL Server with Greg Low at PSPUG

Sezai’s been doing a fantastic job of attracting some high-end speakers to the Perth SharePoint User Group and we had the immense pleasure of attending Greg Low‘s presentation on SQL Server optimisation for SharePoint admins this month.

Greg is a god among men in my mind and one of those can’t-miss speakers; apart from being a SQL Server MVP and Microsoft Regional Director, he really knows his stuff and delivers a nuanced presentation. I’d seen Greg in Adelaide on previous occasions and since he’d made his way all the way to Perth just for us, there’s no way I’d miss him.

There were some key take-aways from Greg’s talk I’m paraphrasing here for future reference… note I’m not a DBA and don’t necessarily understand all of this—in other words, don’t apply this advice blindly without additional research!

  • The autogrow settings should be configured appropriately on both content databases and tempdb—the SQL Server defaults are inappropriate. In general, this means turning off autogrow and managing database file size manually. Alternatively, autogrow can be left enabled for contingency but should be reconfigured to grow by a fixed amount in MB (~100MB, for example) rather than a percentage.
  • tempdb is rebuilt at its configured initial size every time the SQL Server process restarts; the initial size should therefore be large enough to avoid repeated autogrowth and file system fragmentation.
  • Never, ever auto shrink a SharePoint database as it will only have to regrow again and may increase fragmentation.
  • Use DBCC CHECKDB but beware fixing problems may incur data loss.
  • Whack your disk subsystem using benchmarking utilities like SQLIO and Iometer. Reasonable latency falls between 5-15ms, for example.
  • Instant file initialisation allows SQL to write to files that have not been zeroed out by the OS, thereby avoiding the performance hit incurred by that activity. The MSSQLSERVER service account must be granted the SE_MANAGE_VOLUME_NAME right by virtue of being added to the Perform Volume Maintenance Tasks security policy.
  • Multiple HBAs can lead to write ordering and disk subsystem issues; SQL Server 2005 introduced page checksums to help with this issue but the feature is turned off by default; it should be enabled.
  • Virtualising database servers faced a lot of bad press in the past but those days are behind us. Hyper-V R2 and ESX are the way forward.
  • Don’t even think about SQL Server 2000.
  • The gap between SQL Standard and Enterprise is getting wider with SQL Server 2008.
  • As most SQL Server instances are disk and memory-bound, consider enabling table/row/page compression in 2008 to reduce IO and memory usage at the expense of CPU. Your mileage may vary as this will obviously depend on the content to be stored; photos and Office documents are already compressed. Backup compression is also possible.
  • Configure databases with one file per CPU core.
  • Different databases have different IO profiles; tempdb should be hosted on a dedicated spindle.
  • Ensure statistics are configured correctly.
  • Index fill factor should typically be ~70% for a typical SharePoint environment.
  • The default Windows Server 2003 allocation unit size is too small for SQL Server data drives; set it to 64KB when formatting.
  • Clustering SQL won’t boost performance like mirroring will. Clustering happens at the server level whereas mirroring must be configured for every database.
  • If configuring a SQL alias, use TCP/IP.
  • Create a separate SQL Server instance if an existing database server’s collation doesn’t match SharePoint’s specific requirements. Changing an incorrect collation is a lot of work.
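As a concrete example of the autogrow advice in the first point, the growth increment can be set from T-SQL rather than left at the percentage-based default (database and logical file names here are illustrative; check sys.master_files for your own):

```sql
-- Grow the data file in fixed 100MB increments instead of by percentage
ALTER DATABASE WSS_Content_APP
MODIFY FILE (NAME = WSS_Content_APP, FILEGROWTH = 100MB);
```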

Wednesday, 24 June 2009

Can’t reinstall Flash after failed v10 update

The silly Flash player failed to update in IE8 the other day; clicking the Get Flash link would take me to the installer page but the “gold bar” would never appear for me to tell IE to download and install the player.

Chrome and Firefox are easier to deal with as the Flash plugin is installed through Add/Remove Programs.

Although a forum reply suggested resetting IE fixed the problem for him, I’m not that kind of guy ;-). For the record: Tools > Internet Options, Advanced tab, then click the Reset… button.

In my case I resolved the problem by running the Flash uninstaller from the Flash Player Support Center.

The MOSS setup account must be a member of db_owner role of the content database

One of the first things I normally do after creating a new web application and site collection is backup the content database using stsadm -o backup. If you're running in a least privilege scenario, however, there's an extra step required to configure access to the content database.

Whenever I'm working interactively with MOSS in an administrative capacity, I'm logged in as the MOSS setup account. This is the account used to install MOSS and create the farm; it’s also a member of the local admin and WSS_ADMIN_WPG groups, a member of the Farm Administrators group, and I've even got it configured as the primary site collection administrator for my content databases. In general, this account gives me the permissions I need to do everything I need to do while keeping my MOSS install functional.

Despite the wealth of permissions granted to this account, I was surprised when I tried running the stsadm -o backup command and came up against this error:

Cannot open database "WSS_Content_APP" requested by the login. The login failed.
Login failed for user 'domain\moss_setup'.

A similar, dodgier message was echoed in the Application event log.

To work around this, the setup account must be added to the db_owner role of the content database being backed up:

  • Navigate to the {WSS_Content DB}\Security\Roles\Database Roles\db_owner role and view its properties
  • Add the setup account (eg. domain\moss_setup) as a Role Member

I've found no other role will suffice. Note in a least privilege scenario, the setup account should not be a member of the local admin group on the database server.
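If you’d rather script the grant than click through Management Studio, something like the following should do it (assuming the login already exists on the server; the account and database names are from the example above):

```sql
USE WSS_Content_APP;

-- Map the setup account's login into the content database (skip if already mapped)
CREATE USER [domain\moss_setup] FOR LOGIN [domain\moss_setup];

-- Grant db_owner (SQL 2005-era syntax; later versions prefer ALTER ROLE ... ADD MEMBER)
EXEC sp_addrolemember 'db_owner', 'domain\moss_setup';
```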

I'd love to know how to do this automatically whenever a new content database is added although I can understand why MOSS doesn’t do this for me… sort of ;-)