Tuesday 21 December 2010

grepWin–A lightweight replacement for Windows Search

I don't know if Windows Search on my Windows Server 2008 R2 machine is crippled because it's a server machine (I'm all set up to run Search) but I invariably have trouble finding strings I know exist in files. In other words, search in files or search in content doesn't seem to work despite my configuration, rebuilding the search index, and pulling my hair out.

Having yet another service running that indexes the majority of the non-system files on my computer also doesn't really appeal and for that reason (among others) I've also decided I don't want to add more search indexing with the likes of Google Desktop Search.

In fact, what I want is a lightweight search application that I can occasionally use with regular expressions. It should be fast and convenient to use, preferably without a command line. I want to be able to specify a start directory and optionally have it search subdirectories and file contents.

And here is a tool that meets all of the above criteria: http://tools.tortoisesvn.net/grepWin.html

(FWIW, this is not an infomercial—I have no affiliation with Stefan or grepWin and am not paid to write this)

grepWin is 64-bit compatible and Stefan offers an installable version with Explorer context menu integration as well as a portable version that doesn't need to be installed.

There's no indexing involved so you'll likely want to narrow the start location of your search as much as possible but it's still fast.

Your search can be case sensitive or not and you can limit file sizes. A search can also exclude directories.

The results pane lists file details (size, path, date, etc) and also offers a content view so you can preview the match.

Search settings are automatically persisted so using the tool is incredibly convenient.

[Screenshot: grepWin]


Wednesday 15 December 2010

How to disable web part chrome

In the context of a highly-branded WCM site, the default "chrome" (border and title) SharePoint wraps around your custom web parts can be ugly and inconvenient. At worst, it will make your site look SharePoint-y and require every web part added to a page to have its Chrome settings manually adjusted.

[Screenshot: the web part Chrome Type setting]

That's no way for a web part to behave in a content management system but what to do?

In the past I've set the chrome type explicitly for each custom web part, either in the .webpart/.dwp file itself or programmatically:

<property name="ChromeType" type="chrometype">None</property>

this.ChromeType = PartChromeType.None;

To avoid doing this per web part (or cluttering up your nice little base web part class), you can set the PartChromeType property on the WebPartZone declared in your page layout:

<WebPartPages:WebPartZone id="wpzLeftColumn" runat="server" title="Left Column" PartChromeType="None">

Any web parts added to that zone with a default Chrome Type (aptly named "Default") will inherit this setting from the web part zone. Of course individual web parts can override this as required.


Enumeration-based property with initialiser fails to load in tool pane

I encountered what is likely an obscure issue today while using an existing enumeration to populate a drop down list in the tool pane of a SharePoint web part.

The web part didn't require a custom toolpart so I'm referring to a web part property defined within the web part class itself:

[SPWebCategoryName ("TWA")]
[FriendlyNameAttribute ("Region Override")]
[WebDescription ("Explicitly configure the region displayed when the mode is set to Region.")]
[Personalizable (PersonalizationScope.Shared, false)]
[WebBrowsable (true)]
public WARegions RegionOverride { get; set; }

Notice how the public RegionOverride property has a type of WARegions. Normally, the end result is an additional custom property displayed in the web part tool pane; in this case, the WARegions type is an enumeration and SharePoint therefore renders this as a drop down list:

[Screenshot: the enumeration rendered as a drop down list in the tool pane]

I say normally because today I was only getting a disabled DDL containing a single item ("Perth"). SharePoint was also kind enough to present me with this vaguely useful error message:

Some of the properties for this Web Part cannot be displayed properly. For more information, see your site administrator.

The WARegions enumeration is an existing enum I chose to reuse for the sake of convenience. Upon further inspection, I noticed the first element in the enum (Perth) was explicitly assigned a value of 1. Acting on my suspicion, I removed the explicit assignment and the DDL would then display as expected.

As a final test to confirm whether the explicit initialisation was somehow behind the single list item being displayed, I set explicit values for all elements in the enum and the error returned. Fortunately in my case I can do away with the explicit initialisation (although I'm not keen on doing so—I prefer to initialise enums myself).
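
For reference, here's a minimal sketch of the two shapes of the enum (every member name other than Perth is made up for illustration):

// This shape loads fine in the tool pane; members start implicitly at zero:
public enum WARegions
{
    Perth,
    SouthWest,
    GoldenOutback
}

// Whereas starting the enum at an explicit value reproduced the disabled,
// single-item drop down for me:
// public enum WARegions
// {
//     Perth = 1,
//     SouthWest,
//     GoldenOutback
// }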


Monday 13 December 2010

My custom Visual Studio keyboard shortcuts

I do all of the builds in my position as Development Coordinator but with the Visual Studio project context menus out of control and nearly unusable with clutter from various plugins (including ReSharper, PowerCommands, and Productivity Power Tools), I've finally cracked.

Since we don't (yet) have a build server, a normal manual build workflow goes something like "do stuff, do more stuff, BUILD, do stuff, OPEN DEPLOYMENT FOLDER IN EXPLORER, copy files to drop folder, etc." Before installing ReSharper, the VS team finally had all context menu items displaying on a single screen, with no scrolling required; ReSharper has ruined all of that again and I now find myself forever scrolling the context menu from the top (Rebuild) to the bottom (Open Folder in Windows Explorer).

Because I'm forever installing Visual Studio in different environments/upgrading/etc, I'm loath to define custom keyboard mappings (likely a hangover from my days of using the product before Import and Export Settings… was introduced). Nonetheless, these are the shortcuts I've finally created:

Project.OpenFolderinWindowsExplorer – Alt+O

Build.RebuildSolution – Alt+R (Why rebuild instead of build? Personal preference)


Friday 3 December 2010

Relative URLs, the Rich Text Editor, and Reusable Content

We recently started leveraging the Reusable Content list to supply user-editable content to one of our interactive online forms and outbound emails. This allows the content owner to tweak the content as required because it isn't locked away in custom code or resource files—no build and deploy required and the content management system is even used as such (whatdya know?!).

The Reusable Content list is provisioned by SharePoint when the Publishing feature is enabled; it allows users to define content snippets that can be maintained in a single location and updated automatically wherever they're used. You can optionally treat a reusable content snippet as a template—an editable copy is inserted into the page instead of a read-only, auto-updating view.

Reusable Content list items are pretty straightforward and, most importantly, contain a single HTML (rich text) field. SharePoint naturally displays its rich text editor around this field in edit mode so the edit user experience is similar to editing page content.

Unfortunately HTML fields in SharePoint are smarter than they should be and (in MOSS 2007) the product will mangle some content. I recently discovered it refuses to play with the background-image style—such styles are silently removed whether they're inline or in an embedded stylesheet. (And yes, I know, inline styles are evil but this snippet was actually being plugged into an email so everything had to be self-contained).

Despite the tricks SharePoint plays on you with "managed" URLs, it seems the rich text field also stores URLs pointing to content within the current site as relative URLs. Absolute URLs are converted automagically but you'll never really see this until you pull the content out via the API.

We hunted for a way within SharePoint to convert all relative URLs in this content to absolute URLs but without much luck. I think there may be a Javascript function in one of the client scripts to do so for a chunk of content but there's nothing obvious in the server-side API.

To address this, we replace all relative URLs using the regex below (note the URL group) (is using regex to parse HTML bad? You be the judge). You may want to use SPUtility.GetFullUrl() to convert individual URLs.

@"(?:<\s*(?:a|img)\s+[^>]*(?:href|src)\s*=\s*[\""'])(?!http)(?<url>[^\""'>]+)[\""'>]"


Thursday 18 November 2010

DateTime format safe for file system

Couldn't find anything similar when I searched so thought I'd throw this out there for use in file names in a Windows file system:

DateTime.Now.ToString ("yyyyMMddHHmmss")

Will output something like "20100131235959"

Add a ".txt" or ".dat" if you're hard-core and you're all set.


Wednesday 17 November 2010

How to determine if a SPPublishingPage is published

You can programmatically determine if the SPPublishingPage you're dealing with is in a published state by retrieving the SPFileLevel of the page's corresponding SPFile:

if (page.ListItem.File.Level == SPFileLevel.Published) …

You can also access this through the PublishingPage.ListItem.Level property.

The same SPFileLevel enumeration will also indicate if the page is in draft mode (checked in but not published) or checked out but in my experience this property will return SPFileLevel.Published or SPFileLevel.Draft more often than not.

To determine whether the page is checked out, use the SPFile.CheckOutStatus property in WSS 3.0 or the SPFile.CheckOutType property in SP2010. Any value other than None means the page is checked out.
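
Pulling those checks together, a small sketch (WSS 3.0/MOSS 2007 flavour, not production code):

using Microsoft.SharePoint;
using Microsoft.SharePoint.Publishing;

public static string DescribePageState (PublishingPage page)
{
    SPFile file = page.ListItem.File;

    // Checked out trumps everything else
    if (file.CheckOutStatus != SPFile.SPCheckOutStatus.None)
        return "Checked out";

    return file.Level == SPFileLevel.Published
        ? "Published"
        : "Draft (checked in but not published)";
}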


SPSite.AllWebs vs SPWeb.Webs

There's a subtle distinction between SPSite.AllWebs and SPWeb.Webs to be aware of—and yes, despite this being the SharePoint API we're talking about, it's more than just a naming oddity!

SPSite.AllWebs returns every web in the site collection (the root web, its immediate children, and all of their descendants); SPWeb.Webs has a much narrower scope and only returns the first-level child webs immediately below the SPWeb object.

So what does this mean and how does it benefit you? If you're writing code that recursively walks the site hierarchy using SPWeb.Webs, you may be able to avoid the overhead by simply using SPSite.AllWebs. Recursion is fun and all but it adds complexity where it may not be warranted.
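
A minimal sketch of the difference (the site URL is an assumption); note every SPWeb handed out by either collection should be disposed by the caller:

using (SPSite site = new SPSite ("http://intranet"))
{
    // SPSite.AllWebs: every web in the site collection, root web included; no recursion required
    foreach (SPWeb web in site.AllWebs)
    {
        Console.WriteLine (web.ServerRelativeUrl);
        web.Dispose ();
    }

    // SPWeb.Webs: only the first-level children of the web you start from
    using (SPWeb rootWeb = site.OpenWeb ())
    {
        foreach (SPWeb child in rootWeb.Webs)
        {
            Console.WriteLine (child.ServerRelativeUrl);
            child.Dispose ();
        }
    }
}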


Thursday 28 October 2010

Free Antivirus for Windows Server 2008

I've been a long time home user of Avast! antivirus. It's a great product that doesn't bog down my machine. Unfortunately the free home edition won't install on Windows Server 2008 R2 and I run a single-boot W2k8 environment because of my Hyper-V love affair.

Don't get me wrong, I normally despise AV and security in any form… the guys I work with are probably fed up with me always shouting "security != productivity" when the bloody proxy policy has once again broken something or prevented me doing my job. But, every now and then, I feel inclined to download evil things and God forbid those things include a virus of some kind—a quick scan would then come in handy…

ClamWin to the rescue! I'm not sold on the name, and it doesn't include a real-time scanner, but that actually suits my requirements. So far so good… evil things downloaded and appear to be virus free!

Wednesday 27 October 2010

Empty ImageUrl results in empty src and duplicate request

Not exactly a new issue but an empty ASP.NET ImageUrl tripped us up today. In this case there were no visible symptoms but two requests were coming through when Fiddlering the page view: the first initiated by the user and the second initiated by the page itself towards the end of the trace. In development with a debugger attached, this manifested as Page_Load, CreateChildControls, etc. being called twice for no obvious reason.

Because I initially thought I'd introduced the problem with the control I was working on, I first attempted to convince ASP.NET that CreateChildControls was complete; I did so by clearing the Controls collection before unleashing my own code and setting the ChildControlsCreated property to true once done. Neither of these tricks had any effect and the Fiddler trace had me convinced something beyond the ASP.NET pipeline and IIS had to be at fault.

That turned out to be the case but ASP.NET was still to blame ;-)

Specifically, we were adding a server-side ASP.NET image control but not initialising its ImageUrl property; the image src was later being set by jQuery at runtime on the client side. As mentioned, there were no visible symptoms of the double request: the image src was set as expected by jQuery and everything appeared okay on the surface. The IE dev toolbar also showed the image src as being correctly set and there were no 404s in the mix.

It wasn't until we looked at the raw HTML that our ever-faithful admin extraordinaire noticed the empty src="" attribute. Removing the attribute removed the problem so I can only conclude IE is helpful enough to attempt to interpret a request for an empty image as a request for the parent directory of the current page while parsing the HTML before any Javascript runs. Thanks again, IE!

Notably this problem wasn't reproducible in Firefox or Chrome.

To fix the problem, we first set a default value for the ImageUrl property but that left me feeling dirty since it was still resulting in an unnecessary request. When I realised the server-side Image tag wasn't actually being used for anything server-side anyway, I replaced it with a boring old HTML img tag with no src. Microsoft has other, equally lame workarounds for this if you're interested; note they also don't plan to fix this bug.
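
For what it's worth, here's a sketch of the sort of change that works from the code-behind if you'd rather not touch the markup; the HtmlImage control appears to omit the src attribute entirely when Src is never set (a plain LiteralControl writing the img tag works too). The control ID and class name below are made up:

using System.Web.UI.HtmlControls;

// Before: a WebControls.Image with no ImageUrl renders src="", which IE
// re-requests as the parent directory of the current page.
// After: an HtmlImage with no Src set emits no src attribute at all; jQuery
// still assigns the real src on the client side.
HtmlImage thumbnail = new HtmlImage ();
thumbnail.ID = "imgThumbnail";
thumbnail.Attributes["class"] = "thumbnail";
Controls.Add (thumbnail);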


Friday 15 October 2010

Must-know SharePoint debugging tips

This post is a follow-on to my Must-Know Visual Studio debugging tips article; I'm separating out my SharePoint debugging tips focused on list, feature, and solution deployment that don't relate to Visual Studio.

Here you go:

  • Try activating your feature without the -force attribute; you'll likely need to deactivate the feature first
  • Try uninstalling and reinstalling the feature
  • If something is stuck and rebooting seems to clear some kind of cache deep inside SharePoint, try stopping IIS and restarting the various SharePoint Windows services with a script like the one below

@echo off
@echo Stopping services...
iisreset /stop /noforce
net stop "Windows SharePoint Services Timer"
net stop "Windows SharePoint Services Administration"
net stop "Office SharePoint Server Search"
net stop "Windows SharePoint Services Search"
net stop "Windows SharePoint Services Tracing"
@echo Starting services...
net start "Windows SharePoint Services Tracing"
net start "Windows SharePoint Services Search"
net start "Office SharePoint Server Search"
net start "Windows SharePoint Services Administration"
net start "Windows SharePoint Services Timer"
iisreset /start
@pause

  • Create a new list from your list definition (or at least, a new list item)
  • Rebuild your solution and redeploy

Thursday 14 October 2010

The Mysterious SourceID Attribute

I see a lot of field definitions around that unnecessarily include a SourceID attribute:

<Field
SourceID="http://schemas.microsoft.com/sharepoint/v3"

More often than not, the value of this attribute is set as above. Microsoft's own documentation samples tend to include this attribute.

So what does it do? Not a lot as far as I can tell; I don't include it with my own field definitions because inspecting the field after deployment with SharePoint Manager proves SharePoint sets it automatically with the GUID of the list that created the field. The Field Element documentation also indicates it is optional.

If you're intent on using it, the documentation describes this attribute as "containing the namespace that defines the field, such as http://schemas.microsoft.com/sharepoint/v3" so rather than use the default namespace it may be advisable to use your own (I haven't tried this myself).
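
Incidentally, if SharePoint Manager isn't handy you can check what SharePoint recorded through the object model; a quick sketch (site URL and field title are assumptions):

using System;
using Microsoft.SharePoint;

using (SPSite site = new SPSite ("http://intranet"))
using (SPWeb web = site.OpenWeb ())
{
    SPField field = web.Fields["My Custom Field"];
    Console.WriteLine (field.SourceId);   // populated even when omitted from the field CAML
}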


Wednesday 13 October 2010

SharePoint and Chrome - Better Together

I've been using Google's Chrome browser since its first release in 2008; I've loved nearly every second of the experience. Who would've thought there was room left to innovate in the browser space? Chrome's omnibar and rapid-fire JavaScript rendering, among other tweaks, are simply light years ahead of the competition.

While I normally rely on IE for my MOSS/SharePoint editing interactions, as of late I'm making the switch to Chrome in that space as well. What I've found to date has blown my mind.

Yes the MOSS 2007 UI degrades somewhat but it's still very useable. More importantly, Chrome drastically reduces the time it takes to accomplish basic tasks like modifying page content or viewing list data. I'm not saying these are normally slow in SharePoint but they can be in the www.westernaustralia.com environment (it's an ageing site with a lot of content and a lot of customisations); some pages in particular nearly grind to a halt in IE8 with the corresponding process consuming upwards of 1GB of memory the more I interact with the page.

Chrome "fixes" many of these slowdowns I'd previously attributed to the SharePoint environment and gives me all the Chrome goodness I've come to love over the last two years. It almost makes the SharePoint editing experience pleasurable!


Monday 13 September 2010

How to pass JSON arrays and other data types to an ASMX web service

Ah interoperability… great fun, great fun.

So jQuery is your new best friend and, along with JSON, there's nuthin' you can't do. The server side stuff is still there in the background and you've got some old school ASP.NET (.asmx) web services hanging around but DOM elements are otherwise flying all over the place, postbacks are just so passé, and even the marketing girls are mildly impressed at your skillz. You're branching out, shifting code and complexity from the server to the browser, and it's time to do some heavier data shunting. Here are a few things to know about passing JSON data to an ASMX web service that may help you on your way…

JSON.stringify

Know it, use it, love it. It's part of the JSON2 library and you need it if you don't have it already. Use it to prepare (aka properly encode) your JSON data before sending it off to the big mean ol' web server:

data: {"days": JSON.stringify(["Mon", "Tues"])}

That will be sent as &days=["Mon","Tues"] (URL-encoded on the wire)

Yeah, I know, it's another file to download but the guy who wrote JSON also wrote this and it can be merged and minified. I've tried writing my own mini-version as a function and while this works for simple strings, save yourself some time when it comes to arrays and the like and just use this sucker.

Arrays

Arrays seem trickier than they should be… maybe I'm just a dumb guy—probably. Anyway, you can pass a JSON array to an .asmx web service without much work at all.

The client-side call listed above is everything you need to do from that end. On the server side, create yourself a new web service method with a List<string> parameter:

[WebMethod]
[ScriptMethod (UseHttpGet = true, ResponseFormat = ResponseFormat.Json)]
public string ConsumeArray (List<string> days)
{…}

That's all there is to it. If you're not passing in strings, declare the List<> parameter with a type of object or something else. You can use .NET arrays in the web method signature as well if you really want (need) to.

Integers

When in doubt, stringify:

data: { "i": JSON.stringify(2) }

An int parameter on the web service end will handle this graciously.

Booleans

The good ol' boolean—a simple concept computer science has managed to bastardise like no other…

When in doubt, stringify:

data: { "b": jsonpEncode("true") }

Like the int parameter, a bool in your web method signature will take care of this.
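
For completeness, the matching server side for the int and bool examples—a sketch in the same style as ConsumeArray above (the method names are made up):

[WebMethod]
[ScriptMethod (UseHttpGet = true, ResponseFormat = ResponseFormat.Json)]
public string ConsumeInt (int i)
{
    return (i * 2).ToString ();
}

[WebMethod]
[ScriptMethod (UseHttpGet = true, ResponseFormat = ResponseFormat.Json)]
public string ConsumeBool (bool b)
{
    // if the result comes back through parseJSON, keep it lower case (see below)
    return b ? "true" : "false";
}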

A brief note: JSON, or rather jQuery's parseJSON function, is a particular beast and doesn't seem to know about anything other than the lower case true and false strings. If, for any reason, you ToString a bool in your .NET web service and try to pass it back, parseJSON will fail. If you forget to brush your teeth in the morning, parseJSON will fail.

Dates

Sorry, on my todo list ;-)

Tools

When working through this stuff, it pays to have Fiddler open to inspect the requests you're sending through and any error messages you're getting back. I find Fiddler sometimes breaks this stuff so try turning off the capture if you're getting weird errors; optionally, revert to Firebug (Firefox only, of course).

Fully decoding the data you sniff from a JSONP request passed along in the query string will require some additional tooling; in short, you'll want to decode the value using a free online tool like Opinionated Geek's URL decoder.


Wednesday 8 September 2010

Exposing the Global Assembly Cache

If you're familiar with the Global Assembly Cache (GAC) you're probably aware there's a special file system viewer thingy (technical term) sitting over the top of the GAC contents at c:\windows\assembly; this is a nice convenience when it comes to registering assemblies in the GAC—simply drag and drop, avoiding a trip to the command line and gacutil -i

More often than not, this is all good. When you need to dive into the real GAC, to extract an assembly, drop in debugging symbols, or whatever, you'll quickly realise the viewer is somewhat limiting.

To get past the GAC's outer facade, you've got a few options:

  • From a command line, browse to c:\windows\assembly\gac_msil
  • Map a network drive to \\machine-name\c$\windows\assembly\gac_msil
  • Create a virtual drive: subst z: c:\windows\assembly\gac_msil where 'z:' is any unmapped drive letter
  • Start –> Run c:\windows\assembly\gac_msil
  • Turn off the viewer altogether to browse the GAC directory structure normally within Windows Explorer: create a new DWORD named DisableCacheViewer with a value of 1 below the HKLM\Software\Microsoft\Fusion key

Five ways to do the same thing? Well this is Windows after all—and there are probably more!!! ;)

If you drop the gac_msil bit you'll find there are other directories that make up the GAC proper to explore but most of what you'll be after resides below gac_msil. Each assembly is represented by name as a folder, with different versions represented as subfolders named after the version number with the public key token appended; the assembly proper will live in one of these folders.
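
If you want to poke at that layout from code, here's a small sketch that lists the physical path of each installed version of an assembly under GAC_MSIL (the assembly name is an assumption):

using System;
using System.IO;

string gacMsil = Path.Combine (Environment.GetEnvironmentVariable ("windir"), @"assembly\GAC_MSIL");
string assemblyRoot = Path.Combine (gacMsil, "Microsoft.SharePoint");

foreach (string versionDir in Directory.GetDirectories (assemblyRoot))
{
    // e.g. ...\GAC_MSIL\Microsoft.SharePoint\12.0.0.0__71e9bce111e9429c\Microsoft.SharePoint.dll
    Console.WriteLine (Path.Combine (versionDir, "Microsoft.SharePoint.dll"));
}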


Changing a VS2010 SharePoint package name

The SharePoint development tooling built into Visual Studio 2010 does a pretty good job at hiding some of the ugly bits involved in creating a SharePoint solution file. No longer must we deal with ddf files (hooray!) but we also lose some control (or, at the very least, have to dig a bit further to change some things that were previously "easy" in the broader context of "really painful").

One of those things that we previously had full control over was the name of the solution file (.wsp) that results from our ddf file and the mighty makecab.exe. While the file name is now set from within Visual Studio, you have to open the package designer before being presented with the package properties:

[Screenshot: renaming the solution file in the VS2010 package designer]
Simply clicking on the Package.package file will present you with the file properties which is not what you want—you have to double-click/open the package designer.

The one thing I used to do with the ol' ddf file was specify the extension as .cab instead of .wsp. This was a convenience thing—SharePoint doesn't care about the extension (or at least doesn't object to .cab files)—but a .cab file will open in Windows Explorer on machines where you don't have a better archive tool installed like my favourite, 7zip. Essentially, it just saved me having to rename the file by hand.

Unfortunately I have yet to find a way to do this with VS2010; the package name is only that—Visual Studio appends ".wsp" to the end of whatever name you supply and I can't find a way to override that behaviour. In truth I'll probably just start using .wsp like everyone else but it may be possible to tweak the file name in one of the pre/post events.


Monday 6 September 2010

.NET 4.0 application compatibility

Looks like those of us running SharePoint may be stuck on .NET 3.5 SP1 for a while longer but that's no excuse not to have .NET 4.0 installed side-by-side for use by other apps. One of the guys on the team, for instance, recently updated the custom crawler we run and needed to pull in some of the Entity Framework features only made available in the 4.0 release; of course Microsoft will tell you the various .NET runtimes can be installed to run side-by-side and we managed to get away with a lot during the whole 2.0/3.0/3.5 onion thing.

Of course this all gets interesting very quickly when you've got a console application built against 4.0 that references assemblies (i.e. the SharePoint assemblies) built against version 2.0. Naturally everything worked fine in dev but as soon as we hit UAT, this exception cropped up:

An error has occured: Mixed mode assembly is built against version 'v2.0.50727' of the runtime and cannot be loaded in the 4.0 runtime without additional configuration information. [sic]

To address this, a simple .config element was added:

<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0" />
  </startup>
</configuration>

Notably, the useLegacyV2RuntimeActivationPolicy="true" attribute was required and we found this out the hard way by not having it in place initially (dodgy internet code most likely!).


Friday 3 September 2010

Detection of product '{90140000-104C-0000-1000-0000000FF1CE}', feature 'PeopleILM', component '{1C12B6E6-898C-4D58-9774-AAAFBDFE273C}' failed

After sorting out a problem with the FIM Service not starting automatically after a reboot and several resultant application event log errors, there was one more thing to clean up: running a user profile sync would spit a couple of MsiInstaller warnings about product detection failing (see below).

Various users in the forums suggested granting the Network Service account read access to the C:\Program Files\Microsoft Office Servers\14.0\Service directory and/or the C:\Program Files\Microsoft Office Servers\14.0\Sql directory; I initially found only the latter was required (this directory currently has Read & execute, List folder contents, and Read) but on reboot, the same warnings were logged again. Granting the same access to \Service had the same result and I've found starting a crawl produces these warnings but they go away with subsequent crawls. Reboot and they're back :-(

This change relates to the following warnings logged as a crawl is initialised:

Log Name:      Application
Source:        MsiInstaller
Date:          2/09/2010 3:40:24 PM
Event ID:      1004
Task Category: None
Level:         Warning
Keywords:      Classic
User:          NETWORK SERVICE
Computer:      dev-sps2010-01.dev.mediawhole.com
Description:
Detection of product '{90140000-104C-0000-1000-0000000FF1CE}', feature 'PeopleILM', component '{1C12B6E6-898C-4D58-9774-AAAFBDFE273C}' failed.  The resource 'C:\Program Files\Microsoft Office Servers\14.0\Service\Microsoft.ResourceManagement.Service.exe' does not exist.

Log Name:      Application
Source:        MsiInstaller
Date:          2/09/2010 3:40:24 PM
Event ID:      1001
Task Category: None
Level:         Warning
Keywords:      Classic
User:          NETWORK SERVICE
Computer:      dev-sps2010-01.dev.mediawhole.com
Description:
Detection of product '{90140000-104C-0000-1000-0000000FF1CE}', feature 'PeopleILM' failed during request for component '{9AE4D8E0-D3F6-47A8-8FAE-38496FE32FF5}'

Log Name:      Application
Source:        MsiInstaller
Date:          2/09/2010 3:40:24 PM
Event ID:      1015
Task Category: None
Level:         Warning
Keywords:      Classic
User:          NETWORK SERVICE
Computer:      dev-sps2010-01.dev.mediawhole.com
Description:
Failed to connect to server. Error: 0x80070005


Forefront Identity Manager Service fails to start after reboot

Update [28/09/2010]: Spence recently released a follow-up article to the Rational Guide… in which he discusses an additional change for those of us using SQL Server aliases. Check out the section entitled "Using a SQL Server Named Instance" and scoot down to the local DTC configuration steps. I haven't tried this yet myself but it sounds promising.

Update [27/10/2010]: I see Spence has updated the above-mentioned article to include a section about this problem which validates the solution presented here.

After following Spence Harbar's Rational Guide to implementing SharePoint Server 2010 User Profile Synchronization, I was able to not only get the UPS service started but I was also able to run a sync on my first attempt. I probably got lucky ;-)

The one small hiccup I had along the way was getting the Forefront Identity Manager Service to start following a reboot; the service simply refused to start automatically despite being configured by SharePoint/FIM to do so. Interestingly, both the User Profile Service and the User Profile Synchronization Service items on Central Admin's Services on Server page showed as running. Starting the FIM Service manually from the Windows Services snapin succeeded (I didn't try directly through CA) but felt hacky and annoying.

What to do? Since the Synchronization Service was starting successfully and I could manually start the service after logging in, I assume this has to be some kind of dependency issue between the services themselves or SQL Server (some of the event log error messages listed below definitely take issue with SQL).

Update 29/09/2010: After examining the sequence of event log entries relating to MSSQLSERVER and FIM, I can clearly see SQL is NOT ready to accept client connections by the time the FIM services kick in. I should point out my test environment is running as a single-server farm (AD, SQL, IIS, SharePoint, etc) so I'd definitely pay attention to Spence's follow-up article I note above in the 28/09 update.

My solution was therefore to set both services to start automatically at boot time after a delay by reconfiguring the startup type of BOTH services to Automatic (Delayed Start) in the Windows Services snapin:

[Screenshot: FIM services set to Automatic (Delayed Start)]

Interestingly, I found the FIM Service starts before the FIM Sync Service, fwiw. I also still have one error remaining stating The Forefront Identity Manager Service cannot connect to the SQL Database Server but it doesn't prevent the services from starting or a sync from running.

So is this an inappropriate change to make? I can't say, especially with everyone and their dog saying "let SharePoint manage these services, don't start 'em manually!" In a single-server environment, I'll suggest it is acceptable. I know for certain both services now start automatically after a minute or so (once all other services set to just Automatic have started) and I can still run a profile sync; the following errors are also no longer present:

Log Name:      Application
Source:        Forefront Identity Manager
Date:          3/09/2010 12:37:17 PM
Event ID:      3
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Computer:      dev-sps2010-01.dev.mediawhole.com
Description:
.Net SqlClient Data Provider: System.Data.SqlClient.SqlException: Cannot open database "Sync DB" requested by the login. The login failed.
Login failed for user 'DEV\SVC_SPFARM'.
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
   at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
   at System.Data.SqlClient.SqlInternalConnectionTds.CompleteLogin(Boolean enlistOK)
   at System.Data.SqlClient.SqlInternalConnectionTds.AttemptOneLogin(ServerInfo serverInfo, String newPassword, Boolean ignoreSniOpenTimeout, Int64 timerExpire, SqlConnection owningObject)
   at System.Data.SqlClient.SqlInternalConnectionTds.LoginNoFailover(String host, String newPassword, Boolean redirectedUserInstance, SqlConnection owningObject, SqlConnectionString connectionOptions, Int64 timerStart)
   at System.Data.SqlClient.SqlInternalConnectionTds.OpenLoginEnlist(SqlConnection owningObject, SqlConnectionString connectionOptions, String newPassword, Boolean redirectedUserInstance)
   at System.Data.SqlClient.SqlInternalConnectionTds..ctor(DbConnectionPoolIdentity identity, SqlConnectionString connectionOptions, Object providerInfo, String newPassword, SqlConnection owningObject, Boolean redirectedUserInstance)
   at System.Data.SqlClient.SqlConnectionFactory.CreateConnection(DbConnectionOptions options, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningConnection)
   at System.Data.ProviderBase.DbConnectionFactory.CreatePooledConnection(DbConnection owningConnection, DbConnectionPool pool, DbConnectionOptions options)
   at System.Data.ProviderBase.DbConnectionPool.CreateObject(DbConnection owningObject)
   at System.Data.ProviderBase.DbConnectionPool.UserCreateRequest(DbConnection owningObject)
   at System.Data.ProviderBase.DbConnectionPool.GetConnection(DbConnection owningObject)
   at System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection)
   at System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory)
   at System.Data.SqlClient.SqlConnection.Open()
   at Microsoft.ResourceManagement.Data.DatabaseConnection.Open(SqlConnection connection)

Log Name:      Application
Source:        Forefront Identity Manager
Date:          3/09/2010 12:37:17 PM
Event ID:      3
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Computer:      dev-sps2010-01.dev.mediawhole.com
Description:
.Net SqlClient Data Provider: System.Data.SqlClient.SqlException: Cannot open database "Sync DB" requested by the login. The login failed.
Login failed for user 'DEV\SVC_SPFARM'.
   at Microsoft.ResourceManagement.Data.Exception.DataAccessExceptionManager.ThrowException(SqlException innerException)
   at Microsoft.ResourceManagement.Data.DatabaseConnection.Open(SqlConnection connection)
   at Microsoft.ResourceManagement.Data.DatabaseConnection.Open(DataStore store)

Log Name:      Application
Source:        Microsoft.ResourceManagement.ServiceHealthSource
Date:          3/09/2010 12:37:17 PM
Event ID:      26
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Computer:      dev-sps2010-01.dev.mediawhole.com
Description:
The Forefront Identity Manager Service was not able to initialize a timer necessary for supporting the execution of workflows.

Upon startup, the Forefront Identity Manager Service must initialize and set a timer to support workflow execution.  If this timer fails to get created, workflows will not run successfully and there is no recovery other than to stop and start the Forefront Identity Manager Service.

Restart the Forefront Identity Manager Service.

Log Name:      Application
Source:        Microsoft.ResourceManagement.ServiceHealthSource
Date:          3/09/2010 12:37:17 PM
Event ID:      2
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Computer:      dev-sps2010-01.dev.mediawhole.com
Description:
The Forefront Identity Manager Service could not bind to its endpoints.  This failure prevents clients from communicating with the Web services.

A most likely cause for the failure is another service, possibly another instance of Forefront Identity Manager Service, has already bound to the endpoint.  Another, less likely cause, is that the account under which the service runs does not have permission to bind to endpoints.

Ensure that no other processes have bound to that endpoint and that the service account has permission to bind endpoints.  Further, check the application configuration file to ensure the Forefront Identity Manager Service is binding to the correct endpoints.

Log Name:      Application
Source:        Forefront Identity Manager
Date:          3/09/2010 12:37:17 PM
Event ID:      3
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Computer:      dev-sps2010-01.dev.mediawhole.com
Description:
.Net SqlClient Data Provider: System.Data.SqlClient.SqlException: Cannot open database "Sync DB" requested by the login. The login failed.
Login failed for user 'DEV\SVC_SPFARM'.
   at Microsoft.ResourceManagement.Data.Exception.DataAccessExceptionManager.ThrowException(SqlException innerException)
   at Microsoft.ResourceManagement.Data.DatabaseConnection.Open(SqlConnection connection)
   at Microsoft.ResourceManagement.Data.DatabaseConnection.Open(DataStore store)
   at Microsoft.ResourceManagement.Data.TransactionAndConnectionScope..ctor(Boolean createTransaction, IsolationLevel isolationLevel, DataStore dataStore)
   at Microsoft.ResourceManagement.Data.TransactionAndConnectionScope..ctor(Boolean createTransaction)
   at Microsoft.ResourceManagement.Data.DataAccess.RegisterService(String hostName)
   at Microsoft.ResourceManagement.Workflow.Hosting.HostActivator.RegisterService(String hostName)
   at Microsoft.ResourceManagement.Workflow.Hosting.HostActivator.Initialize()
   at Microsoft.ResourceManagement.WebServices.ResourceManagementServiceHostFactory.CreateServiceHost(String constructorString, Uri[] baseAddresses)
   at Microsoft.ResourceManagement.WindowsHostService.OnStart(String[] args)

Log Name:      Application
Source:        Microsoft Resource Management Service
Date:          3/09/2010 12:37:17 PM
Event ID:      0
Task Category: None
Level:         Error
Keywords:      Classic
User:          N/A
Computer:      dev-sps2010-01.dev.mediawhole.com
Description:
Service cannot be started. System.Data.SqlClient.SqlException: Cannot open database "Sync DB" requested by the login. The login failed.
Login failed for user 'DEV\SVC_SPFARM'.
   at Microsoft.ResourceManagement.WindowsHostService.OnStart(String[] args)
   at System.ServiceProcess.ServiceBase.ServiceQueuedMainCallback(Object state)


Troubleshooting user profile sync via FIM

Your initial foray into SharePoint 2010 user profile sync will likely lead you to the FIM client and, if you're anything like me, your mind will boggle at what FIM is, why it has to be involved at all, and where to start when things go horribly wrong.

I won't attempt to enlighten you on the first two subjects but I do want to point out some interesting and non-intuitive FIM user interface screens you may not be aware of and that will help you determine if your UPS setup is on the right path. By the way, you can run the FIM client as soon as the two FIM services are running on your machine (in other words, as soon as UPS has been provisioned but before you run a sync).

If you've got the UPS service in a running state, the next thing you'll likely want to do is run your first (or 50th) sync; in addition to the dodgy status screen within Central Admin itself, you can fire up the FIM client to watch from the bushes (the Operations view) as SharePoint, FIM, SQL Server, and AD do their magic dance. A successful run includes ten operations in my dev environment and I've previously posted a screen shot of this if you're interested.

If you look carefully, the operations view will reveal the user name involved with each operation and list some partition info as well. To dive in deeper, click the Management Agents button in the top menu; in my case, I'm presented with three MAs (if you've got more because you've been struggling with connections, you may be in trouble):

  • The first MA named ILMMA connects to the database I specified when setting up UPS ("Sync DB")
  • The second MA named MOSS-{GUID} connects to the ProfileImportExportService web service
  • The final MA named MOSSAD-{name of my connection as configured in CA} connects to Active Directory

[Screenshot: FIM Management Agents]

By viewing the properties for each MA (right-click on an MA and select Properties from the context menu or use the Actions pane to the right of the window) I can also examine specific properties to determine exactly what domain name FIM is configured to use and the accounts used to interact with AD, SQL Server, and the web service:

[Screenshots: ILMMA properties; MOSSAD connection properties; MOSSAD directory partitions]

The attentive reader will note there's a lot of farm account action going on here and that's because both FIM services are configured to log on as the farm account and my understanding is they have to be because of the way the relevant timer job(s), which are also run as the farm account, interact with these services (says Spence). I'll also point out my svc_spups account is the account to which I've granted Replicating Directory Changes in AD.


Thursday 2 September 2010

Testing database connectivity

Developer that I am, I've gone so far as to write throwaway console apps to test connectivity to a database server as a different user; hopefully I'll never have to do that again after learning about this really cool trick:

  • Create a new text file and change the extension to .udl
  • Double-click it
  • Select a provider, configure the options to connect and hit Test Connection (note the Use a specific user name and password option means SQL authentication; entering an AD account here will fail).
  • Start a console window as another user and execute the .udl file to connect as someone else using integrated security if necessary
  • Having ruled a security issue in or out, fix the problem!

[Screenshot: the UDL database connection dialog]

A massive shout out to Todd Klindt for sharing this in one of his recent netcasts!!!

Ps. "OK" through to persist the connection details in the form of a connection string to your text file. Kinda helpful!


miisclient.exe success output

Partly for the glory but more for posterity (my own future reference!), I've screen captured what a successful SharePoint 2010 AD user profile synchronisation looks like in the FIM client:

[Screenshot: miisclient synchronization success]

Will this gain me entry to a special club or something? If so, thanks Spence! Now to work through all the event log errors and warnings… perhaps I ought to call it a day!


Wednesday 1 September 2010

Where to find the FIM Client (miisclient.exe)

Note to self and anyone else who cares… you'll find the FIM client (aka miisclient) at the following location:

C:\Program Files\Microsoft Office Servers\14.0\Synchronization Service\UIShell\miisclient.exe

Use it to debug SharePoint 2010 user profile synchronisation.

Sunday 29 August 2010

Moving files in Visual Studio and the SharePointProjectItem.spdata file

While mucking around with the CKS: Development Tools Edition Visual Studio 2010 extension for SharePoint, I inadvertently created a new master page from the Starter Master Page (CKSDev) template in an existing VS module:

[Screenshot: the CKSDev master page created inside an existing module]

Neither VS nor SharePoint seemed to object to this arrangement—likely because the module in question was otherwise empty—but, as an experiment, I decided to move the CKS:Dev master page SPI into the project (and the top-level MasterPages module) nonetheless, making the nested CKS:Dev container obsolete.

This wasn't a good idea because subsequent attempts to build the package failed with the error "Could not find the file 'c:\…PageLayouts\StarterMasterPage1.master'". Omg, what to do and why is it trying to find these files in the project root?! I briefly pined for the good ol' .ddf file from 2007 days and then slapped some sense back into myself.

To begin fixing the problem, I first had to click the Show All Files button in the Solution Explorer window to reveal the SharePointProjectItem.spdata file in the root-level module (note the CKS:Dev inner master page module has its own .spdata file, which is empty after the move):

[Screenshot: SharePointProjectItem.spdata revealed in Solution Explorer]

This is an XML file and its structure is quite basic; I'd describe it as a mini project file for the module in question as it lists the source files in the module along with their targets and types.

Interestingly, the Source values specified all started with "..\" but inspecting a similar file from an unmodified module simply listed each file name; I assume Visual Studio helped out during the move to add this additional path information.

Removing the "..\" prefix fixed the problem and suggested to me the SharePoingProjectItem.spdata file may prove fragile during project refactoring. Sure enough, moving the files back to their original home added a spurious MasterPages\ prefix to each file and wreaked havoc all over again.

Moral of the story: manually edit the SharePointProjectItem.spdata file as required but beware Visual Studio may "fix" any of your changes.

One other thing to note: when I moved my files around, Visual Studio reset the properties on some of them. Most notably, the Elements.xml file reverted to a Deployment Type of ElementFile instead of ElementManifest and the Build Action was set to Content. Modifying these properties also added the parent directory prefixes back to my .spdata file. Perhaps removing files from one location and adding them back through the Add –> Existing Item… dialog is a safer way to approach this long-term if VS is going to continue meddling with things.

And just to add to the joy, one final bit of fun: a page layout added to the module in question was deploying fine; for the sake of producing the screenshot above, I excluded it from the solution, which also removed it from the files listed below the MasterPages module in the visual feature designer. While I assumed the feature designer would automatically update itself when I added the layout back to the module, I was wrong: the designer refused to update to reflect the inclusion of the page layout until I restarted Visual Studio! If this is a VS 2010 bug, it'll be one of several I've already filed with MS! Viva SP1!!!


Thursday 19 August 2010

Setting the Visible property on a webpart throws

Using myWebPart.Visible = false and getting this?

The Visible property cannot be set on Web Part 'your_web_part'. It can only be set on a standalone Web Part.

Use myWebPart.Hidden = true instead.


Wednesday 18 August 2010

Must Have Windows Server 2008 R2 Hyper-V Hotfixes

Now there's a damning blog post title!

In brief, I've had a few issues since installing Windows Server 2008 R2 and adding the Hyper-V Role on my Dell Precision M6500 Core i7 laptop (if you've landed on this post and you don't care about laptops--i.e. you're an admin--don't stop reading now as this post will likely apply to you too). This post is meant to be a running log of problems and their resolutions while we await the next service pack, I suppose.

Problem #1: Blue Screen of Death/random reboots

Resolution:

KB975530: Stop error message on a Windows Server 2008 R2-based computer that has the Hyper-V role installed and that uses one or more Intel CPUs that are code-named Nehalem: "0x00000101 - CLOCK_WATCHDOG_TIMEOUT"

Symptoms, event log entries, and whatnot: Unexpected freezes, BSODs and reboots during periods of high network activity.

  • Event ID 219, The driver \Driver\WUDFRd failed to load for the device USB\VID_0A5C&PID_5800&MI_01\7&66de6c9&0&0001.
  • Event ID 41, The system has rebooted without cleanly shutting down first. This error could be caused if the system stopped responding, crashed, or lost power unexpectedly.
  • Event ID 4, Broadcom NetXtreme 57xx Gigabit Controller: The network link is down.  Check to make sure the network cable is properly connected.
  • Event ID 1001, The computer has rebooted from a bugcheck.  The bugcheck was: 0x00000101 (0x0000000000000019, 0x0000000000000000, 0xfffff880020ce180, 0x0000000000000003). A dump was saved in: C:\Windows\MEMORY.DMP. Report Id: 043010-31168-01.

Apparently this issue is caused by an Intel erratum affecting Nehalem-based processors (i.e. Xeon 5500, Core i7-800, and Core i5-700 series).

Problem #2: Guest VMs freeze, lost connection

Resolution:

None of this worked for me but changing my host power settings from Balanced to High performance did. In theory you should be able to revert to Balanced after installing the above.

You may also want to investigate turning off TCP offloading.

Symptoms, event log entries, and whatnot: Hyper-V console freezes in some virtual machines but not others (the only one affected in my case was an XP VM upgraded from Virtual PC 2007 SP1), Hyper-V manager reports the Heartbeat as "Lost connection".

  • Event ID 5: The miniport 'Microsoft Virtual Machine Bus Network Adapter #3' hung.

...followed by...

  • Event ID 4: The miniport 'Microsoft Virtual Machine Bus Network Adapter #3' reset.

For additional information about this problem, definitely check out http://social.technet.microsoft.com/Forums/en-US/windowsserver2008r2virtualization/thread/0408a28d-6ab8-4c85-8773-4bc42c2df40b

More to come? Hopefully not! ;-)


Thursday 12 August 2010

Configuring the People Picker and No exact match was found

SharePoint's people picker is generally one of those things that just kind of works—and fairly well no less. There's a gotcha to that statement however: the people picker works well when the SharePoint farm exists in the same domain as the users you want to match against, or exists in a second domain where a two-way trust has been established with the first domain.

[Screenshot: the SharePoint people picker]

If your SharePoint web frontends exist in another domain with a one-way trust, you've got some extra work to do. This is the case with our dev setup and DMZ setup: our dev VMs are joined to the dev domain which trusts the corporate domain and our production servers are similarly joined to a web domain that also trusts the corporate domain. Both trusts are one-way.

I'll simplify what needs to be done by stating you simply need to tell SharePoint which forests or domains house the users you're after and provide an account from that domain.

Firstly you'll need to provide a key that will be used to encrypt any passwords you plug in during the next step. Run this on every WFE where "key" is a string of your choosing:

STSADM.exe -o setapppassword -password key

Next, set the peoplepicker-searchadforests property:

STSADM.exe -o setproperty -propertyname peoplepicker-searchadforests -propertyvalue <Valid list of forests or domains> -url <URL of the Web application>

where <Valid list of forests or domains> might look like this:

"domain:mydomain.com,mydomain\myuser,mypassword"

Supply the URL of the web application you want to configure (and note you don't need to set this for Central Admin, set it for a specific web application). Multiple domains and forests can be listed if necessary.


Tuesday 10 August 2010

Data Compare crashes Visual Studio 2010

Although I'd previously noticed the Data Compare and Schema Compare options on the Data menu in Visual Studio, until today I didn't have a good reason to see what they can do.

Note I'm running VS Ultimate; if you can't see it, I'm not certain in which editions of the product this menu makes an appearance. SQL Server Management Studio 2008 doesn't seem to have a comparable tool that I'm aware of—which seems odd—but please comment if you know something I don't.

So here's the situation: two copies of the same database, one that's been moved to a different server and both of which have data changes. Your mission: find and fix the differences.

Open VS, Data –> Data Compare –> New Data Comparison…

CRASH!

Luckily I was able to work around this problem (feature?) by re-opening VS, Data –> Transact-SQL Editor –> New Query Connection, connecting to any ol' database (or optionally just cancelling out of the connection window) and then launching the compare wizard again.

I'm not sure what this does since you specify a database connection in the compare wizard anyway and cancelling the connection window or closing the new query window after opening the Transact-SQL Editor connection makes the problem go away. I've logged a bug with Microsoft connect.

Update 6 Sept 2010: MSFTConnect just got in touch to tell me this is likely related to the Solution Navigator extension in the Productivity Power Tools. Disabling it apparently works around the crash and a proper fix should ship this month.


Monday 2 August 2010

Enable Fusion Assembly Binding Logging

The WRN: Assembly binding logging is turned off message is annoying but assembly binding logging can be extremely helpful when you need to know how .NET is (or isn't) locating the assemblies you've referenced. The message is annoying because, if you're like me, you never have logging enabled and the registry key cited is a bit unusual:

To enable assembly bind failure logging, set the registry value [HKLM\Software\Microsoft\Fusion!EnableLog] (DWORD) to 1.

What does the exclamation mark mean? Presumably EnableLog is a DWORD in the Fusion key but it's all a bit unclear for my liking. Once you've set this DWORD, you need to figure out what to do next and for many people that will likely involve firing up the Fusion Log Viewer (fuslogvw.exe) from the Windows/.NET SDK. But then what?

Gary Kindel was kind enough to post the following details in response to a related Stack Overflow question:

Add the following values to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Fusion

DWORD ForceLog set value to 1
DWORD LogFailures set value to 1
DWORD LogResourceBinds set value to 1
String LogPath set value to folder for logs ie) C:\FusionLog\

Make sure you include the backslash after the folder name. I also found an IISRESET was necessary in a web context.

Since I wanted to enable this logging in an environment without Visual Studio or the Windows SDK installed, the above option was clean and lightweight. Log files were dumped to the expected location in .html format and it was then a case of locating the assembly I was interested in… and, oh yeah, fixing the problem ;-) The issue was also detailed in the ASP.NET error message returned by IIS.

Advanced geeks: because Fusion logging supposedly affects performance, you might want to create a reg file/batch script to toggle logging on and off. Here's a starting point—put this into a file called EnableFusionLogging.reg and double-click to run it:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Fusion]
"ForceLog"=dword:00000001
"LogFailures"=dword:00000001
"LogResourceBinds"=dword:00000001
"LogPath"="C:\\Temp\\FusionLog\\"

I actually call this from a batch script which also resets IIS and I have a mirror reg file that disables logging.


Sunday 1 August 2010

Windows Licensing Details

Windows will report licensing details if you run the following from an elevated console window:

slmgr.vbs /dlv

The Windows Script Host popup will shortly tell you which edition of Windows you're running, which licensing channel you fall under (retail, OEM, volume), your activation ID, license status, rearm counts, and KMS details.
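
If you only want the short version, slmgr.vbs /dli should give you an abbreviated summary (name, description, partial product key and license status).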

Wednesday 28 July 2010

Visual Studio 2010 Remote Debugger Location

You'll find the VS 2010 remote debugger in a similar location to the 2005/2008 debuggers:

C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\Remote Debugger

Note VS 2010 is still a 32-bit app, so if you're running a 64-bit OS you'll need to look in the Program Files (x86) directory. The debugger itself comes in both x86 and x64 flavours.

To install, I normally just copy the relevant architecture folder to the remote server. Run msvsmon.exe and, assuming you're otherwise set up for remote debugging, it should be the same as previous versions of Visual Studio!
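
For example, to push the x64 flavour out to a remote box, something along these lines works (the server name and target folder are placeholders):

robocopy "C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\Remote Debugger\x64" \\devserver\c$\RemoteDebugger\x64 /E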

Ps. You can also download the remote debugger from Microsoft.
Pps. I normally add msvsmon to the Windows Startup folder so it starts whenever I boot my remote dev environment (I've found in the past that I had to log on to the server, however, and two users can't run the monitor simultaneously on the same server). Apparently the 2010 monitor can be run as a service—not sure if this was the case for previous versions.

Friday 23 July 2010

PublishingPageLayout doesn't point to current web application

In the past, we've developed a habit of "birthing" new SharePoint sites in a UAT environment for initial setup and stabilisation before the big move to the production farm; during major site renovations—like the recent www.westernaustralia.com rebrand—we'll implement a quasi-freeze on production (emergency edits only) and conversely pull the content down to UAT for overhaul. At the end of the day, we're moving SharePoint content between farms on a fairly regular basis.

Prior to the release of the April 2009 Cumulative Update packages, Microsoft didn't support moving content databases between farms because "MOSS often stores absolute URLs to the Page Layout in the properties of a Publishing Page." With the April CU that changed; but of course the times they are a-changin' too, and we've abandoned the stsadm backup/restore commands in favour of restoring SQL Server backups and running the stsadm deletecontentdb/addcontentdb commands (it's not only more reliable but many times faster…).
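
For reference, after restoring the SQL Server backup in the destination farm, the content database swap looks roughly like this (the URL, database name and server below are placeholders):

stsadm -o deletecontentdb -url http://your-web-app -databasename WSS_Content_Site
stsadm -o addcontentdb -url http://your-web-app -databasename WSS_Content_Site -databaseserver YourSqlServer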

Until yesterday none of the above was causing us any problems. In fact, we'd even become complacent about it all.

So when one of our content editors reported being unable to publish a page due to a previously unseen exception, reviewing the way we handle content databases wasn't the first thing that sprang to mind. After checking out some pages for editing, an attempt to publish the page was met with the following exception:

ArgumentException 
Value does not fall within the expected range.

Stack trace:
   at Microsoft.SharePoint.Library.SPRequestInternalClass.GetMetadataForUrl(String bstrUrl, Int32 METADATAFLAGS, Guid& pgListId, Int32& plItemId, Int32& plType, Object& pvarFileOrFolder)
   at Microsoft.SharePoint.Library.SPRequest.GetMetadataForUrl(String bstrUrl, Int32 METADATAFLAGS, Guid& pgListId, Int32& plItemId, Int32& plType, Object& pvarFileOrFolder)
   at Microsoft.SharePoint.SPWeb.GetMetadataForUrl(String relUrl, Int32 mondoProcHint, Guid& listId, Int32& itemId, Int32& typeOfObject, Object& fileOrFolder)
   at Microsoft.SharePoint.SPWeb.GetFileOrFolderObject(String strUrl)
   at Microsoft.SharePoint.Publishing.CommonUtilities.GetFileFromUrl(String url, SPWeb web)
   at Microsoft.SharePoint.Publishing.PublishingPage.get_Layout()
   at Microsoft.SharePoint.Publishing.PublishingPage.GetEffectivePageCacheProfileId(Boolean anonUserProfile)
   at Microsoft.SharePoint.Publishing.PublishingPage.GetEffectiveAnonymousPageCacheProfileId()
   at Microsoft.SharePoint.Publishing.CachedPage..ctor(PublishingPage page, SPListItem altItem, String id, String parentId, String title, String url, String description, CachedObjectFactory factory, List`1& fieldInfo, Boolean datesInUtc)
   at Microsoft.SharePoint.Publishing.CachedPage.CreateCachedPage(PublishingPage page, SPListItem altItem, CachedObjectFactory factory, List`1& fieldInfo, Boolean datesInUtc)
   at Microsoft.SharePoint.Publishing.CachedPage.CreateCachedPage(PublishingPage page, CachedObjectFactory factory, List`1& fieldInfo, Boolean datesInUtc)
   at Microsoft.SharePoint.Publishing.CachedListItem.CreateCachedListItem(SPListItem item, SPListItem alternateItem, Boolean parentIsWeb, CachedObjectFactory factory, List`1& fieldInfo, Boolean datesInUtc)
   at Microsoft.SharePoint.Publishing.CachedObjectFactory.CreateObject(SPListItem listItem, List`1& fieldInfo, Boolean datesInUtc)
   at Microsoft.SharePoint.Publishing.CachedObjectFactory.CreateObject(PublishingPage page, Boolean datesInUtc)
   at Microsoft.SharePoint.Publishing.CachedObjectFactory.GetPageForCurrentItem()
   at Microsoft.SharePoint.Publishing.TemplateRedirectionPage.ComputeRedirectionVirtualPath(TemplateRedirectionPage basePage)
   at Microsoft.SharePoint.Publishing.Internal.CmsVirtualPathProvider.CombineVirtualPaths(String basePath, String relativePath)
   at System.Web.Hosting.VirtualPathProvider.CombineVirtualPaths(VirtualPath basePath, VirtualPath relativePath)
   at System.Web.UI.DependencyParser.AddDependency(VirtualPath virtualPath)
   at System.Web.UI.DependencyParser.ProcessDirective(String directiveName, IDictionary directive)
   at System.Web.UI.PageDependencyParser.ProcessDirective(String directiveName, IDictionary directive)
   at System.Web.UI.DependencyParser.ParseString(String text)
   at System.Web.UI.DependencyParser.ParseFile(String physicalPath, VirtualPath virtualPath)
   at System.Web.UI.DependencyParser.GetVirtualPathDependencies()
   at Microsoft.SharePoint.ApplicationRuntime.SPVirtualFile.CalculateFileDependencies(HttpContext context, SPRequestModuleData basicRequestData, ICollection& directDependencies, ICollection& childDependencies)
   at Microsoft.SharePoint.ApplicationRuntime.SPDatabaseFile.EnsureDependencies(HttpContext context, SPRequestModuleData requestData)
   at Microsoft.SharePoint.ApplicationRuntime.SPDatabaseFile.EnsureCacheKeyAndViewStateHash(HttpContext context, SPRequestModuleData requestData)
   at Microsoft.SharePoint.ApplicationRuntime.SPDatabaseFile.GetVirtualPathProviderCacheKey(HttpContext context, SPRequestModuleData requestData)
   at Microsoft.SharePoint.ApplicationRuntime.SPVirtualFile.GetVirtualPathProviderCacheKey(String virtualPath)
   at Microsoft.SharePoint.ApplicationRuntime.SPVirtualPathProvider.GetCacheKey(String virtualPath)
   at Microsoft.SharePoint.Publishing.Internal.CmsVirtualPathProvider.GetCacheKey(String virtualPath)
   at System.Web.Compilation.BuildManager.GetCacheKeyFromVirtualPath(VirtualPath virtualPath, Boolean& keyFromVPP)
   at System.Web.Compilation.BuildManager.GetVPathBuildResultInternal(VirtualPath virtualPath, Boolean noBuild, Boolean allowCrossApp, Boolean allowBuildInPrecompile)
   at System.Web.Compilation.BuildManager.GetVPathBuildResultWithNoAssert(HttpContext context, VirtualPath virtualPath, Boolean noBuild, Boolean allowCrossApp, Boolean allowBuildInPrecompile)
   at System.Web.Compilation.BuildManager.GetVirtualPathObjectFactory(VirtualPath virtualPath, HttpContext context, Boolean allowCrossApp, Boolean noAssert)
   at System.Web.Compilation.BuildManager.CreateInstanceFromVirtualPath(VirtualPath virtualPath, Type requiredBaseType, HttpContext context, Boolean allowCrossApp, Boolean noAssert)
   at System.Web.UI.PageHandlerFactory.GetHandlerHelper(HttpContext context, String requestType, VirtualPath virtualPath, String physicalPath)
   at System.Web.HttpApplication.MapHttpHandler(HttpContext context, String requestType, VirtualPath path, String pathTranslated, Boolean useAppConfig)
   at System.Web.HttpApplication.MapHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()
   at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)

Which of course means nothing.

Before long the guys stumbled across one of Stefan Goßner's brilliant posts, which suggests MOSS doesn't update all absolute page layout URLs on each publishing page object when content is shifted between farms; of course, with the April 2009 CU installed and doing things via backup/restore or import/export, we may have avoided this issue.

You can see the exact problem by examining a page in SharePoint Manager: locate a page in the Pages library, expand the page node to reveal its Properties node, and expand the Properties node so you can inspect the PublishingPageLayout node. This screenshot is from my development environment, a server named dev-moss-mh5 and the wa.com web application configured to run on port 180—so obviously the http://edit.uat.westernaustralia.com value is incorrect:

SPM 2007 Publishing Page Properties

Interestingly, this problem was only affecting the production edit site… for whatever reason, my dev environment was unaffected. However, running Stefan's FixPageLayout code corrected the problem in production. The application runs through every page in every subsite and fixes the PublishingPageLayout property where required (in addition to reporting any other page instance/layout/master page issues). 
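
Stefan's tool is the way to go here, but as a very rough sketch (and definitely not his actual code), a fix-up along these lines is possible with the publishing API. The site URL and old/new prefixes below are placeholders, and real code would also need to deal with check-out, publishing and approval:

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Publishing;

class FixPageLayoutUrls
{
    static void Main()
    {
        // Placeholder values for illustration only
        const string siteUrl = "http://dev-moss-mh5:180";
        const string oldPrefix = "http://edit.uat.westernaustralia.com";
        const string newPrefix = "http://dev-moss-mh5:180";

        using (SPSite site = new SPSite(siteUrl))
        {
            foreach (SPWeb web in site.AllWebs)
            {
                using (web)
                {
                    if (!PublishingWeb.IsPublishingWeb(web))
                        continue;

                    PublishingWeb pubWeb = PublishingWeb.GetPublishingWeb(web);
                    foreach (PublishingPage page in pubWeb.GetPublishingPages())
                    {
                        // PublishingPageLayout is a URL field stored as "url, description"
                        string rawValue = page.ListItem[FieldId.PageLayout] as string;
                        if (string.IsNullOrEmpty(rawValue))
                            continue;

                        SPFieldUrlValue layout = new SPFieldUrlValue(rawValue);
                        if (layout.Url.StartsWith(oldPrefix, StringComparison.OrdinalIgnoreCase))
                        {
                            layout.Url = newPrefix + layout.Url.Substring(oldPrefix.Length);
                            page.ListItem[FieldId.PageLayout] = layout;

                            // SystemUpdate avoids creating a new version for every page touched
                            page.ListItem.SystemUpdate(false);
                        }
                    }
                }
            }
        }
    }
}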

Update: Just remembered that, from a while back, Gary Lapointe also has an stsadm command to fix this problem:

http://stsadm.blogspot.com/2007/08/fix-publishing-pages-page-layout-url.html