Wednesday, 23 June 2010

Use BackConnectionHostNames instead of DisableLoopbackCheck in production

If you're running SharePoint, you may have come across advice to configure the DisableLoopbackCheck registry key if you're running Windows 2003 SP1 and above and/or .NET 3.5 SP1.

Adding the DWORD will certainly keep you up and running and avoid search/crawl errors like the one below, but despite popular belief it's not the way to do things in a production environment.

Access is denied. Verify that either the Default Content Access Account has access to this repository, or add a crawl rule to crawl this repository. If the repository being crawled is a SharePoint repository, verify that the account you are using has "Full Read" permissions on the SharePoint Web Application being crawled. (The item was deleted because it was either not found or the crawler was denied access to it.)

What you should be doing in production is configuring specific sites by name using the BackConnectionHostNames multi-string value below HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0. The Microsoft KB article isn't clear about the format this value should take, but I've found adding each site without the scheme on a new line works.

Here's an example:
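As a sketch, assuming two locally-hosted sites named intranet.contoso.com and portal.contoso.com (placeholders; substitute your own host names, sans scheme), the value can be populated from an elevated command prompt. Note that reg add separates REG_MULTI_SZ entries with \0:

```shell
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0" /v BackConnectionHostNames /t REG_MULTI_SZ /d "intranet.contoso.com\0portal.contoso.com"
```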

Bob Fox additionally suggests adding a new DWORD named DisableStrictNameChecking with a value of 1 below HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters and rebooting, to avoid having to reboot every time a new site is configured. I got away without rebooting at all by simply restarting the IISAdmin service.
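If you want to script Bob's tweak as well, something along these lines should do it from an elevated prompt (iisreset here stands in for restarting the IIS services, including IISAdmin, by hand):

```shell
reg add "HKLM\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters" /v DisableStrictNameChecking /t REG_DWORD /d 1

iisreset
```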

Saturday, 19 June 2010

Don't use XmlRoot with .webpart files

Just noticed using an [XmlRoot("MyRoot")] attribute on my webpart class in conjunction with a .webpart file (as opposed to a .dwp file) gives me this error when attempting to add the webpart to a page:

Incompatible Web Part markup detected. Use *.dwp Web Part XML instead of *.webpart Web Part XML.


Well I don't want to use *.dwp Web Part XML so I removed the XmlRoot attribute instead. I'm feeling wild, you see.

Beware the old doco: 

How to add a web project item to a class library

This is an old trick but I've found myself hunting for it repeatedly as of late…

If your current Visual Studio project doesn't provide the option to add a web user control, web page, etc. (perhaps because the project is a class library or a WSPBuilder project), you can easily tweak the project file to enable support for these templates.

To do so:

  • Unload the project (right-click the project in Solution Explorer and select Unload Project from the context menu)
  • Edit the .csproj file (right-click the project in Solution Explorer and select Edit MyProj.csproj from the context menu)
  • Locate the ProjectTypeGuids element and add a magic project type GUID to the beginning of the list: {349c5851-65df-11da-9384-00065b846f21};
  • Save your changes and reload the project (right-click and Reload Project)

When you're done editing the .csproj file it should look like this:
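Something along these lines, at any rate; the first GUID is the magic web project type from the step above and the second is the standard C# project type GUID (your element may already list other GUIDs after it):

```xml
<ProjectTypeGuids>{349c5851-65df-11da-9384-00065b846f21};{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}</ProjectTypeGuids>
```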


Now you can add user controls and whatnot to your heart's content!

Update Feb 2011: When I last had to do this in VS2010, things were a bit different. The class library contained a ProjectGuid element but no ProjectTypeGuids. According to the post I link to above, a ProjectTypeGuids element can be added below the ProjectGuid, so I pasted in the above. Although VS "helped out" by rejigging the XML for me and modifying the GUID when I subsequently inspected it, the project now allows me to add web items.

Each class library seems to get a different (unique) ProjectGuid value so I'm not sure how adding the child ProjectTypeGuids element (which was subsequently removed by VS) actually worked… but it did ;-)

Wednesday, 9 June 2010

TFSDeleteProject /Collection Parameter in TFS 2010

As far as I can tell, deleting a team project in TFS 2010 is still a task that must be completed at the VS2010 command line; while the operation is relatively straightforward, the command changes slightly with the introduction of project collections in TFS 2010, and the value expected by the /collection:<url> parameter may not be glaringly obvious.

Firstly, see the documentation.

Next, determine the URL for the team project you want to delete. I wasn't sure how to discover this but after some trial and error I found this format seems to work:
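Something like the following, where server, port, collection name, and project name are all placeholders for your own environment (DefaultCollection is simply the name given to the first collection on a default TFS 2010 install):

```shell
TFSDeleteProject /collection:http://server:port/tfs/DefaultCollection "My Team Project"
```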


I've since realised the URL can be determined from the TFS Administration Console (in the General tab under the Team Project Collections node); other URL derivatives I tried returned a 404 error.

Note http://server:port/tfs is the Server URL configured for my TFS instance in the Administration Console so your mileage may vary.

Monday, 7 June 2010

Australian Weather Data (RSS…)

Despite my best efforts, I've been disappointed in my (repeated) attempts over the last twelve months to locate a free Australian weather feed I can consume via RSS and present as I wish. The Bureau of Meteorology is operating in the dark ages when it comes to the distribution of its authoritative weather data, Yahoo! weather—although available in a nicely structured XML document—is not available for commercial use, and all other feeds available at the time of writing send down HTML-formatted data with very limited rights for modifying the HTML itself.

Naturally, this is all very frustrating, especially when your client's intranet site needs weather for three specific towns in the middle of the outback or you're trying to build a redistributable weather widget (SharePoint web part anyone?).

Having given up on a free RSS feed, I returned to the BoM site as they do provide weather data via FTP (they also provide registered users access to additional services for a subscription fee). The terms are a bit unclear about distribution and attribution so I'll leave them up to you to interpret. This post presents my approach to accessing and parsing the basic (free) weather data file. A bit more work is involved than simply pointing a nice XDocument at an RSS feed URL and LINQ-ing away but hey, lower-level stuff is always fun and educational ;-)

By the way, this code will eventually make its way into a SharePoint web part; if you're interested in licensing this web part for use on one of your own sites, please drop me a line –


The BoM provides weather data in the form of .dat, .txt, .html, and other file types via anonymous FTP. The .dat file is what we're after and for this example, I'll be using the IDA00006.dat file (Western Australia).

.dat files contain forecast information by town with details separated by a hash (#) character. The first line in the file describes the file format, so if you get lost in a sea of hashes, start there. The Precis Forecast Products page provides formal documentation, of a sort.

Since the .dat file is just a text file we'll be using the WebClient class to manage access to the FTP location and open a readable stream over the .dat file URL:

WebClient client = new WebClient();

// datFileUrl should point at the FTP location of the .dat file you're after
// (the actual URL was omitted here in the original post)
using (Stream data = client.OpenRead(datFileUrl))

With access to the file contents, we can now parse it for easier access. To do so, we'll use a StreamReader to enumerate the file, tokenize each line, and store the results in a generic list:

using (StreamReader reader = new StreamReader(data))
{
    List<List<string>> weatherData = new List<List<string>>();
    string currentLine;

    while ((currentLine = reader.ReadLine()) != null)
    {
        // Each line holds one town's data; split it into its hash-delimited elements
        string[] tokens = currentLine.Split('#');
        weatherData.Add(new List<string>(tokens));
    }
}
It's worth pointing out the nested list structure I'm using here in place of a dedicated collection type. Each line in the file represents data for a single town; individual data elements within each line are separated by a hash character. To accommodate this, I tokenise each data element using the String.Split() method after reading the next line and store those elements in a nested list; each outer list item therefore contains data for a single town.

If that sounds at all complicated, here's a picture of the end result:

NestedLists

The parsed data is finally cached (not shown) for subsequent access. Notably, the BoM doesn't specify any limits on access attempts per day or an effective TTL value as do many RSS feeds. There's nothing stopping you from retrieving this data with every page load but your site will likely slow down and you'll also be guilty of gumming up the intertubes. You may also run the risk of being blocked from accessing the site.

Data Extraction

The final step in this process is to locate and extract the weather data you're after (most likely for a specific town). I use the List<T>.Find() method, supplying a predicate that matches on town name, to accomplish this:

List<string> town = weatherData.Find(currentTown => currentTown[WeatherIndex.Location].ToLowerInvariant() == "perth");

Final Touches

The free data doesn't provide any form of condition code so associating a forecast (e.g. "Mostly sunny") with an icon is going to be tricky. I'm hopeful the BoM uses standard phrases that can be matched (or at least partially matched as a fallback, e.g. contains "sun") against known terms for hookup to generic icons, but this could become a maintenance problem.
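As a sketch of that partial-matching fallback, here's one way it might look; the phrase fragments and icon names below are my own assumptions, not official BoM terminology:

```csharp
static class WeatherIcons
{
    // Hypothetical phrase fragments mapped to generic icon names.
    // Ordering matters: more specific fragments are checked first.
    private static readonly string[,] IconMap =
    {
        { "storm",  "storm.png"  },
        { "shower", "rain.png"   },
        { "rain",   "rain.png"   },
        { "cloud",  "cloudy.png" },
        { "sun",    "sunny.png"  },
    };

    public static string GetIcon(string forecast)
    {
        string lowered = forecast.ToLowerInvariant();

        for (int i = 0; i < IconMap.GetLength(0); i++)
        {
            if (lowered.Contains(IconMap[i, 0]))
                return IconMap[i, 1];
        }

        return "unknown.png"; // no match: fall back to a generic icon
    }
}
```

With this in place, "Mostly sunny" falls through to the "sun" fragment and picks up the generic sunny icon, while anything unrecognised degrades gracefully to a neutral one.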

But that's it, weather!

Wednesday, 2 June 2010

Upgrade SQL Server Edition or Change License Key

Apparently upgrading from a developer version of SQL Server is possible. These screenshots are from an MSDN installation of SQL Server 2008 and entering a non-MSDN key will presumably legitimise the bits for production use (I don’t have a retail or volume key available so this is a guess at this stage).

To get here, re-run the installation wizard and select Edition Upgrade from the Maintenance screen:

SQL Server License Key Change Edition Upgrade

Then enter your new product key or use a free edition instead:

SQL Server License Key Change Edition Upgrade 2

You can also do this at the command line:

Setup.exe /q /ACTION=editionupgrade /INSTANCENAME=<MSSQLSERVER or instancename> /PID=<PID key for new edition> /IACCEPTSQLSERVERLICENSETERMS