Category Archives: SharePoint

FAST Search for a specific External Content Type

I hadn’t really had a chance to do much with FAST Search, beyond your standard demo-ware installs and use cases, but I’ve been anxious to have an excuse to dig into it more.

The combination of beefed up BCS/BDC support in SharePoint 2010 plus FAST search seems like it would be a pretty powerful combination, especially if you have fairly large sets of data in your LOB applications and you don’t want to hammer them with constant, open-ended queries.  In this case I was looking to be able to limit a search to return results of only a specific External Content Type (ECT).  In the past, with SharePoint Search I’ve been able to just use the ContentType managed property for this (there’s a default mapping that brings in the ows_ContentType crawled property and maps it to this).  However, it seems that for external content types, this property does not get set.

One option here would be to just set up a unique content source for each type of BCS entity you want to bring in, and use the content source as part of the query.  That gets pretty ridiculous for scenarios where you have a lot of entities/ECTs, or entities that are related to one another.  Fortunately, poking through the available crawled properties in my FAST Query SSA, I found one called “entityname” under the “Business Data” category.  Mapping it to the ContentType managed property worked like a charm.  I can now send a query like the one included below to the SharePoint Search web service and get back only results of the type I care about. BTW, many thanks to the author of the FAST for SharePoint Query Tool. It’s the FAST-specific version of the SharePoint Search Tool I mentioned really liking in a previous post.

<QueryPacket xmlns="urn:Microsoft.Search.Query" Revision="1000">
  <Query>
    <Context>
      <QueryText language="en-US" type="FQL">filter(contenttype:equals("MyBCSEntity"))</QueryText>
    </Context>
    <SupportedFormats Format="urn:Microsoft.Search.Response.Document.Document" />
    <Properties>
      <Property name="Rank" />
      <Property name="Title" />
      <Property name="Author" />
      <Property name="Size" />
      <Property name="Path" />
    </Properties>
  </Query>
</QueryPacket>

Download SharePoint User Profile Photos Using PowerShell and Search Web Services

Have you ever wanted to grab all of the user profile photos in your SharePoint environment, whether it was to make sure none of them were inappropriate or just to have a laugh at some of the creativity?  It can be pretty entertaining, even if you don’t have an HR-ish reason for wanting to check on them.

I was recently asked to put together some way for a small group of users to easily look through all the pictures that had been uploaded.  The idea was for this to work from a client machine, with no server-side installs needed.  That meant using the object model was out (no Client OM in MOSS, remember).  Fortunately, in this environment all the user profiles are indexed to enable the “People Search” capability, so the search web service is a handy and performant way to grab a few profile attributes for all the users who have submitted a profile.

I started by figuring out the syntax of the query I wanted to use to get the results back.  If you don’t already use it, I can’t recommend the SharePoint Search Service Tool enough.  The standalone executable is a free download from Codeplex that some really sharp people from Microsoft were kind enough to share, and it’s a very educational and useful tool if you want to dig into the SharePoint search engine’s native language a bit.  Generally I find I can just pick the scope and the first few attributes I want in the results, then tweak the query syntax by hand.  One of its most powerful aspects is that it lets you use SQL syntax for queries, which opens up a whole bunch of possibilities the OOTB search web parts can’t touch.  In this case, the crux of the query was to use the “People” scope, pick the attributes I wanted, and make sure I only grabbed profiles that actually had a PictureURL populated.  You might think you could do something like PictureURL IS NOT NULL, but despite being legitimate syntax it doesn’t appear to actually work.  For my purposes, I just ended up with a LIKE 'h%' predicate, which got the job done even if it’s not the most efficient predicate ever.
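For reference, the final query ended up looking roughly like this. This is a sketch: the property list is just what I happened to want from my environment, and your managed property names may differ.

```sql
SELECT AccountName, PreferredName, PictureURL, Path
FROM SCOPE()
WHERE "scope" = 'People'
  AND PictureURL LIKE 'h%'
```

The LIKE 'h%' predicate is the workaround described above: every populated PictureURL starts with "http", so matching on a leading "h" effectively filters out profiles with no photo.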

At this point, I had a nice chunk of XML for the QueryPacket to send to the search web service, as well as a pretty good idea of what the XML in the response was going to look like.  I don’t claim to be an XSLT master, but it’s not rocket science to turn the XML response into a really simple HTML page that shows the thumbnail versions of all the images, especially if you have a nice XML/XSL editor at your disposal to help with debugging.  Using the auto-generated thumbnails is highly recommended, by the way, unless you like seeing your browser sweat bullets trying to lay out thousands of 100K-5MB images on a single page.  Like I mentioned before, it’s pretty funny what people will upload even though it all ends up in a 150×150 square.

I now had a one-off version of the desired result, but I didn’t want to burden the actual customers with needing to know how to use things like the Search Service Tool, as much as I like it.  I needed something quick and easy to automate this process, and for most SharePoint folks these days “quick and easy automation = PowerShell”.  After saving the QueryPacket XML and the .xsl file for transforming the response, I just needed a handy way to call the search web service from PowerShell.  At first I used the “old school” method of generating and compiling a proxy stub with wsdl.exe and calling that from PowerShell.  After a bit more search engine fun, I found that there is a New-WebServiceProxy cmdlet built into PowerShell to handle this for you.

Now I have 3 easy-to-distribute text files, and as long as the user can run the PowerShell installer, which is pretty straightforward, they can just unzip the files, right-click the .ps1 to choose “Run with PowerShell”, and enjoy perusing a large HTML file with a bunch of images that link back to the user profile page.  The actual HTML could be made much fancier, and obviously you may care about a different set of profile properties than I’m pulling.

This was a fun little exercise that actually served a practical purpose in this case, but might otherwise be a nice, fun page to show a client for whom you’ve deployed SharePoint My Sites.  You can download and play with the source files however you wish.  I’d love to hear feedback, or other ways to extend this idea in the comments.

SharePoint 2010 Class Notes – Day 2

We’re talking about sandbox solutions in SharePoint 2010 today.  This seems like it will be a very popular capability amongst the power user audience, but it could be tedious to manage a bunch of site collections that all use the same set of sandbox solutions.

An interesting solution, um, solution might be to establish a central repository of sandbox solution WSP files, then have some kind of Silverlight app that lets you register a series of site collections you manage and deploy a set of selected solutions to all of them.   Think of it as a way to manage “my” slice of the farm, without using farm-wide solutions.

Solution validators are an interesting concept.  It seems like two very useful validators would be a simple whitelist or a simple blacklist, perhaps that just read a list of solution identifiers from a SharePoint list or simple database table (if you want to avoid a circular dependency) somewhere.

Looking at visual upgrade, there are some weird issues if you want to keep running with the old UI/master pages.  They’ve stapled a new feature called Wiki Page Home Page to the team site definition to use the new Wiki Page library, which sounds like a cool idea on the surface.  However, if you have a site that uses the old UI, the new wiki pages are pretty much totally unusable, because the old editor doesn’t know what to do with all the new rich editor content.  It appears, though, that if you just delete the “Site Pages” library on a team site, it will revert to the old behavior and use the default.aspx page that has plain old web part zones on it.   It would probably be cleaner to just deactivate the feature, though.
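If you go the deactivation route, something like this from the SharePoint 2010 Management Shell should do it. The feature folder name WikiPageHomePage and the site URL here are my assumptions; verify the actual feature name in your farm before running it.

```
# Hypothetical site URL - substitute your own
Disable-SPFeature -Identity WikiPageHomePage -Url http://yourserver/sites/team
```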

Some VS2010 “magics”:

  • prop[tab][tab] – Create a new property
  • object.Event+=[tab][tab] – Stub out an event handler
  • override [methodname]

RPC – Form Submission Cannot Be Processed Error

We recently had a fun little issue where a seemingly random set of users with varying combinations of Windows 2000, XP, and Vista, as well as Office 2003 and Office 2007, started having issues editing documents from SharePoint document libraries.

This didn’t affect all our users, which made it trickier to debug, but when we were finally able to do a Fiddler trace, it appeared that the failing calls were all going to the /_vti_bin/author.dll library.  This is a long-standing holdover from the bad old days of FrontPage and FrontPage extensions that still gets used in several scenarios by Office clients (including SharePoint Designer).

The RPC error we kept getting was “Form submission could not be processed because it exceeded the maximum length”.  This seemed really odd since the actual request size in these traces was only a few bytes.  We knew our Maximum Upload Size value was bigger than a few bytes, having just gone through an exercise to increase it to 200MB.

I’ll just cut to the chase rather than bore you with all the dead ends we went down (Google was unfortunately not a lot of help on this one).  It turns out someone had changed our Maximum Upload Size to 2048MB.  I don’t know why this happened, but it’s slightly bigger than the maximum noted in the stsadm docs for max-file-post-size (2047MB max).  If you spend a lot of time with powers of 2, you’ll probably notice 2048MB is suspiciously close to one of the magic numbers that tends to result in things like overflow errors (it’s exactly one past the maximum value of a signed 32-bit integer).  My guess is that the old FrontPage RPC code still uses a 32-bit integer to store this kind of thing, so the value “rolled over” to look like a negative number, and any call to author.dll was “too large” no matter how small it actually was.  I don’t know for sure, but it seems like a high probability.
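The overflow theory is easy to illustrate. This little Python snippet is purely illustrative (I have no idea how author.dll actually stores the value), but it shows what 2048MB looks like when it lands in a signed 32-bit integer:

```python
import ctypes

MB = 1024 * 1024

# 2047MB still fits in a signed 32-bit integer (max is 2147483647).
print(2047 * MB)                        # 2146435072
print(ctypes.c_int32(2047 * MB).value)  # 2146435072 - unchanged

# 2048MB is exactly 2**31, one past the largest signed 32-bit value...
print(2048 * MB)                        # 2147483648
# ...so it wraps around to a negative number.
print(ctypes.c_int32(2048 * MB).value)  # -2147483648
```

A negative maximum upload size would explain nicely why every request, even a few-byte one, "exceeded the maximum length".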

Anyhow, changing the Max Upload Size back to a more reasonable value resolved the issue, and our Office clients were happily RPC-ing away again.

The “Oh shoot” web part editor page

We were doing a WSS hotfix install tonight and got stuck in a situation where the psconfig upgrade after the hotfix install was failing on a custom My Site web part we had deployed (only for 3 users, for some reason).
I figured we should just be able to remove the web part from those My Site pages, since it’s really just a fairly generic set of instructional links, but I was stymied by the redirect to the user’s public profile page instead of actually getting to the default.aspx for the user’s My Site.

I remembered seeing a page that comes up when a web part is erroring out in a way that makes the page unrenderable, and that page lets you remove the offending part. My trusty search engine led me to Raghu Iyer’s aptly titled post on the “Savior Web Part Maintenance Page”.

So for future reference (my own, if no one else’s): if you cannot access a page and want to remove a web part, you can bring up the web part maintenance page by appending the query string “?contents=1” to the end of the URL. This actually sends you to an “spcontnt.aspx” page in the _layouts directory that takes the host-relative URL of the page in question as a query string argument.

I find the “contents=1” approach way more convenient though, and hopefully, going forward, way easier to remember.

SharePoint and Kerberos = Lots of 401s

Update: We recently moved our farm to Windows 2008 (64 bit) and this issue came up again. Only now there is a different mechanism for addressing it in IIS 7. Microsoft has a new KB article on Kerberos performance with IIS 7.

At work we’re putting into place a fairly decent-sized SharePoint farm and trying to follow most of the best practices for enterprise deployments. This includes configuring Kerberos to authenticate our Active Directory users. Our servers are all running Windows 2003 SP1, which is probably not terribly unusual. There are lots of walkthroughs on how to do this, including an excellent one by Martin Kearn. Don’t worry, I’m not about to produce yet another one.

In fact, it’s really not all that difficult to get the basics working. Just make sure the service account you use to run your app pool has a service principal name (SPN) that matches the DNS name you’re planning to have people access the app at (i.e. the account should have an SPN like HTTP/<your-app-hostname> associated with it).
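For example, registering the SPN with the setspn utility on Windows 2003 looks like this (the host and account names are hypothetical; substitute your own):

```
REM Hypothetical host and service account names - substitute your own
setspn -A HTTP/portal.example.com MYDOMAIN\svc-sharepoint
```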

I’m guessing a lot of folks would get this all set up, hit the site, make sure it authenticates correctly, and be done with it. Since we are attempting a centralized global deployment, though, I’m extremely wary of anything that may degrade severely in a high-latency environment. That means pretty much everything gets run through Fiddler, which (not to overstate the case) is pretty much the best single web application debugging/analysis tool ever. A quick run through Fiddler against our spiffy new environment showed some worrisome behavior: pretty much every HTTP request was actually two requests, one that received a 401 Unauthorized response asking to negotiate authentication, and one that sent a Negotiate header with the request and actually got a response.

This kind of thing adds up to poor performance pretty quickly in a WAN environment, and it’s not too hard to see why. Let’s say a “typical” SharePoint page contains 19 embedded elements (images, style sheets, JavaScript files, etc.) plus the page itself. Also, let’s assume you have users in Europe accessing a farm somewhere in the eastern US, which will probably leave you with a round-trip time of something like 100ms. Since most web browsers will open at most two simultaneous connections to a server, you’re looking at a minimum of 20/2 * 100ms, or about 1 second of elapsed time, just for the packets to zing back and forth on your WAN. This is basically pure overhead, since it doesn’t include any processing time in the data center or transferring your actual content. With Kerberos turned on, you’re getting 20 extra requests that each draw a 401 response, adding an additional delay of (at least) 1 second to all of your pages.
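The back-of-the-envelope math above can be sketched in a few lines of Python. All three inputs are the assumptions from the scenario, not measurements:

```python
# Assumptions from the scenario above (not measured values).
requests_per_page = 20    # the page itself plus 19 embedded elements
parallel_connections = 2  # typical browser limit per server at the time
round_trip_ms = 100       # Europe <-> eastern US WAN round-trip time

# Minimum WAN time just for the request/response round trips.
base_ms = requests_per_page / parallel_connections * round_trip_ms

# Each request first bouncing off a 401 doubles the round trips.
with_401s_ms = base_ms * 2

print(base_ms)       # ~1 second of pure WAN overhead
print(with_401s_ms)  # ~2 seconds once every request draws a 401 first
```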

Since Microsoft’s recommended target time for a page response is 1-2 seconds, it doesn’t seem worth wasting an extra second on a bunch of redundant authentication requests. Fortunately there is a solution. It turns out IIS changed behavior between v5 (Windows 2000) and v6 (Windows 2003) in how it handles Kerberos-authenticated requests. By default it requires each and every request to be authenticated, which is exactly the behavior we’ve been talking about. After a bunch of hours bumping my head against this problem, I finally stumbled upon KB Article 917557 about slow performance with Kerberos authentication. There is a hotfix available that enables adding a registry value called “EnableKerbAuthPersist”, which changes the behavior of IIS so that it only requires one authentication per connection to the web server. That means as long as you are using HTTP 1.1 keep-alives (and pretty much everyone should be), you’ll only see 1 or 2 requests that get a 401 response instead of one for every single element of the page.
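For reference, the registry value the hotfix enables looks something like the fragment below. This is my reading of the KB article; double-check KB 917557 yourself (and install the hotfix first) before touching the registry.

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\W3SVC\Parameters]
"EnableKerbAuthPersist"=dword:00000001
```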

Once we added this hotfix and registry key to all our servers, performance was substantially snappier. It’s definitely satisfying to load a page or two through Fiddler and see only 50 requests where there used to be 100. Since the KB article that helped with this problem was not especially Google-noticeable and not at all tied to SharePoint, I figured I would share this in the hope it may help someone else dealing with the same issue. Please share your experience in the comments if you’ve dealt with anything similar.