Track SharePoint Content Database Growth via Central Admin

Following on from a recent post I made about a SharePoint health analyzer rule that can be used to automatically expand a SharePoint content database outside of normal working hours, I wanted to create a solution for monitoring content database growth over time via central admin. Here's what I came up with:

image

The solution consists of four parts. The first part is the Review Databases Sizes page shown above. The page is accessed from a custom action under Application Management > Databases:

image

The Review Databases Sizes page lists each content database present in the farm, plus spark lines that show the database data file size and log file size over time. Clicking on the database name or either of the spark lines opens the second part of the solution, the Database Size Details application page. This page is displayed inside an SP.UI.ModalDialog:

image

The chart shown in the modal dialog and the spark lines are created via the jqPlot jQuery extension, which allows for some nifty features such as data point highlighting, animated rendering and zooming. Note: you may need to check the jqPlot browser requirements to ensure this will work in your environment.

To zoom into an area on the chart simply click and then drag a rectangle that contains the data to be explored:

The chart will be re-rendered to display just the data points contained in the area you selected.

After you’ve zoomed in, you can examine individual values by hovering your mouse over a data point or you can zoom back out to the full chart by double clicking anywhere on the chart.

image

The third part of the solution is the deployment of the jqPlot JavaScript libraries themselves. The required libraries are deployed by a SharePoint feature and use ScriptLinks to add themselves to the master page of central admin without updating the master page itself. I've used this simple and powerful method to deploy jQuery libraries before, and more details about it can be found here: Use jQuery in SharePoint without updating your MasterPage

The fourth and final part of the solution is a custom timer job that is set to run once a day sometime between midnight and 1am. It's called 'SPHealth Database Size Collection':

image

The timer job finds each content database in the farm and determines the size of the data file and log files for each. The sizes are then stored in the property bag of each content database.
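The collection code isn't shown in the post, but a minimal sketch of such a timer job's Execute method might look like the following. The class name, property bag key and use of DiskSizeRequired are my own assumptions – the published solution queries the actual data and log file sizes, which would need a SQL query (e.g. against sys.database_files) rather than the single approximate figure used here:

```csharp
using System;
using Microsoft.SharePoint.Administration;

// Hypothetical sketch: a daily timer job that records each content
// database's current size in that database's property bag.
public class SPHealthDbSizeJob : SPJobDefinition
{
    public SPHealthDbSizeJob() : base() { }

    public SPHealthDbSizeJob(string name, SPWebApplication webApp)
        : base(name, webApp, null, SPJobLockType.Job) { }

    public override void Execute(Guid targetInstanceId)
    {
        foreach (SPWebApplication webApp in SPWebService.ContentService.WebApplications)
        {
            foreach (SPContentDatabase db in webApp.ContentDatabases)
            {
                // DiskSizeRequired reports the approximate size of the database.
                // Separate data/log file sizes would require a SQL query - omitted here.
                string key = "SPHealth_" + DateTime.Today.ToString("yyyyMMdd");
                db.Properties[key] = db.DiskSizeRequired;
                db.Update();
            }
        }
    }
}
```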

That’s it – two application pages, a timer job, and a couple of module files.

I’ve published the source code for the solution at http://sphealthdbsize.codeplex.com if you want to have a poke around and try it out for yourself. Caveat: as with any other third-party solution, I recommend you review and understand what the code is doing before you deploy it in a production farm. Also note that it may take a couple of days before you see any charts, as the timer job needs to have run twice to collect enough data points to plot!

Enjoy…

Autogrowth of SharePoint Content Databases – an Automated Solution

During a recent presentation at the London SharePoint User Group by Steve Smith of Combined Knowledge about SharePoint administration (and many other things), he discussed the issues surrounding the auto-growth of SharePoint content databases and the possible performance ramifications these can have when they are triggered during business hours.

As Steve pointed out, the default auto-growth settings for a newly created content database are to grow in 1MB increments:

Clearly, for a content database that is used off the bat with this configuration, a lot (and I mean a lot) of auto-growths will be performed on the database as users load content and even access the site collections that the content database contains. The recommendations from Microsoft are to pre-grow data and log files and to set the auto-growth to 10% – see Storage and SQL Server capacity planning and configuration (SharePoint Server 2010) for further details: http://technet.microsoft.com/en-us/library/cc298801.aspx.


These recommendations rightly point out that database growth should be proactively managed. So Steve's presentation and this article got me thinking about a repairable SharePoint health analyzer rule that could warn when content databases are filling up and, if required, grow them automatically. What makes this a practical solution, I believe, is the ability to configure the rule so that database growths performed by the repair action of the health rule are only executed within a specified time window.

The health rule derives from SPRepairableHealthAnalysisRule so it can be configured to automatically repair (for repair read grow) a database once it has exceeded a configurable capacity threshold. The rule supports four custom configurable parameters:

<Properties>
  <!-- Enter the database capacity percentage that is used to trigger -->
  <!-- a warning and potentially a scheduled database expansion. Values -->
  <!-- should be between 0.0 and 1.0. -->
  <Property Key="CapacityThreshold" Value="0.8" />
  <!-- Enter the BeginHour for the time period in which database -->
  <!-- expansions should occur. -->
  <Property Key="BeginHour" Value="1" />
  <!-- Enter the EndHour for the time period in which database -->
  <!-- expansions should occur. -->
  <Property Key="EndHour" Value="3" />
  <!-- Enter the percentage of growth the database should undertake -->
  <!-- during an expansion. Values should be between 0.0 and 1.0. -->
  <Property Key="GrowBy" Value="0.3" />
</Properties>

The CapacityThreshold property is used to set the level at which warnings about database capacity are raised. Once a database exceeds 80% (the default threshold for the rule), a health analyzer warning is raised and is visible in central admin.

The BeginHour and EndHour properties are used to define a time window in which growths should be executed by the rule for databases that have exceeded their capacity threshold. These growths will not occur if the 'Repair Automatically' button is pressed outside of this window. Ideally you should review the properties and behaviour of this rule and, if appropriate, set the rule to repair automatically. Please note that in order for the rule to repair automatically during the specified time window, the rule schedule should remain hourly:

image

Lastly, the GrowBy property is used by the repair method to determine the amount of expansion a database should undertake. The default option is 30% – this means that if a database is 100MB in size and 90% full, the database will be grown to 130MB. The total database size is used to calculate the new database size and not the amount of space currently used.

The rule is packaged as part of the SharePoint Health Analyzer Rules project on http://sphealth.codeplex.com/

The source code for the rule can be reviewed here: http://sphealth.codeplex.com/SourceControl/changeset/view/412d4aba56ba#SPHealth.SharePoint.HealthRules%2fSP%2fRules%2fPerformance%2fWarnDatabaseCapacity.cs

BTW: There is a quicker way to solve this entire auto-growth problem – make the content database read-only!

Create custom SharePoint Health Analyzer rules

Have you got a SharePoint farm that has a unique set-up, special monitoring requirements or particular SLAs that it must meet, or a farm that needs to provide your operations team with pro-active monitoring? If so, create your own SharePoint Health Analyzer rules – it's super easy!

I'm sure we've all worked on deployments that fall into one or more of the categories above, or that have tons of other requirements that would benefit from monitoring. Perhaps the monitoring you need has nothing to do with the farm deployment and its operating environment, but is instead monitoring of a custom application you've built. Either way, creating your own SharePoint Health Analyzer rules could be a good idea.

Here’s how you create them…

Start a new Visual Studio 2010 Empty SharePoint Project and add to it a new class. The class must inherit from SPHealthAnalysisRule:

image

Next, you need to override the Category and ErrorLevel your rule will be reported under:

image

Next, override the Explanation, Remedy and Summary strings the rule returns. These are what the user sees when the rule is displayed in the Review problems and solutions list within Central Administration.

image

Next, override the AutomaticExecutionParameters property to return an SPHealthAnalysisRuleAutomaticExecutionParameters object; this controls how, where and when the rule is checked.

image

The interesting option here is the Scope. The Scope allows the rule to be executed on 'Any' or 'All' servers in the farm. Depending on what your rule is designed to do, running it on one server might be enough, or you may need to run it on every server. For example, a health rule that checks the size of a content database could be run on any server (SPHealthCheckScope.Any), as it doesn't matter from which server you interrogate your database for its size. However, a rule that checks for available disk space will need to be executed on every server (SPHealthCheckScope.All).

Now the last part, the rule logic itself. To implement this, simply override the Check() method:

image

The Check method must return an SPHealthCheckStatus:

image
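The code in the screenshots isn't reproduced here, but pulling all of the overrides above together, a minimal rule might be sketched like this. The category, error level, strings and the capacity test are illustrative assumptions of mine, not the code from the original images:

```csharp
using System;
using Microsoft.SharePoint.Administration;
using Microsoft.SharePoint.Administration.Health;

// Illustrative sketch of a custom health analyzer rule.
public class ContentDatabaseSizeRule : SPHealthAnalysisRule
{
    public override SPHealthCategory Category
    {
        get { return SPHealthCategory.Performance; }
    }

    public override SPHealthCheckErrorLevel ErrorLevel
    {
        get { return SPHealthCheckErrorLevel.Warning; }
    }

    public override string Summary
    {
        get { return "Content databases are nearing capacity."; }
    }

    public override string Explanation
    {
        get { return "One or more content databases have exceeded the capacity threshold."; }
    }

    public override string Remedy
    {
        get { return "Pre-grow the affected data and log files outside business hours."; }
    }

    // Controls how, where and when the rule is checked.
    public override SPHealthAnalysisRuleAutomaticExecutionParameters AutomaticExecutionParameters
    {
        get
        {
            SPHealthAnalysisRuleAutomaticExecutionParameters p =
                new SPHealthAnalysisRuleAutomaticExecutionParameters();
            p.Schedule = SPHealthCheckSchedule.Hourly;
            p.Scope = SPHealthCheckScope.Any; // any one server is enough for a database check
            p.ServiceType = typeof(SPTimerService);
            return p;
        }
    }

    // The rule logic: return Failed to raise the problem in central admin.
    public override SPHealthCheckStatus Check()
    {
        foreach (SPWebApplication webApp in SPWebService.ContentService.WebApplications)
        {
            foreach (SPContentDatabase db in webApp.ContentDatabases)
            {
                // Hypothetical helper - the real capacity test is up to you.
                if (IsNearCapacity(db)) return SPHealthCheckStatus.Failed;
            }
        }
        return SPHealthCheckStatus.Passed;
    }

    private bool IsNearCapacity(SPContentDatabase db)
    {
        return false; // placeholder for the actual size check
    }
}
```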

As you can see, creating rules is simple. Installing the rules is a little more involved, but still only takes a few lines of code. To deploy the rules, add a farm-scoped feature to your project, add an event receiver to that feature, and override the FeatureActivated and FeatureDeactivating events. The FeatureActivated event installs the rules contained in the assembly produced by the project by calling the RegisterRules method of the SPHealthAnalyzer class:

image

Lastly, the code to remove the rules on feature deactivation is just as simple:

image
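A sketch of such a feature receiver, assuming the register/unregister pattern described above (the class name is mine; check the SPHealthAnalyzer member names against the SDK for your SharePoint version):

```csharp
using System.Reflection;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration.Health;

// Farm-scoped feature receiver that registers and unregisters the
// health rules found in this assembly.
public class HealthRulesFeatureReceiver : SPFeatureReceiver
{
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        // Registers every SPHealthAnalysisRule-derived type in the assembly.
        SPHealthAnalyzer.RegisterRules(Assembly.GetExecutingAssembly());
    }

    public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
    {
        SPHealthAnalyzer.UnregisterRules(Assembly.GetExecutingAssembly());
    }
}
```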

Now deploy your feature and watch it fail…

There's one last trick to getting this working. It appears there is an issue with deploying the solution and activating the feature all in one step (just like Visual Studio tries to do). The RegisterRules method call fails if you attempt this; I suspect this is due to the timing of the DLL becoming available in the GAC, but I haven't got to the bottom of this one yet. To work around the issue, update the farm feature manifest.xml to include the ActivateOnDefault="False" attribute:

image
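The attribute goes on the Feature element of the feature manifest; something along these lines (the title and other attributes here are illustrative):

```xml
<Feature xmlns="http://schemas.microsoft.com/sharepoint/"
         Title="SPHealth Rules"
         Scope="Farm"
         ActivateOnDefault="FALSE">
  <!-- element manifests etc. -->
</Feature>
```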

Now you can deploy your solution, manually activate your farm feature and begin testing your custom rules.

If you want a complete sample solution that includes the rule I’ve used as an example in this post and many more you can download the source code and WSP at http://sphealth.codeplex.com

Enjoy.

Update IIS bindings programmatically via SharePoint timer job

In some rare edge-cases, it may be necessary to programmatically update IIS settings from SharePoint code. In this example I’m updating the host header bindings in IIS as I’m using (and creating) host header site collections programmatically. I could use a wildcard DNS and the default port 80 but in my scenario we need explicit host header bindings.

To accomplish the update to IIS we need to use a timer job, for two reasons. The first is to ensure our updates run with sufficient privileges to perform the required changes – as timer jobs run under the farm account, this is typically elevated enough already. Secondly, using the timer job framework allows us to target which servers (one, some or all) the modifications are run on. This is important because, if we are using a farm with multiple servers, keeping changes to IIS bindings synchronised across the farm is clearly a good place to aim for – in every other direction lies madness.

How to create a timer job is not in the scope of this post – here’s some great reference material to get you started: http://msdn.microsoft.com/en-us/library/cc427068(v=office.12).aspx – this example is for WSS 3 but the process is just the same in 2010.

The important thing to note is how you install your timer job. One of the parameters of the constructor method for the timer job can be an SPJobLockType:

image

The SPJobLockType enumeration has three members that control where the timer job is executed:

  • None – Provides no locks. The timer job runs on every machine in the farm on which the parent service is provisioned, unless the job is associated with a specified server, in which case it runs on only that server (and only if the parent service is provisioned on the server).
  • ContentDatabase – Locks the content database. A timer job runs one time for each content database associated with the Web application.
  • Job – Locks the timer job so that it runs only on one machine in the farm.

The None option is potentially the most useful to us for updating IIS settings, as it will cause the timer job to run on every server in the farm that is running the 'Microsoft SharePoint Foundation Web Application' service – these are effectively our web front end servers. This approach also relies on the third parameter passed to the constructor being null; otherwise the SPServer object passed via this parameter is used as the server to host the timer job.
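A sketch of a timer job installed this way – SPJobLockType.None with a null server – might look like the following (the class and job names are my own):

```csharp
using Microsoft.SharePoint.Administration;

// Timer job intended to run on every WFE: SPJobLockType.None, null server.
public partial class UpdateIisBindingsJob : SPJobDefinition
{
    public UpdateIisBindingsJob() : base() { }

    public UpdateIisBindingsJob(SPWebApplication webApp)
        : base("Update IIS Bindings", webApp, null, SPJobLockType.None)
    {
        this.Title = "Update IIS Bindings";
    }
}
```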

Once we've installed our timer job in the appropriate way, the payload is simple. We need to override the Execute method of our timer job with whatever logic we need. Here's an extract of my code for updating IIS bindings, which should now be executed on each SharePoint WFE:

image
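The original extract is an image; here is a hedged reconstruction of the idea using the Microsoft.Web.Administration ServerManager API. The site name and host header binding are placeholders, not values from the original code:

```csharp
using System;
using Microsoft.SharePoint.Administration;
using Microsoft.Web.Administration;

public partial class UpdateIisBindingsJob : SPJobDefinition
{
    public override void Execute(Guid targetInstanceId)
    {
        // ServerManager edits the local server's IIS configuration, so this
        // logic runs independently on each server the job is targeted at.
        using (ServerManager iis = new ServerManager())
        {
            Site site = iis.Sites["SharePoint - 80"];              // placeholder site name
            string bindingInformation = "*:80:intranet.contoso.com"; // ip:port:hostheader

            // Add the host header binding only if it isn't already present.
            bool exists = false;
            foreach (Binding b in site.Bindings)
            {
                if (b.BindingInformation == bindingInformation) { exists = true; break; }
            }
            if (!exists)
            {
                site.Bindings.Add(bindingInformation, "http");
                iis.CommitChanges();
            }
        }
    }
}
```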

I hope this helps….

Update: 23 December 2012 – A good friend of mine, Gael Fabry, pointed me in the direction of this article that describes the schema of the IIS 7 applicationHost.config file: http://msdn.microsoft.com/en-us/library/aa347559(v=VS.90).aspx

Publishing and Consuming SharePoint Service Application with PowerShell

Below are my PowerShell scripts for publishing and consuming service applications between SharePoint farms (service application federation). Before I introduce the scripts, let me briefly explain the process involved.

There are several reasons why you might want to share service applications between farms, and I'm not planning on going into those details here, but the overall process of setting up service application federation is illustrated below:

image

The SharePoint farm on the left is the consuming farm. It will use the service applications from the SharePoint farm on the right – the publishing farm. Practically this means that the publishing farm contains your service instances, service applications and service application proxies, and your consuming farm contains just service application proxies that 'point' to the publishing farm. Loads more detail on service application federation is available on TechNet, including which service applications can be federated: http://technet.microsoft.com/en-us/library/ff621100.aspx

The scripts I have follow the pattern shown in the diagram above. First on the consuming farm, I export the farm root certificate, the security token service certificate and I write the farm ID to a text file:

Contents of 1_Consumer_ExportCerts.ps1 – run this on the CONSUMING farm

Add-PSSnapIn "Microsoft.SharePoint.PowerShell" -EA 0

# export consumer root certificate
Write-Host "Exporting Consumer Root Certificate…" -nonewline
$rootCert = (Get-SPCertificateAuthority).RootCertificate
$rootCert.Export("Cert") | Set-Content ConsumingFarmRoot.cer -Encoding byte
Write-Host "Done" -Foreground Green

# export consumer sts certificate
Write-Host "Exporting Consumer STS Certificate…" -nonewline
$stsCert = (Get-SPSecurityTokenServiceConfig).LocalLoginProvider.SigningCertificate
$stsCert.Export("Cert") | Set-Content ConsumingFarmSTS.cer -Encoding byte
Write-Host "Done" -Foreground Green

# export consumer farm id
Write-Host "Exporting Consumer Farm Id…" -nonewline
$farmID = Get-SPFarm | select Id
set-content -path ConsumerFarmID.txt $farmID
Write-Host "Done" -Foreground Green

Write-Host "Now copy ConsumingFarmRoot.cer, ConsumingFarmSTS.cer & ConsumerFarmID.txt to the publishing farm." -Foreground Yellow

You'll notice this script ends with a prompt to guide you to the next step – copy the consuming farm certificate, consuming STS certificate and the consuming farm ID text file we've just created to the publishing farm. Next, switch over to the publishing farm.

Contents of 2_Publisher_ExportCerts.ps1 – run this on the PUBLISHING farm

Add-PSSnapIn "Microsoft.SharePoint.PowerShell" -EA 0

# export publisher root certificate
Write-Host "Exporting Publisher Root Certificate…" -nonewline
$rootCert = (Get-SPCertificateAuthority).RootCertificate
$rootCert.Export("Cert") | Set-Content PublishingFarmRoot.cer -Encoding byte
Write-Host "Done" -Foreground Green

Write-Host "Exporting Publisher Topology Url…" -nonewline
$topologyUrl = Get-SPTopologyServiceApplication | Select LoadBalancerUrl
$url = $topologyUrl.LoadBalancerUrl.OriginalString
Set-Content -path PublishingFarm.Url.txt -Value $url
Write-Host "Done" -Foreground Green

write-host
Write-Host "Now copy PublishingFarmRoot.cer & PublishingFarm.Url.txt to the consuming farm." -Foreground Yellow
write-host

Now copy the publishing farm certificate and publishing farm URL text file that have just been generated to the consuming farm, then switch back to the consuming farm.

Contents of 3_Consumer_ImportCerts.ps1 – run this on the CONSUMING farm

Add-PSSnapIn "Microsoft.SharePoint.PowerShell" -EA 0

# import publisher root certificate
Write-Host "Importing Publisher Root Certificate…" -nonewline
$trustCert = Get-PfxCertificate PublishingFarmRoot.cer
New-SPTrustedRootAuthority PublishingFarm -Certificate $trustCert
Write-Host "Done" -Foreground Green

Write-Host "Now import certificates on the publishing farm." -Foreground Yellow

You've now imported the publishing farm root certificate into the consuming farm. Next, switch back to the publishing farm and import the consuming farm certificate and consuming STS certificate into the publishing farm with the following PowerShell:

Contents of 4_Publisher_ImportCerts.ps1 – run this on the PUBLISHING farm

Add-PSSnapIn "Microsoft.SharePoint.PowerShell" -EA 0

# import consumer root certificate
Write-Host "Importing Consumer Root Certificate…" -nonewline
$trustCert = Get-PfxCertificate ConsumingFarmRoot.cer
New-SPTrustedRootAuthority ConsumingFarm -Certificate $trustCert
Write-Host "Done" -Foreground Green

# import consumer sts certificate
Write-Host "Importing Consumer STS Certificate…" -nonewline
$stsCert = Get-PfxCertificate ConsumingFarmSTS.cer
New-SPTrustedServiceTokenIssuer ConsumingFarm -Certificate $stsCert
Write-Host "Done" -Foreground Green

Write-Host "Now set permissions for application discovery on the publishing farm." -Foreground Yellow

Note: At this point I would recommend you access Central Admin on both the CONSUMING and PUBLISHING farms and verify that the trusts are in place as expected. To do this, from central admin select Security > Manage Trust.

Once you have verified your trusts are in place you’re ready to start sharing the service applications between farms. Now switch to the publishing farm.

Contents of 5_Publisher_SetPermissions.ps1 – run this on the PUBLISHING farm

Add-PSSnapIn "Microsoft.SharePoint.PowerShell" -EA 0

# get consumer farm id
Write-Host "Reading Consumer Farm ID…" -nonewline
$consumerId = Get-Content -path ConsumerFarmID.txt
$consumerId = $consumerId.Replace("@{Id=","").Replace("}","")
Write-Host "Done" -Foreground Green

# set application discovery permissions
Write-Host "Set Application Discovery Permissions…" -nonewline
$security=Get-SPTopologyServiceApplication | Get-SPServiceApplicationSecurity
$claimprovider=(Get-SPClaimProvider System).ClaimProvider
$principal=New-SPClaimsPrincipal -ClaimType "http://schemas.microsoft.com/sharepoint/2009/08/claims/farmid" -ClaimProvider $claimprovider -ClaimValue $consumerId
Grant-SPObjectSecurity -Identity $security -Principal $principal -Rights "Full Control"
Get-SPTopologyServiceApplication | Set-SPServiceApplicationSecurity -ObjectSecurity $security
Write-Host "Done" -Foreground Green

# list the available service applications and prompt for one to be selected
$serviceAppList = @{"0"="DummyServiceApp"}
$serviceApps = Get-SPServiceApplication
$count = 1
$serviceWarning = ""
Write-Host
Write-Host "The following service applications are available for publishing:"
foreach ($serviceApp in $serviceApps)
{
    # ensure only service applications that can be shared are listed
    $type = $serviceApp.TypeName
    $serviceSharable = 0

    Switch ($type)
    {
        ("Business Data Connectivity Service Application") {$serviceSharable = 1}
        ("Managed Metadata Service")                       {$serviceSharable = 1}
        ("User Profile Service Application")               {$serviceSharable = 1}
        ("Search Service Application")                     {$serviceSharable = 1}
        ("Secure Store Service Application")               {$serviceSharable = 1}
        ("Web Analytics Service Application")              {$serviceSharable = 1}
        ("Microsoft SharePoint Foundation Subscription Settings Service Application") {$serviceSharable = 1}
    }
    if ($serviceSharable -gt 0)
    {
        $serviceAppList.Add("$count",$serviceApp.Id)
        Write-host "$count. " -nonewline -foregroundcolor White
        Write-host $serviceApp.DisplayName -foregroundcolor gray
        $count++
    }

}
Write-Host
$serviceAppNum = Read-Host -Prompt " – Please enter the id of the service application to be shared"
Write-Host
Write-Host "Getting Service Application…" -nonewline
$serviceAppId = $serviceAppList.Get_Item($serviceAppNum)
$serviceApp = Get-SPServiceApplication $serviceAppId
Write-Host "Done" -Foreground Green

# warn about domain trusts
$serviceWarning = ""
$type = $serviceApp.TypeName
Switch ($type)
{
    ("Business Data Connectivity Service Application") {Write-Host; Write-Host "Note: Publishing domain must trust Consuming domain." -Foreground Yellow; Write-Host;}
    ("User Profile Service Application")               {Write-Host; Write-Host "Note: A two-way trust must exist between the Publishing and Consuming domains." -Foreground Yellow; Write-Host;}
    ("Secure Store Service Application")               {Write-Host; Write-Host "Note: Publishing domain must trust Consuming domain." -Foreground Yellow; Write-Host;}
}

# list the service rights for the specified service application
write-host
$rightsList = @{"0"="DummyServiceApp"}
$count = 1
$serviceAppSecurity = Get-SPServiceApplicationSecurity $serviceApp
foreach ($right in $serviceAppSecurity.NamedAccessRights)
{
    $rightsList.Add("$count",$right.Name)
    Write-host "$count. " -nonewline -foregroundcolor White
    Write-host $right.Name -foregroundcolor gray
    $count++
}
write-host
$serviceAppRight = Read-Host -Prompt " – Please enter the right to be granted"
$serviceAppRight = $rightsList.Get_Item($serviceAppRight)

Write-Host "Granting '$serviceAppRight' to service application…" -nonewline
$security=Get-SPServiceApplication $serviceApp| Get-SPServiceApplicationSecurity
$claimprovider=(Get-SPClaimProvider System).ClaimProvider

if ($type -eq "User Profile Service Application")
{
    $consumFarmAcc= Read-Host -Prompt " – Please enter the consuming farm account e.g. DOMAIN\account"
    $principal=New-SPClaimsPrincipal -Identity $consumFarmAcc -IdentityType WindowsSamAccountName   
}
else
{
    $principal=New-SPClaimsPrincipal -ClaimType "http://schemas.microsoft.com/sharepoint/2009/08/claims/farmid" -ClaimProvider $claimprovider -ClaimValue $consumerId
}
Grant-SPObjectSecurity -Identity $security -Principal $principal -Rights $serviceAppRight
Set-SPServiceApplicationSecurity $serviceApp -ObjectSecurity $security
Write-Host "Done" -Foreground Green

Write-Host "Publishing service application…" -nonewline
Publish-SPServiceApplication -Identity $serviceApp
Write-Host "Done" -Foreground Green

$lbUrl = Get-SPServiceApplication $serviceApp | Select Uri
Set-Content -path $serviceAppId -Value $lbUrl
$cleanUrl = Get-Content -path $serviceAppId
del $serviceAppId
$cleanUrl = $cleanUrl.Replace("@{Uri=","").Replace("}","")

write-Host
Write-Host "Now connect to the service application from the consumer farm with the following url:" -Foreground Yellow
write-Host
Write-Host $cleanUrl -Foreground Yellow

It’s a whopper but basically this script lists all the available service applications on the PUBLISHING farm and allows you to publish these to the CONSUMING farm whilst choosing the service application specific permissions to grant to the consuming farm:

image

Simply enter the number for the service application you wish to publish and then the number associated with the permissions you wish to grant to the consuming farm.

Now, switch to the consuming farm.

Contents of 6_Consumer_ConnectService.ps1 – run this on the CONSUMING farm

Add-PSSnapIn "Microsoft.SharePoint.PowerShell" -EA 0

# get url from user
Write-Host
Write-Host "Reading topology service url…" -nonewline
$topologyUrlShort = get-content -path PublishingFarm.Url.txt
Write-Host "Done" -Foreground Green
Write-Host

#get available published services:
Write-Host "Connecting to topology service $topologyUrlShort…" -nonewline
$publishedServices = Receive-SPServiceApplicationConnectionInfo -FarmUrl $topologyUrlShort
Write-Host "Done" -Foreground Green
Write-Host

# list the published services
Write-Host "The following service applications are available for consumption:"
$serviceAppList = @{"0"="DummyServiceApp"}
$count = 1
foreach ($publishedService in $publishedServices)
{
    Write-host "$count. " -nonewline -foregroundcolor White
    Write-host $publishedService.DisplayName -foregroundcolor gray
    $serviceAppList.Add("$count",$publishedService.Uri)
    $count++

}

Write-Host
$serviceAppNum = Read-Host -Prompt " – Please enter the id of the service application to be consumed"

Write-Host
$serviceAppProxyName= Read-Host -Prompt " – Please enter the service application proxy name"
Write-Host

#get the selected published service app
$count = 1
foreach ($publishedService in $publishedServices)
{
    if ($count.ToString() -eq $serviceAppNum )
    {

        #we’ve found our service application – let’s go create it based on the type
        $type = $publishedService.SupportingProxy
        $serviceUrl =  $serviceAppList.Get_Item($serviceAppNum)
       
        Switch ($type)
        {
            ("BdcServiceApplicationProxy"){
                    Write-Host "Creating new Business Data Connectivity Service Application Proxy…" -nonewline
                    New-SPBusinessDataCatalogServiceApplicationProxy -Uri "$serviceUrl" -Name "$serviceAppProxyName"
                }
            ("MetadataWebServiceApplicationProxy"){
                    Write-Host "Creating new Managed Metadata Service Proxy…" -nonewline
                    New-SPMetadataServiceApplicationProxy -Uri "$serviceUrl" -Name "$serviceAppProxyName"
                }
            ("UserProfileApplicationProxy"){
                    Write-Host "Creating new User Profile Service Application Proxy…" -nonewline
                    New-SPProfileServiceApplicationProxy -Uri "$serviceUrl" -Name "$serviceAppProxyName"
                }
            ("SearchServiceApplicationProxy"){
                    Write-Host "Creating new Search Service Application Proxy…" -nonewline
                    New-SPEnterpriseSearchServiceApplicationProxy -Uri "$serviceUrl" -Name "$serviceAppProxyName"
                }
            ("SecureStoreServiceApplicationProxy"){
                    Write-Host "Creating new Secure Store Service Application Proxy…" -nonewline
                    New-SPSecureStoreServiceApplicationProxy -Uri "$serviceUrl" -Name "$serviceAppProxyName"
                }
            ("WebAnalyticsServiceApplicationProxy"){
                    Write-Host "Creating new Web Analytics Service Application Proxy…" -nonewline
                    New-SPWebAnalyticsServiceApplicationProxy -Uri "$serviceUrl" -Name "$serviceAppProxyName"
                }
        }
        Write-Host "Complete." -Foreground Yellow

    }
    $count++

}

This final script connects to the topology service of the publishing farm and lists all the published service applications. Simply select the number of the service application you wish to consume:

image

This has saved me loads of time in the past and has proved to be very reliable. However, please note the following points:

  • The script assumes the files copied between the farms are copied to the same location as the PowerShell scripts.
  • If you want to consume partitioned service applications you’ll need to update the final script to include the -PartitionMode switch (or the -Partitioned switch in the case of the New-SPEnterpriseSearchServiceApplicationProxy cmdlet)

I hope this helps…

Update SharePoint Timer Job Progress Programmatically

Often when developing custom timer jobs, it can be very useful to provide feedback in central administration about the progress your job is making. Most of the out-of-the-box timer jobs provide this progress feedback:

image

Extending your custom timer jobs to support this progress bar is super easy, with just one line of code:

image

The SPJobDefinition.UpdateProgress method is used to provide SharePoint with the percentage completeness of your timer job's progress. It takes a simple int parameter with a value between 0 and 100.
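For example, a sketch of an Execute method reporting progress as it works through a known number of items (the work itself is a placeholder):

```csharp
using System;
using Microsoft.SharePoint.Administration;

public class ProgressReportingJob : SPJobDefinition
{
    public override void Execute(Guid targetInstanceId)
    {
        int itemCount = 200; // placeholder for however much work the job has
        for (int i = 0; i < itemCount; i++)
        {
            // ... do one unit of work here ...

            // Report percentage complete (0-100) back to SharePoint.
            this.UpdateProgress((i + 1) * 100 / itemCount);
        }
    }
}
```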

Now the hard part: how to calculate your actual (and accurate) progress value – you might find this article useful: http://en.wikipedia.org/wiki/Wikipedia:Reference_desk/Archives/Miscellaneous/2008_July_1#how_long_is_a_piece_of_string.3F

Enjoy!

Adding additional claims to a Trusted Identity Token Issuer

In my first blog post about setting up claims based authentication between the Thinktecture identity server and SharePoint I showed how to create a basic token that contains a single claim – emailaddress.

Here is how you can extend the claims that SharePoint will accept in a token. I'm assuming you've set up claims based authentication as per my previous article.

First, we get a reference to the trusted identity token issuer we created:

$ap = Get-SPTrustedIdentityTokenIssuer | where {$_.Name -eq "idp SAML Provider"  }

Next we extend this to include our new claim – role:

$ap.ClaimTypes.Add("http://schemas.microsoft.com/ws/2008/06/identity/claims/role")
$ap.Update()

Next we create our claim mapping:

$map1 = New-SPClaimTypeMapping -IncomingClaimType "http://schemas.microsoft.com/ws/2008/06/identity/claims/role" -IncomingClaimTypeDisplayName "Role" -SameAsIncoming

Finally we add this mapping to our trusted identity provider:

Add-SPClaimTypeMapping -Identity $map1 -TrustedIdentityTokenIssuer $ap

If we query our trusted identity token issuer again we should see the additional claim:

image

Finally, logging onto our claims based authenticated site, we should see our new claim, courtesy of the claims viewer web part I installed from the codeplex project http://claimsid.codeplex.com/:

image

Enjoy!