SharePoint Logging Database Size on Development Environment

December 21, 2016

I have noticed on a few different development environments that the logging database can grow significantly, and on a development machine with limited resources this can cause some issues.

There are lots of articles out there which provide step-by-step guides, and on the whole these are very useful.

I was reviewing some of these articles, but I noticed that even after I followed the process to establish which database tables were taking up the space and reduced the logging detail, my database still wouldn't shrink to a sensible size. I checked the settings and ran the two timer jobs time and again, but it didn't make any difference.

I tried changing the retention to lower and lower values, eventually setting it to 1 day, but regardless the database size never changed. When reviewing all the settings by running the Get-SPUsageDefinition command I noticed the item I was updating the retention period for was actually disabled. I ran the command below to enable the category:

Set-SPUsageDefinition -Identity GUID/NAME -Enable

After this I ran the two usage timer jobs (although I'm guessing you probably only need to run the processing one), and this time when I selected to shrink the database in SQL there were GBs of free space.

So it looks like when SharePoint is doing the processing for the categories it will skip any that are disabled. This does make sense, but it is important to consider: if you have a lot of old data and disable a particular category for whatever reason, the data will stay in the logging database.
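The steps above can be sketched in PowerShell, run from the SharePoint Management Shell. The category name and retention value here are illustrative assumptions, and the internal timer job names are the ones commonly used on SharePoint farms; adjust for your environment:

```powershell
# List all usage definitions with their enabled state and retention (days)
Get-SPUsageDefinition | Select-Object Name, Enabled, DaysRetained

# Enable a disabled category and reduce its retention period.
# "AnalyticsUsage" is an illustrative name - use one from the list above.
Set-SPUsageDefinition -Identity "AnalyticsUsage" -Enable
Set-SPUsageDefinition -Identity "AnalyticsUsage" -DaysRetained 3

# Run the two usage timer jobs so the change is processed
Get-SPTimerJob "job-usage-log-file-import" | Start-SPTimerJob
Get-SPTimerJob "job-usage-processing" | Start-SPTimerJob
```

Once the processing job has completed, the freed space should be visible when shrinking the database in SQL.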

Another day and another SharePoint quirk found.


How to add new properties to the workflow service configuration

February 5, 2015

While investigating an issue I encountered when running a SharePoint 2013 workflow (see blog post for details), several of the fixes I found suggested adding a new property to the workflow configuration settings.

I had a look at the MSDN article on how to do this, but the steps seem to indicate you can only set existing values, not add new ones; see figure 1 for a snippet from the MSDN article.

 “For Name, specify a name of the configuration setting (See below for the list of valid settings.)”.

Figure 1

I had a look around for how to add new settings but I wasn't able to find anything, apart from people adding rows directly into the "WorkflowServiceConfig" table in the "WFResourceManagementDB" database. Anyone who has used SharePoint knows that modifying a SharePoint database directly is not supported and puts the farm into an unsupported state.

As it turns out I was able to resolve my issue by updating an existing property, so I didn't need to do this in the end. After I had worked out which property I needed to update, which I had done via the PowerShell command supplied in the MSDN article, I thought it would be a good idea to create a wrapper function in a common PowerShell file I keep. I have several helper methods in this file and I use it to avoid having to remember specific PowerShell commands.

I started to implement this helper function and was adding some error handling to see what would happen when a property name was supplied that didn't exist. I ran the helper method with an invalid name and it completed without any errors. I was expecting some kind of error message, so I was slightly confused by this. I changed the name and debugged through the PowerShell (I use PowerGUI when writing scripts and it has this feature). When the script called the "Set-WFServiceConfiguration" function it completed with no errors.

I opened up the workflow table where the settings are stored and to my surprise the two test values I had tried were in the table.

I thought it was a bit strange that Microsoft would allow you to update properties but not add them; however, the wording on the MSDN article does clearly suggest the name has to be one of the existing values. Added to this, in all the articles I found people were adding rows directly into the database table.

I have included the function I wrote below, see figure 2, in case anyone finds it useful, but as always with PowerShell try this in a non-production environment first.

Function to add property
function Set-SP2013WorkflowSetting{
    [cmdletbinding()]
    Param(
        [parameter(Mandatory=$true)][string]$workflowManagerURL,
        [parameter(Mandatory=$true)][string]$propertyName,
        [parameter(Mandatory=$true)][string]$propertyValue)

    Write-Host "Starting to update workflow manager property $propertyName" -ForegroundColor Green

    $existingValue = Get-WFServiceConfiguration -serviceUri $workflowManagerURL -Name $propertyName

    if($existingValue -eq $null)
    {
        Write-Host "Property $propertyName doesn't have a value which means it isn't in the settings DB table. Please confirm you want to add a new setting" -ForegroundColor Magenta
        $title = "Add Workflow Setting"
        $message = "Do you want to add a new workflow setting?"

        $yes = New-Object System.Management.Automation.Host.ChoiceDescription "&Yes", `
            "Adds a new row into the settings table."

        $no = New-Object System.Management.Automation.Host.ChoiceDescription "&No", `
            "Doesn't add the new setting and exits the function."

        $options = [System.Management.Automation.Host.ChoiceDescription[]]($yes, $no)

        # Default to "No" so a new row is only added deliberately
        $result = $host.ui.PromptForChoice($title, $message, $options, 1)

        switch ($result)
        {
            0 {}
            1 {return}
        }
    }
    else
    {
        Write-Host "Property $propertyName has a value of $existingValue, it will be changed to $propertyValue" -ForegroundColor Green
    }

    Set-WFServiceConfiguration -serviceUri $workflowManagerURL -Name $propertyName -Value $propertyValue

    Write-Host "Finished updating workflow manager property $propertyName" -ForegroundColor Green
}

Figure 2
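As a sketch of how the function above is called, for example to raise the workflow instance size limit (the workflow manager URL and value here are illustrative assumptions):

```powershell
Set-SP2013WorkflowSetting -workflowManagerURL "http://workflowmanager:12291/" `
    -propertyName "WorkflowServiceMaxInstanceSizeKB" `
    -propertyValue "30720"
```

If the property doesn't already have a value you will be prompted before a new row is added to the settings table.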

Updates needed to get “Navigation options for SharePoint Online” MSDN article to work

January 20, 2015

My company was working on a project with a SharePoint Online implementation. While developing the site it was noticed that the performance wasn’t up to the required level.

My colleagues and I did some investigation and the results seemed to indicate the slow load times were down to the time taken for the response to come back from the server.

We all tried different approaches to try and track down which elements were causing the poor performance by changing the master pages and adding/removing components.

The main issue seemed to relate to the navigation options on the site. Our initial setup was to create the site using structural navigation, as this is easy to use and is security trimmed.

When the top navigation was removed we noticed significantly better page load times. After it was identified that the top navigation was having an impact, it was changed from structural to managed metadata navigation, and while this did help, it doesn't security trim the navigation and requires the term set to be updated when new sites are added.

After much searching I finally found an MSDN article which backed up the top navigation potentially being a performance problem. The article suggested managed metadata as an alternative, but we had already ruled this out. It did provide a third approach which I had not previously considered: a search-driven approach.

After reading through this I thought it sounded like a good solution. The only downsides were the limited control over the ordering of the top navigation and the fact it renders the navigation using JavaScript, which obviously puts additional load on the client.

I decided to implement this approach and then compare the page load times of the structural and search-driven navigation to see if there was any significant difference. Reading the article I thought this would be a five-minute change, but in the end it took me half the morning. Hopefully this blog will provide a more detailed set of steps for anyone looking to implement this approach.

Steps 1 to 4 simply supply instructions on how to download a copy of one of the existing master page files. Steps 5 and 6 provide details on how to update the downloaded master page and re-upload it to SharePoint under a different name.

All these steps so far are fine; however, it was after this that I started noticing some issues. The first issue is that nowhere in the article does it actually tell you to add the JavaScript anywhere to the site. This may seem obvious, but for completeness I wanted to highlight it. In my case I copied the JavaScript, made the change to the 'root' variable at the top of the file and then uploaded it to a folder in the Style Library, which can be found at the root of the site.

With the JavaScript file in place I then added a reference to it in the head section of the custom master page, see figure 1

<script type="text/javascript" src="/sites/Intranet/Style%20Library/CN.Intranet/js/SearchTopNavigation.js"></script>

Figure 1

Steps 7 and 8 detail the change needed to update the root site URL and also describe in detail how the JavaScript works. This is where another of the additional steps is required. As part of step 7 it says "This data is then parsed into the previously defined SPO.Models.NavigationNode objects which use Knockout.js to create observable properties for use by data binding the values into the HTML that we defined earlier". The key point here is that the code uses Knockout.js, so I needed to add a reference to this in my master page as well. To do this you can either download a copy or point the reference at a CDN. I always download a copy, as then you are in control of the version and the availability of the content. I downloaded version 3 from the Knockout website, uploaded it to the same folder in the Style Library and added another reference in my custom master page, see below for reference.

<script type="text/javascript" src="/sites/Intranet/Style%20Library/CN.Intranet/js/JQ/knockout-3.0.0.js"></script>

Also in step 7 there is another line which is easy to skip over and which requires another JavaScript reference: "Next, the results are assigned to the self.nodes array and a hierarchy is built out of the objects using linq.js assigning the output to an array self.heirarchy.". The key here is that it uses linq.js to create the object hierarchy so it can build the navigation. I had a Google for this and found a CodePlex site. It looked like the correct plugin, so I downloaded the solution, took the minified version 'linq.min.js', uploaded it into the same folder in the Style Library and added another reference to my custom master page, see figure 3.

<script type="text/javascript" src="/sites/CHASIntranet/Style%20Library/CN.Intranet/js/JQ/linq.min.js"></script>

Figure 3

At this point I thought that would be all the changes I needed, however when I loaded the site I started getting a JavaScript error, see figure 4.

“object doesn’t support property or method “byhierarchy””

Figure 4

I then spent a while debugging through the code to see if I could identify what was happening, and it seemed to be in the 'buildHierarchy' function where it was calling 'ByHierarchy' on the collection of navigation items. I checked and there were items being passed to the function.

I looked around and tried the jQuery version of the Linq JS library which comes as part of the download, jquery.linq.min.js, but this didn't work either. I then had a Google for the error and found the source control repository where the "ByHierarchy" functionality was added to linq.js. The download provides a Visual Studio project, so I opened this and searched for "ByHierarchy" and found references in "jquery.linq.js" and "linq.js". It took me a minute to realise, but I was using the minified versions to improve performance and it looks like the new function was never added to the minified versions. I double-checked this by opening the minified version I had uploaded to SharePoint and searching for "ByHierarchy", but nothing was returned.
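A quick way to run the same check from the command line is PowerShell's Select-String (the file paths are illustrative assumptions; point them at your downloaded copies):

```powershell
# Returns nothing if the minified build is missing the custom function
Select-String -Path .\linq.min.js -Pattern "ByHierarchy"

# The unminified source should return matches
Select-String -Path .\linq.js -Pattern "ByHierarchy"
```

If the first command returns no matches while the second does, the minified build is out of date with the source.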

I uploaded "linq.js" to the Style Library, updated the reference in my master page and tested the site again. While the JavaScript error was gone, the navigation was not displaying; all I could see was the navigation div being rendered with no content, see figure 5.

Generated Top Nav
<div style="" id="navContainer">
    <div class="noindex ms-core-listMenu-horizontalBox" data-bind="foreach: hierarchy">
        <ul style="padding-left:20px" data-bind="foreach: $data.children" id="menu"></ul>
    </div>
</div>

Figure 5

I checked and there were no JavaScript errors in the console in IE, so I started debugging through the code. I could see items were being returned as part of the query and NavigationNode objects were getting created, but I couldn't see what was missing. In the end I got a colleague of mine to double-check, and he noticed that when building up the NavigationNode objects in the "getNavigationFromDto" function, the parent property was being set to the content type ID, not the parent site URL.

After checking the values available, the parent site URL seemed to be stored at index 20, so I updated my version of the code, see figure 6.

Updated JavaScript Function
//Parses a local object from a JSON search result.
function getNavigationFromDto(dto) {
    var item = new SPO.Models.NavigationNode();
    if (dto != undefined) {
        item.Title(dto.Cells.results[3].Value);
        item.Url(dto.Cells.results[6].Value);
        // Index 20 holds the parent site URL (the sample used the content type ID)
        item.Parent(dto.Cells.results[20].Value);
    }

    return item;
}

Figure 6

When I then tested the site the navigation started working. I was very relieved at this point, but I wanted to make sure everything was functioning as expected, so I attached a breakpoint in IE and started to step through the code, and this is where I found my final error. In the sample supplied in the MSDN article the "NavigationViewModel" function gets the data either from the localStorage cache or by executing the search API query; however, in the sample code the check implemented is hard-coded to false, so it never retrieves from local storage, see line 2 in figure 7 below.

Retrieve Navigation Function
  1. var fromStorage = localStorage["nodesCache"];
  2. if (false) {
  3.     var cachedNodes = JSON.parse(localStorage["nodesCache"]);
  4.     var timeStamp = localStorage["nodesCachedAt"];
  5.     if (cachedNodes && timeStamp) {
  6.         //Check for cache expiration. Currently set to 3 hrs.
  7.         var now = new Date();
  8.         var diff = now.getTime() - timeStamp;
  9.         if (Math.round(diff / (1000 * 60 * 60)) < 3) {
  10.            //return from cache.
  11.            var cacheResults = [];
  12.            $.each(cachedNodes, function (i, item) {
  13.                var nodeitem = getNavigationFromCache(item, true);
  14.                cacheResults.push(nodeitem);
  15.            });
  16.            self.buildHierarchy(cacheResults);
  17.            self.toggleView();
  18.            addEventsToElements();
  19.            return;
  20.        }
  21.    }
  22. }
  23. //No cache hit, REST call required.
  24. self.queryRemoteInterface();

Figure 7

I updated this test to check the value of the fromStorage variable, see figure 8, and after some testing the code started pulling the results from my local storage instead of executing the query every time.

Retrieve Navigation Function 2
  1. var fromStorage = localStorage["nodesCache"];
  2. if (fromStorage != null) {
  3.     var cachedNodes = JSON.parse(localStorage["nodesCache"]);
  4.     var timeStamp = localStorage["nodesCachedAt"];
  5.     if (cachedNodes && timeStamp) {
  6.         //Check for cache expiration. Currently set to 3 hrs.
  7.         var now = new Date();
  8.         var diff = now.getTime() - timeStamp;
  9.         if (Math.round(diff / (1000 * 60 * 60)) < 3) {
  10.            //return from cache.
  11.            var cacheResults = [];
  12.            $.each(cachedNodes, function (i, item) {
  13.                var nodeitem = getNavigationFromCache(item, true);
  14.                cacheResults.push(nodeitem);
  15.            });
  16.            self.buildHierarchy(cacheResults);
  17.            self.toggleView();
  18.            addEventsToElements();
  19.            return;
  20.        }
  21.    }
  22. }
  23. //No cache hit, REST call required.
  24. self.queryRemoteInterface();

Figure 8

The final issue with this was noticed by one of my colleagues, Tal Peer. Our development environment was a basic site collection with some team sites; however, when this was deployed to the production site collections, Tal noticed the top navigation was returning additional items. It turns out that the search query will also return the app webs which are created as part of adding apps to the site. These webs are generally hidden from the end user, so they shouldn't appear in the top navigation. Tal was able to exclude these app webs by adding "-WebTemplate:APP" to the search query; the full query can be seen in figure 9.
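The exact query used on our site isn't reproduced above, but as a hypothetical illustration (the path is an example value), a KQL query of this shape returns the webs for the navigation while excluding app webs:

```
contentclass:STS_Web Path:"https://tenant.sharepoint.com/sites/Intranet/*" -WebTemplate:APP
```

The "-" prefix negates a property restriction in KQL, so results whose WebTemplate is APP are filtered out of the navigation.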

Hopefully this helps anyone else trying to implement this functionality. Loading the navigation from the search index is a very good idea, and the basis of the code is there, but I found these additional steps/issues very frustrating.

Happy SharePointing 🙂


SharePoint 2013 Workflow Instance Size Error

January 6, 2015

While working on a recent project I got the chance to setup and use SharePoint 2013 workflows. This is a major step forward for the product and has a lot of benefits, however it does introduce a few complications.

As the workflows are now all stored and processed outside of SharePoint, all communication is done via WCF services. This is great as it moves the processing out of SharePoint and thus reduces the load on the SharePoint servers.

The downside of this is it introduces additional areas where something can go wrong and this is what leads me onto the topic of this blog post.

The workflow I was working on handled the approval of data on a SharePoint list item. In order to move onto the next stage it required 5 groups of users to approve the data; the workflow had to wait until all 5 groups had completed and approved, and once approved it would move onto the next stage and send some emails.

The UI to handle the approval locked down the form so only users in certain groups could actually approve each section, so in most cases a user would come in and approve 1 of the 5 sections, and this worked fine. There was also an admin group who could approve all 5 sections at one time. It was in this case that an error was thrown by the workflow, causing it to be terminated, see figure 1.


Figure 1

When clicking on the workflow I could see the internal status was ‘Terminated’ and when clicking on the information icon I could see the error was “System.Activities.Statements.WorkflowTerminatedException: The workflow instance was too large to persist (over 5120 kilobytes). The maximum persistable workflow instance size is 5120 kilobytes.”, see figure 2.


Figure 2

For SharePoint errors this is actually reasonably detailed, as it supplies a message explaining what the issue is instead of the usual "something went wrong" error. As this was my first real experience of SharePoint 2013 workflows I didn't really know where to start debugging, so I had a Google around.

There are a few articles out there but nothing which matched what I was looking for. The closest I came was an article which suggested changing a server configuration property called ‘WorkflowServiceMaxActivityXamlSizeInBytes’.

I had a look into this and discovered the property can be updated using a PowerShell command called 'Set-WFServiceConfiguration', see the MSDN article. I tried to increase this value but unfortunately it didn't seem to make any difference. I looked over the additional properties which can be set and found one called 'WorkflowServiceMaxInstanceSizeKB' which matched the error I was getting, both in the description and in that the maximum size in the error message matched the default value on the MSDN article. I tried changing this using the PowerShell command, but again the workflow was still failing with the same error.

Given that these properties were exposed, Microsoft obviously expected they might need to be changed in certain circumstances, and the 'WorkflowServiceMaxInstanceSizeKB' property matched the error I was encountering so closely, so I did some more investigation and found there was another property called 'WorkflowServiceMaxInstanceCompressedSizeKB' which also had the same default value of 5120.

Unfortunately, in all the articles referring to this, people were adding a row directly into a table in one of the workflow databases. Anyone who knows SharePoint knows touching the SharePoint databases is not supported, so I was reluctant to do this; however, since it was a development environment I tried it, but I still encountered the same error.

It wasn't until I came across an article describing a slightly different error that I noticed, down at the bottom of the article, that you have to restart the 'Workflow Manager Backend' Windows service in order for settings changes to be picked up. After I did this the workflow started working.

I then had to do some testing to check which of the properties I had changed was the one which fixed the error. In my case it turns out I only needed the 'WorkflowServiceMaxInstanceSizeKB' property, which could be set via PowerShell, see figure 3 for the exact script, so I didn't have to worry about adding a value directly into the workflow database.

Set-WFServiceConfiguration -ServiceUri http://workflowmanagerurl:12291/ -Name "WorkflowServiceMaxInstanceSizeKB" -Value 30720

Figure 3

Overall the process of changing the workflow settings is very easy using the available PowerShell functions; the key is to remember to restart the 'Workflow Manager Backend' Windows service so the updated values are picked up.
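The restart can be scripted as well, run on the Workflow Manager server. This is a sketch; the service is identified by its display name as shown in the Services console, and the workflow manager URL is an illustrative assumption:

```powershell
# Restart the backend service so updated configuration values are picked up
Restart-Service -DisplayName "Workflow Manager Backend"

# Verify the new value has taken effect
Get-WFServiceConfiguration -ServiceUri http://workflowmanagerurl:12291/ -Name "WorkflowServiceMaxInstanceSizeKB"
```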


SharePoint error The URL ‘Pages/default.aspx’ is invalid

November 28, 2013

On a recent project I was working on an Intranet which had the publishing infrastructure feature enabled. I was working on setting up some pages and configuring web parts when I noticed an error on saving or checking in a page, see figure 1 below.

The URL ‘Pages/default.aspx’ is invalid. It may refer to a nonexistent file or folder, or refer to a valid file or folder that is not in the current Web.

Figure 1

As with most errors in SharePoint, I cracked open the ULS logs. It's slightly more difficult to find relevant data in the ULS logs without a correlation ID, but after searching for the error text I was able to locate the ULS details for the page request. Unfortunately there were no further details around what was causing the error.

I then turned to Google and had a look for the error message. I found quite a few articles, but pretty much all of these related to database issues; most talked about space on the database server or databases being read-only. However, none of this was relevant to me.

At this point I couldn't find any details in the ULS logs or online, so I decided to start stripping my solution back component by component to see at which point the error occurred. As the site was still in development there were a lot of steps to replicate. I started by spinning up a new web application and then applying the changes one by one. After each change I created a new page and edited an existing page to test whether the site was broken.

In the end it turns out the issue was a PowerShell script which was being run to create some site columns. This had been created to quickly provision site columns which are commonly used across most projects.

Once I had identified the source of the issue I then had to figure out whether a particular site column was causing it. I started commenting out sections of the script and was able to narrow it down to a particular column. The problematic site column was a calculated column, and as soon as I commented it out and deleted the column the site started to work again. I'm not sure why it was causing an issue, but the column can easily be created manually, so I removed it from my script.

Hopefully this helps others, as all the articles I found online pointed to database issues; keep in mind it could be a corrupted site column as well.

Happy SharePointing 🙂


Error editing publishing page in SharePoint

November 25, 2013

While working on a recent project I was navigating our development site, and when I tried to edit a page I got the standard SharePoint error screen, see figure 1. There were very few customisations on the site as it was still in the early stages of development, so I was slightly confused as to what the issue could be.


Figure 1

As with most SharePoint errors the easiest way to get to the bottom of the issue is to check the ULS logs so I copied the correlation ID and opened up ULS Viewer on the server. I opened the latest ULS log file and filtered by the correlation ID. Looking through the log file I finally found the details of the error, see Figure 2.

Application error when access /Pages/default.aspx, Error=Index was out of range. Must be non-negative and less than the size of the collection. Parameter name: index
at Microsoft.SharePoint.SPFieldMultiColumnValue.get_Item(Int32 index)

System.ArgumentOutOfRangeException: Index was out of range. Must be non-negative and less than the size of the collection. Parameter name: index
at Microsoft.SharePoint.SPFieldMultiColumnValue.get_Item(Int32 index)

Getting Error Message for Exception System.Web.HttpUnhandledException (0x80004005): Exception of type ‘System.Web.HttpUnhandledException’ was thrown. —> System.InvalidOperationException: Failed to compare two elements in the array. —> System.ArgumentOutOfRangeException: Index was out of range. Must be non-negative and less than the size of the collection. Parameter name: index
at Microsoft.SharePoint.SPFieldMultiColumnValue.get_Item(Int32 index)

Figure 2

Looking over the details it wasn't obvious what the actual error was, but it seemed to point to an issue with the page layouts, so I decided to review all the custom ones to see if there was anything obviously wrong. I opened the site in the browser and navigated to the master page gallery. When I tried to edit one of the custom page layouts I got an error, see figure 3.


Figure 3

The error seemed to be highlighting an issue with the content type the page layout was associated with, so I returned to the master page gallery and hovered over the associated content type link (see the right-hand column in figure 4). I could see the URL at the bottom of the page was "_layouts/15/ManageContentType.aspx?ctype=#VALUE!", whereas it should contain the ID of the content type.

 


Figure 4

Normally this would not be an issue with page layouts uploaded via the browser or SharePoint designer, however in my case the custom page layouts had been uploaded into SharePoint via PowerShell.

I checked the PowerShell script and could see the original version uploaded the page layout into the correct location; however, it was setting the associated content type property of the page layout to a plain string, see figure 5. I knew from previous experience this needed to be a concatenated string, but I couldn't remember the exact format, so I quickly put together a test script which got the value from an OOTB page layout. Using this script I could see the value actually has to be a concatenated string of the content type name and ID, see figure 6 for the updated PowerShell.

Incorrect version
$newFile.Item["PublishingAssociatedContentType"] = "Article Page"

Figure 5
Correct version
$newFile.Item["PublishingAssociatedContentType"] = ";#Article Page;#0x010100C568DB52D9D0A14D9B2FDCC96666E9F2007948130EC3DB064584E219954237AF3900242457EFB8B24247815D688C526CD44D;#"

Figure 6
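The test script mentioned above can be sketched as follows; the site URL and page layout file name are illustrative assumptions (any OOTB publishing layout will do):

```powershell
# Read the PublishingAssociatedContentType value from an out-of-the-box page layout
# to see the expected ";#Name;#ID;#" format
$web = Get-SPWeb "http://intranet/sites/Intranet"
$layout = $web.GetFile("_catalogs/masterpage/ArticleLeft.aspx")
$layout.Item["PublishingAssociatedContentType"]
$web.Dispose()
```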

As the site was still in the early stages of development I was able to easily delete all pages which used the custom page layouts, delete the page layouts and re-upload them with the corrected script. I then double-checked the associated content type link in the master page gallery, see figure 7, and it was correctly populated. After that I was able to edit pages and content within the site.

Figure 7

Hopefully this saves some other people some time. I suppose the key lesson would be, as always, be careful with PowerShell and double check it's actually doing what you expect.


