SharePoint Logging Database Size on Development Environment

December 21, 2016

I have noticed on a few different development environments that the logging database can grow significantly, and on a development machine with limited resources this can cause some issues.

There are lots of articles out there which provide step by step guides, and on the whole these are very useful; I have included a few below:

I was reviewing some of these articles, but I noticed that even after I followed the process to establish which database tables were taking up the space and reduced the logging detail, my database still wouldn't shrink to a sensible size. I checked the settings and ran the two timer jobs time and again, but it didn't make any difference.

I tried lower and lower retention values and eventually set it to 1 day, but regardless, the database size never changed. When reviewing all the settings by running the Get-SPUsageDefinition command, I noticed that the item I was updating the retention period for was actually disabled. I ran the command to enable the category, see below:

Set-SPUsageDefinition -Identity GUID/NAME -Enable

After this I ran the two usage timer jobs (although I'm guessing you probably only need to run the processing one), and this time when I went to shrink the database in SQL Server there were GBs of free space to reclaim.
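For reference, this is roughly the full sequence I ended up running. It is a minimal sketch only; the category name is illustrative rather than the specific definition I had to enable, and the retention value is just an example.

# List every usage definition with its retention period and enabled state
Get-SPUsageDefinition | Select-Object Name, DaysRetained, Enabled

# Enable the disabled category and reduce its retention (illustrative name/value)
Set-SPUsageDefinition -Identity "Page Requests" -Enable
Set-SPUsageDefinition -Identity "Page Requests" -DaysRetained 3

# Start the usage data import and processing timer jobs
Get-SPTimerJob | Where-Object { $_.Name -like "job-usage*" } | Start-SPTimerJob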

So it looks like when SharePoint does the processing for the categories it will skip any that are disabled. This does make sense, but it's important to consider: if you have a lot of old data and disable a particular category for whatever reason, that data will stay in the logging database.

Another day and another SharePoint quirk found.


Updates needed to get “Navigation options for SharePoint Online” MSDN article to work

January 20, 2015

My company was working on a project with a SharePoint Online implementation. While developing the site we noticed that the performance wasn't up to the required level.

My colleagues and I did some investigation, and the results seemed to indicate the slow load times were down to the response coming from the server.

We all tried different approaches to track down which elements were causing the poor performance, changing the master pages and adding/removing components.

The main issue seemed to relate to the navigation options on the site. Our initial setup used structural navigation, as this is easy to use and is security trimmed.

When the top navigation was removed we noticed significantly better page load times. Once the top navigation was identified as having an impact it was changed from structural to managed metadata; while this did help, it doesn't security trim the navigation and it requires the term set to be updated when new sites are added.

After much searching I finally found an MSDN article which backed up the top navigation potentially being a performance problem. The article suggested managed metadata as an alternative, but we had already ruled this out. It did, however, provide a third approach which I had not previously considered: a search driven approach.

After reading through this I thought it sounded like a good solution. The only downsides were the limited ability to order the top navigation and the fact it renders the navigation using JavaScript, which obviously puts additional load on the client.

I decided to implement this approach and then compare the page load times of the structural and search driven navigation to see if there was any significant difference. Reading the article I thought this would be a 5 minute change, but in the end it took me half the morning. Hopefully this blog will provide a more detailed set of steps for anyone looking to implement this approach.

Steps 1 to 4 simply provide instructions on how to download a copy of one of the existing master page files. Steps 5 and 6 provide details on how to update the downloaded master page and re-upload it to SharePoint under a different name.

All these steps so far are fine; however, it was after this that I started noticing some issues. The first issue is that nowhere does the article actually tell you to add the JavaScript to the site. This may seem obvious, but for completeness I wanted to highlight it. In my case I copied the JavaScript, made the change to the ‘root’ variable at the top of the file, and then uploaded it to a folder in the Style library, which can be found at the root of the site.

With the JavaScript file in place I then added a reference to it in the head section of the custom master page, see figure 1.

<script type="text/javascript" src="/sites/Intranet/Style%20Library/CN.Intranet/js/SearchTopNavigation.js"></script>

Figure 1

Steps 7 and 8 detail the change needed to update the root site URL and also describe in detail how the JavaScript works. This is where another of the additional steps is required. As part of step 7 it says: “This data is then parsed into the previously defined SPO.Models.NavigationNode objects which use Knockout.js to create observable properties for use by data binding the values into the HTML that we defined earlier”. The key point is that the code uses Knockout.js, so I needed to add a reference to this in my master page as well. To do this you can either download a copy or point the reference at a CDN; I always download a copy, as then you are in control of the version and the availability of the content. I downloaded version 3 from the Knockout website, uploaded it to the same folder in the style library and added another reference in my custom master page, see figure 2.

<script type="text/javascript" src="/sites/Intranet/Style%20Library/CN.Intranet/js/JQ/knockout-3.0.0.js"></script>

Figure 2

Also in step 7 there is another line which is easy to skip over but requires a further JavaScript reference: “Next, the results are assigned to the self.nodes array and a hierarchy is built out of the objects using linq.js assigning the output to an array self.heirarchy.”. The key here is that it uses linq.js to create the object hierarchy so it can build the navigation. I had a Google for this and found a CodePlex site. It looked like the correct plugin, so I downloaded the solution, took the minified version ‘linq.min.js’, uploaded it into the same folder in the style library and added another reference to my custom master page, see figure 3.

<script type="text/javascript" src="/sites/CHASIntranet/Style%20Library/CN.Intranet/js/JQ/linq.min.js"></script>

Figure 3

At this point I thought that would be all the changes I needed, however when I loaded the site I started getting a JavaScript error, see figure 4.

“object doesn’t support property or method “byhierarchy””

Figure 4

I then spent a while debugging through the code to see if I could identify what was happening, and it seemed to be in the ‘buildHierarchy’ function, where it was calling ‘ByHierarchy’ on the collection of navigation items. I checked and there were items being passed to the function.

I looked around and tried the jQuery linq.js version which comes as part of the download, jquery.linq.min.js, but this didn't work either. I then had a Google for the error and found the source control commit where the “ByHierarchy” functionality was added to linq.js. The download provides a Visual Studio project, so I opened this and searched for “ByHierarchy”; I found references in “jquery.linq.js” and “linq.js”. It took me a minute to click: I was using the minified versions to improve performance, but it looks like the new function was never added to the minified files. I double checked this by opening the minified version I had uploaded to SharePoint and searching for “ByHierarchy”, and nothing was returned.

I uploaded “linq.js” to the style library, updated the reference in my master page and tested the site again. While the JavaScript error was gone, the navigation was not displaying; all I could see was the navigation div being rendered with no contents, see figure 5.

Generated Top Nav
<div style="" id="navContainer">
    <div class="noindex ms-core-listMenu-horizontalBox" data-bind="foreach: hierarchy">
        <ul style="padding-left:20px" data-bind="foreach: $data.children" id="menu"></ul>
    </div>
</div>

Figure 5

I checked and there were no JavaScript errors in the console in IE, so I started debugging through the code. I could see items being returned as part of the query and NavigationNode objects being created, but I couldn't see what was missing. In the end I got a colleague of mine to double check, and he noticed that when building up the NavigationNode objects in the “getNavigationFromDto” function, the parent property was being set to the content type ID, not the parent site URL.

After checking the values available, the parent site URL seemed to be stored at index 20, so I updated my version of the code, see figure 6.

Updated JavaScript Function
//Parses a local object from a JSON search result.
function getNavigationFromDto(dto) {
    var item = new SPO.Models.NavigationNode();
    if (dto != undefined) {
        item.Title(dto.Cells.results[3].Value);
        item.Url(dto.Cells.results[6].Value);
        item.Parent(dto.Cells.results[20].Value);
    }

    return item;
}

Figure 6

When I then tested the site the navigation started working. I was very relieved at this point, but I wanted to make sure everything was functioning as expected, so I attached a breakpoint in IE and started to step through the code, and this is where I found my final error. In the sample supplied in the MSDN article the “NavigationViewModel” function gets the data either from the localStorage cache or by executing the search API query; however, in the sample code the check implemented is always false, so it never retrieves from local storage, see line 2 in figure 7 below.

Retrieve Navigation Function
  1. var fromStorage = localStorage["nodesCache"];
  2. if (false) {
  3.     var cachedNodes = JSON.parse(localStorage["nodesCache"]);
  4.     var timeStamp = localStorage["nodesCachedAt"];
  5.     if (cachedNodes && timeStamp) {
  6.         //Check for cache expiration. Currently set to 3 hrs.
  7.         var now = new Date();
  8.         var diff = now.getTime() - timeStamp;
  9.         if (Math.round(diff / (1000 * 60 * 60)) < 3) {
 10.             //return from cache.
 11.             var cacheResults = [];
 12.             $.each(cachedNodes, function (i, item) {
 13.                 var nodeitem = getNavigationFromCache(item, true);
 14.                 cacheResults.push(nodeitem);
 15.             });
 16.             self.buildHierarchy(cacheResults);
 17.             self.toggleView();
 18.             addEventsToElements();
 19.             return;
 20.         }
 21.     }
 22. }
 23. //No cache hit, REST call required.
 24. self.queryRemoteInterface();

Figure 7

I updated this test to check the value of the fromStorage variable, see figure 8, and after some testing the code started pulling the results from my local storage instead of executing the query every time.

Retrieve Navigation Function 2
  1. var fromStorage = localStorage["nodesCache"];
  2. if (fromStorage != null) {
  3.     var cachedNodes = JSON.parse(localStorage["nodesCache"]);
  4.     var timeStamp = localStorage["nodesCachedAt"];
  5.     if (cachedNodes && timeStamp) {
  6.         //Check for cache expiration. Currently set to 3 hrs.
  7.         var now = new Date();
  8.         var diff = now.getTime() - timeStamp;
  9.         if (Math.round(diff / (1000 * 60 * 60)) < 3) {
 10.             //return from cache.
 11.             var cacheResults = [];
 12.             $.each(cachedNodes, function (i, item) {
 13.                 var nodeitem = getNavigationFromCache(item, true);
 14.                 cacheResults.push(nodeitem);
 15.             });
 16.             self.buildHierarchy(cacheResults);
 17.             self.toggleView();
 18.             addEventsToElements();
 19.             return;
 20.         }
 21.     }
 22. }
 23. //No cache hit, REST call required.
 24. self.queryRemoteInterface();

Figure 8

The final issue was noticed by one of my colleagues, Tal Peer. Our development environment was a basic site collection with some team sites; however, when this was deployed to the production site collections, Tal noticed the top navigation was returning additional items. It turns out that the search query also returns the app webs which are created when apps are added to a site. These webs are generally hidden from the end user, so they shouldn't appear in the top navigation. Tal was able to exclude them by adding “-WebTemplate:APP” to the search query; the full query can be seen in figure 9.
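Figure 9 was a screenshot of the full query, so I can only sketch it here. The path and contentclass parts below follow the pattern of the article's web query and are illustrative; the important addition is the trailing “-WebTemplate:APP” exclusion.

path:"https://tenant.sharepoint.com/sites/Intranet" contentclass:STS_Web -WebTemplate:APP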

Hopefully this helps anyone else trying to implement this functionality. Loading the navigation from the search index is a very good idea and the basis of the code is there, but I found these additional steps/issues very frustrating.

Happy SharePointing 🙂


SharePoint error The URL ‘Pages/default.aspx’ is invalid

November 28, 2013

On a recent project I was working on an Intranet which had the publishing infrastructure feature enabled. I was setting up some pages and configuring web parts when I noticed an error on saving or checking in a page, see figure 1 below.

The URL ‘Pages/default.aspx’ is invalid. It may refer to a nonexistent file or folder, or refer to a valid file or folder that is not in the current Web.

Figure 1

As with most errors in SharePoint, I cracked open the ULS logs. It's slightly more difficult to find relevant data in the ULS logs without a correlation ID, but after searching for the error text I was able to locate the ULS details for the page request. Unfortunately there were no further details about what was causing the error.
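For anyone doing the same, this is the kind of ULS trawl I mean; a rough sketch, with the time window purely illustrative:

# Pull recent ULS entries matching the error text when there is no correlation ID
Get-SPLogEvent -StartTime (Get-Date).AddMinutes(-30) |
    Where-Object { $_.Message -like "*Pages/default.aspx*" } |
    Select-Object Timestamp, Area, Category, Message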

I then turned to Google and had a look for the error message. I found quite a few articles, but pretty much all of them related to database issues. Most talked about space on the database server or databases being read-only; however, none of this was relevant to me.

At this point I couldn't find any details in the ULS logs or online, so I decided to start stripping back my solution component by component to see at which point the error occurred. As the site was still in development there were a lot of steps to replicate. I started by spinning up a new web application and then applying the changes one by one. After each change I created a new page and edited an existing page to test whether the site was broken.

In the end it turned out the issue was a PowerShell script which was being run to create some site columns. This had been created to quickly provision site columns which are commonly used across most projects.

Once I had identified the source of the issue I then had to figure out whether a particular site column was causing it. I started commenting out sections of the script and was able to narrow it down to a single column. The problematic site column was a calculated column, and as soon as I commented this out and deleted the column the site started to work again. I'm not sure why it was causing an issue, but as it can easily be created manually I removed it from my script.
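For context, the script provisioned site columns from field XML along these lines. This is a minimal sketch only; the field name and formula are illustrative, not the actual column that broke.

# Provision a calculated site column from CAML field XML (illustrative field only)
$web = Get-SPWeb "http://intranet"
$fieldXml = @'
<Field Type="Calculated" Name="ProjectLabel" DisplayName="Project Label" ResultType="Text">
  <Formula>=Title</Formula>
  <FieldRefs><FieldRef Name="Title" /></FieldRefs>
</Field>
'@
$web.Fields.AddFieldAsXml($fieldXml) | Out-Null
$web.Dispose()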

Hopefully this helps others, as the articles I found online all pointed to database issues; keep in mind it could be a corrupted site column as well.

Happy SharePointing 🙂


Content by search web part server side version displays dates out by an hour

December 20, 2012

Since the release of SharePoint 2013 I have been lucky enough to use it on an actual project rather than just playing around with it, and I must admit there are several things which can be very frustrating. I came across a good example of one earlier today when doing some testing.

The project I am working on uses search to display news articles, but since most of the search web parts use JavaScript we also used the content by search web part set to run server side. Everything seemed to be working as expected until I noticed the dates on the JavaScript version, figure 1, and the non-JavaScript version, figure 2, were different.

[Screenshot: date rendered by the client-side (JavaScript) web part]

Figure 1

[Screenshot: date rendered by the server-side web part]

Figure 2

The data being displayed is held in a list behind the scenes, and the field in question is a date only field. I double checked the date through the UI, figure 3, and I also wrote a PowerShell script, figures 4 & 5 (a sketch of the script is included after the figures), and this confirmed the data was correct.

[Screenshot: the date as shown in the UI]

Figure 3

[Screenshot: PowerShell script to retrieve the date]

Figure 4

[Screenshot: the date returned via PowerShell]

Figure 5

It just seemed like the server side version always removed an hour from the actual time, so 20 June 2012 00:00:00 became 19 June 2012 23:00:00. I had a look around but couldn't see why it might be doing this, and a Google search didn't turn up anything either.

In the implemented solution the list items being crawled by the search service were added via code, which allowed me to update the date field to set it to 9am on the morning of the date entered. Even though the field type is date only, SharePoint still stores the value in the database with a time component, so I could safely update this without fear of overwriting users' content. This meant that when the server side version removed an hour, the value returned was always 8am on the morning of the date entered, thus solving my problem.
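As a sketch of the change (again with illustrative list and field names), the provisioning code effectively did the equivalent of this:

# Pin the time portion of the date-only field to 09:00 so the one-hour
# shift on the server side can never roll the displayed date back a day
$web = Get-SPWeb "http://intranet"
$list = $web.Lists["News"]
foreach ($item in $list.Items) {
    $item["ArticleDate"] = ([datetime]$item["ArticleDate"]).Date.AddHours(9)
    $item.Update()
}
$web.Dispose()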

I fully recognise there will be others who don't have control over the content in the same manner, so this solution will only work in certain circumstances. I would be very interested to hear if anyone else has come across this and, if so, how they managed to resolve it.

