Configuring OOTB People web part using PnP Core Online library

August 6, 2018

Following on from my previous post, "Find properties for OOTB modern web parts when adding them to the page using PnP Core Online library", I was working on adding the OOTB People web part to a modern page.

I followed the steps documented in the previous post and was able to work out that the required web part JSON was as below:

PersonWebPartProperties

As this web part supports configuring multiple people, it expects, as you would imagine, an array of users. This is slightly different from most other web parts, whose properties are usually simple data types like strings, integers or booleans.
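For reference, the persons value the web part stores is a JSON array of person objects. The field names below match the C# sample further down; the actual values are made-up placeholders, and the claims-style login format is an assumption based on typical SharePoint Online login names:

```json
[
  {
    "id": "i:0#.f|membership|jane@contoso.com",
    "upn": "jane@contoso.com",
    "role": "",
    "department": "",
    "phone": "",
    "sip": ""
  }
]
```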

At first I tried creating an object and then passing a JSON string to this property, but that didn't work as it expects an actual JSON array. When debugging through the code I noticed the property expects a JToken object, see below:

PersonPropertyType

After a little trial and error I eventually got this working by creating a generic list of JObject items, then creating a new dynamic JObject and setting all the required person details from the web part property data above. See the full code example below:

Code Sample

var loginName = userLogin.LoginName;

//Create list of people to add
List<JObject> persons = new List<JObject>();

//Create new person object
dynamic person = new JObject();
person.id = loginName;
person.upn = loginName.Substring(loginName.LastIndexOf("|") + 1);
person.role = "";
person.department = "";
person.phone = "";
person.sip = "";

//add person to list
persons.Add(person);

//Create the JToken required by the web part from the list above
var token = Newtonsoft.Json.Linq.JToken.FromObject(persons);
//set web part property
peopleWebPart.Properties["persons"] = token;

//add the web part to the page
page.AddControl(peopleWebPart, secondColumn, 2);

 

I hope this helps if someone else is struggling with the same issue.


Find properties for OOTB modern web parts when adding them to the page using PnP Core Online library

August 6, 2018

With the uptake of the SharePoint modern experience I've found myself getting to understand and know the new ways of working with Yeoman, Gulp, VS Code, etc. At times this has been a painful experience, but on the whole it has been an interesting shift away from the classic C# server-side development world I'm used to.

That said, I'm currently working on a project which requires C# code and it's been a funny switch back into that frame of mind. My task has been to provision some modern pages and add various different web parts to them.

When looking at this I decided to use SharePointPnPCoreOnline to help speed up the development; it's a great library that saves you writing a lot of code, and there are some good guides and articles out there on how to interact with modern pages.

Most of these cover creating pages, updating pages and adding web parts in detail, but some of the OOTB web parts need specific properties set in order to work as expected. This article details the steps I followed to work out what you need to set in each case; the process should work for any modern web part.

My first test was adding the OOTB Image web part, which is actually covered in the MS documentation. However, when I tried it I noticed that although the web part was added, it wasn't correctly configured.

Sample Code Snippet

var imageWebPart = page.InstantiateDefaultWebPart(DefaultClientSideWebParts.Image);

imageWebPart.Properties["imageSourceType"] = 2;
imageWebPart.Properties["siteId"] = imageFile.SiteId;
imageWebPart.Properties["webId"] = imageFile.WebId;
imageWebPart.Properties["listId"] = imageFile.ListId;
imageWebPart.Properties["uniqueId"] = imageFile.UniqueId;
page.AddControl(imageWebPart, firstColumn, controlIndexPosition);

The result of the code above was that a new Image web part was added, but it wasn't showing the image, see below:

IncorrectlyConfiguredImage

From one of the previous PnP calls I was aware you can add a special query string parameter to get information about the page and web part configuration, so I navigated to my new page, added ?maintenancemode=true to the URL and refreshed the page. This puts the page and web parts into a debug mode where you can see all the configuration data, see below:

MaintenanceMode

I found the added Image web part instance and clicked on the "Data" tab, which shows all the web part specific properties, see below:

ImageWebPartProperties

I then added an Image web part manually and configured it through the UI. Once added, I saved and closed the page and re-added the query string parameter, see below:

ImageWebPartPropertiesWorking

Comparing the two, I was able to see that in the "serverProcessedContent" section of the JSON there is an "imageSource" property. At first I tried to set this on the "serverProcessedContent" web part object, but that is read only. In the end I simply added "imageSource" to the properties directly, see below, and this worked.

Working Code Sample

var imageWebPart = page.InstantiateDefaultWebPart(DefaultClientSideWebParts.Image);

imageWebPart.Properties["imageSourceType"] = 2;
imageWebPart.Properties["imageSource"] = image.Url;
imageWebPart.Properties["siteId"] = imageFile.SiteId;
imageWebPart.Properties["webId"] = imageFile.WebId;
imageWebPart.Properties["listId"] = imageFile.ListId;
imageWebPart.Properties["uniqueId"] = imageFile.UniqueId;
imageWebPart.Properties["linkUrl"] = image.LinkUrl;
imageWebPart.Properties["captionText"] = image.CaptionText;
imageWebPart.Properties["overlayText"] = image.TextOverlay;
imageWebPart.Properties["altText"] = image.AltText;

page.AddControl(imageWebPart, firstColumn, controlIndexPosition);

This example was specifically focused on the Image web part; however, it's the same process for any of the OOTB or custom web parts, and I have followed the same steps for the OOTB File Viewer and People web parts.

Hope this helps.


Debugging SPFx web parts

November 8, 2017

I have been spending some time recently working with SPFx web parts and extensions, doing some training to get up to speed but also as part of my most recent project. It has taken some time, but I think I'm finally starting to get used to the various components and technologies involved.

One of the most basic things I found frustrating was debugging the code: previously I was either adding breakpoints in developer tools or adding 'debugger' statements in the actual code. The debugger statements worked well, but it meant every time I wanted to debug I had to add the debugger line and wait while the changes were saved and compiled.

Today I was working on a complicated project and got tired of adding debugger statements, so after a quick Google I found an article on sp-dev-docs. My project was created using a previous version of the Yeoman template, so I had to manually create the launch.json file and copy the sample settings in from the article.

When I first debugged after following the article I was able to debug the root web part .ts file, see figure 1, but I wasn't able to debug any of the other components, see figure 2.

LoadedBreakpoint

Figure 1 (Debug point loaded)

NotLoaded

Figure 2 (Debug point not loaded)

In my project I had the web part file, but I also had a sub folder, see figure 3, for other nested components, and it was in these sub components that I wasn't able to debug.

ComponentStructure

Figure 3 (Nested component structure)

After reviewing the settings and testing, I was getting a message saying there was no source loaded. Looking at the launch.json file, the most obvious settings were the 'sourceMapPathOverrides' properties, see figure 4. I haven't done much with the new web stack technologies, but I figured the paths reflected the folder structure in the project, so I added additional paths to reflect the nested component structure, see figure 5.

defaultDebuggingSettings

Figure 4 (Default debugging settings)

updatedDebuggingSettings

Figure 5 (Updated debugging settings)
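For reference, the override section ends up looking roughly like the sketch below. The exact webpack path prefixes are assumptions that vary by toolchain version and folder depth, so compare them against the paths shown in your browser's source tree; the pattern is simply one entry per nesting level:

```json
{
  "sourceMapPathOverrides": {
    "webpack:///../../../src/*": "${webRoot}/src/*",
    "webpack:///../../../../src/*": "${webRoot}/src/*",
    "webpack:///../../../../../src/*": "${webRoot}/src/*"
  }
}
```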

After I changed these I was able to debug the lower level components. With the ever changing technology stack it's important to try and keep up to date with changes and methodologies; the problem is finding the required information.


SharePoint Logging Database Size on Development Environment

December 21, 2016

I have noticed on a few different development environments that the logging database can grow significantly and on a development machine that is limited with resource this can cause some issues.

There are lots of articles out there which provide step by step guides, and on the whole these are very useful; I have included a few below:

I was reviewing some of the articles above, but I noticed that even after I followed the process to establish which database tables were taking up the space and reduced the logging detail, my database still wouldn't shrink to a sensible size. I checked the settings and ran the two timer jobs time and again, but it didn't make any difference.

I tried changing the retention to lower and lower values, eventually setting it to 1 day, but regardless it never changed the database size. When reviewing all the settings by running the Get-SPUsageDefinition command, I noticed the item I was updating the retention period for was actually disabled. I ran the command to enable the category, see below:

Set-SPUsageDefinition -Identity GUID/NAME -Enable
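A quick way to spot this situation up front is to list which usage definitions are currently disabled along with their retention. This is a sketch, assuming the SharePoint PowerShell snap-in is loaded; it only reads settings, so it is safe to run:

```powershell
# List usage definitions that are disabled but may still hold old data
Get-SPUsageDefinition |
    Where-Object { -not $_.Enabled } |
    Select-Object Name, Enabled, DaysRetained
```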

After this I ran the two usage timer jobs (although I'm guessing you probably only need to run the processing one), and this time when I chose to shrink the database in SQL there were GBs of free space.

So it looks like when SharePoint does the processing for the categories it will skip any that are disabled. This does make sense, but it is important to consider: if you have a lot of old data and disable a particular category for whatever reason, the data will stay in the logging database.

Another day and another SharePoint quirk found.


How to add new properties to the workflow service configuration

February 5, 2015

While investigating an issue I encountered when running a SharePoint 2013 workflow (see blog post for details), some of the suggested fixes I found involved adding a new property to the workflow configuration settings.

I had a look at the MSDN article on how to do this, but the steps seem to indicate you can only set existing values, not add new ones; see figure 1 for a snippet from the MSDN article:

 “For Name, specify a name of the configuration setting (See below for the list of valid settings.)”.

Figure 1

I had a look around for how to add new settings, but I wasn't able to find anything except for people adding rows directly into the "WorkflowServiceConfig" table in the "WFResourceManagementDB" database. Anyone who has used SharePoint knows that touching any SharePoint database directly is generally not supported and puts the farm into an unsupported state.

As it turns out, I was able to resolve my issue by updating an existing property, so I didn't need to do this in the end. After I had worked out which property I needed to update, via the PowerShell command supplied in the MSDN article, I thought it would be a good idea to create a wrapper function in a common PowerShell file I keep. I have several helper methods in this file and use it to avoid having to remember specific PowerShell commands.

I started to implement this helper function and was adding some error handling to see what would happen when a property name was supplied that didn't exist. I ran the helper method with an invalid name and it completed without any errors. I was expecting some kind of error message, so this confused me slightly. I changed the name and debugged through the PowerShell (I use PowerGUI when writing scripts, and it has this feature). When the script called the "Set-WFServiceConfiguration" function it completed with no errors.

I opened up the workflow table where the settings are stored and to my surprise the two test values I had tried were in the table.

I thought it was a bit strange that Microsoft would allow you to update properties but not add them; however, the wording on the MSDN article does clearly suggest the name has to be one of the existing values, and in all the articles I found people were adding rows directly into the database table.

I have included the function I wrote below, see figure 2, in case anyone finds it useful, but as always with PowerShell, try this in a non-production environment first.

Function to add property
function Set-SP2013WorkflowSetting{
   [cmdletbinding()]
   Param(    
    [parameter(Mandatory=$true)][string]$workflowManagerURL,
    [parameter(Mandatory=$true)][string]$propertyName,    
    [parameter(Mandatory=$true)][string]$propertyValue)    
    
    Write-Host "Starting to update workflow manager property $propertyName" -ForegroundColor Green
    
    $existingValue = Get-WFServiceConfiguration -serviceUri $workflowManagerURL -Name $propertyName
    
    if($existingValue -eq $null)
    {
        Write-Host "Property $propertyName doesn't have a value which means it isn't in the settings DB table. Please confirm you want to add a new setting" -ForegroundColor Magenta
        $title = "Add Workflow Setting"
        $message = "Do you want to add a new workflow setting?"

        $yes = New-Object System.Management.Automation.Host.ChoiceDescription "&Yes", `
           "Add a new row into the settings table."

        $no = New-Object System.Management.Automation.Host.ChoiceDescription "&No", `
   "Doesn't add the new setting and exits the function."

        $options = [System.Management.Automation.Host.ChoiceDescription[]]($yes, $no)

        $result = $host.ui.PromptForChoice($title, $message, $options, 1)

        switch ($result)
        {
            0 {}
            1 {return}
        }
    }
    else
    {
        Write-Host "Property $propertyName has a value of $existingValue it will be changed to $propertyValue" -ForegroundColor Green
    }
    
    Set-WFServiceConfiguration -serviceUri $workflowManagerURL -Name $propertyName -Value $propertyValue
    
    Write-Host "Finished updating workflow manager property $propertyName" -ForegroundColor Green    
}

Figure 2
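Calling the function then looks like the example below. The service URI is a made-up placeholder; the host name and port will depend on your Workflow Manager installation:

```powershell
Set-SP2013WorkflowSetting -workflowManagerURL "http://workflowmanager.contoso.local:12291" `
    -propertyName "WorkflowServiceMaxInstanceSizeKB" `
    -propertyValue "30720"
```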

Updates needed to get “Navigation options for SharePoint Online” MSDN article to work

January 20, 2015

My company was working on a project with a SharePoint Online implementation. While developing the site it was noticed that the performance wasn’t up to the required level.

My colleagues and I did some investigation, and the results seemed to indicate the slow load times were down to the response coming from the server.

We all tried different approaches to track down which elements were causing the poor performance, changing the master pages and adding/removing components.

The main issue seemed to be related to the navigation options on the site. Our initial setup used structural navigation, as this is easy to use and is security trimmed.

When the top navigation was removed we noticed significantly better page load times. Once it was identified that the top navigation was having an impact, it was changed from structural to managed metadata, and while this did help, managed navigation doesn't security trim the navigation and requires the term set to be updated when new sites are added.

After much searching I finally found an MSDN article which backed up the top navigation potentially being a performance problem. The article suggested managed metadata as an alternative, but we had already ruled this out. It also provided a third approach which I had not previously considered: a search driven approach.

After reading through this I thought it sounded like a good solution. The only downsides were the limited ability to order the top navigation and the fact it renders the navigation using JavaScript, which obviously puts additional load on the client.

I decided to implement this approach and then compare the page load times with the structural and search driven navigation to see if there was any significant difference. Reading the article I thought this would be a 5 minute change, but in the end it took me half the morning. Hopefully this blog will provide a more detailed set of steps for anyone looking to implement this approach.

Steps 1 to 4 are simply supplying instructions on how to download a copy of one of the existing master page files. Steps 5 and 6 provide details on how to update the downloaded master page and re-upload it to SharePoint under a different name.

All these steps so far are fine; however, it was after this that I started noticing some issues. The first issue is that nowhere in the article does it actually tell you to add the JavaScript anywhere on the site. This may seem obvious, but for completeness I wanted to highlight it. In my case I copied the JavaScript, made the change to the 'root' variable at the top of the file and then uploaded it to a folder in the Style Library, which can be found at the root of the site.

With the JavaScript file in place, I then added a reference to it in the head section of the custom master page, see figure 1:

<script type="text/javascript" src="/sites/Intranet/Style%20Library/CN.Intranet/js/SearchTopNavigation.js"></script>

Figure 1

Steps 7 and 8 detail the change needed to update the root site URL and also describe in detail how the JavaScript works. This is where another of the additional steps is required. As part of step 7 it says "This data is then parsed into the previously defined SPO.Models.NavigationNode objects which use Knockout.js to create observable properties for use by data binding the values into the HTML that we defined earlier". The key here is that the code uses Knockout.js, so I needed to add a reference to this in my master page as well. To do this you can either download a copy or point the reference at a CDN. I always download a copy, as you are then in control of the version and the availability of the content. I downloaded version 3 from the Knockout website, uploaded it to the same folder in the Style Library and added another reference in my custom master page, see below:

<script type="text/javascript" src="/sites/Intranet/Style%20Library/CN.Intranet/js/JQ/knockout-3.0.0.js"></script>

Also in step 7 there is another line which is easy to skip over and which requires another JavaScript reference: "Next, the results are assigned to the self.nodes array and a hierarchy is built out of the objects using linq.js assigning the output to an array self.heirarchy.". The key here is that it uses linq.js to create the object hierarchy so it can build the navigation. I had a Google for this and found a CodePlex site. It looked like the correct plugin, so I downloaded the solution, took the minified version 'linq.min.js' and uploaded this into the same folder in the Style Library. I also added another reference to my custom master page, see figure 3:

<script type="text/javascript" src="/sites/CHASIntranet/Style%20Library/CN.Intranet/js/JQ/linq.min.js"></script>

Figure 3

At this point I thought that would be all the changes I needed; however, when I loaded the site I started getting a JavaScript error, see figure 4:

"object doesn't support property or method 'byhierarchy'"

Figure 4

I then spent a while debugging through the code to see if I could identify what was happening, and it seemed to be the 'buildHierarchy' function where it was calling 'ByHierarchy' on the collection of navigation items. I checked, and there were items being passed to the function.

I looked around and tried the jQuery LINQ version which comes as part of the download, jquery.linq.min.js, but this didn't work either. I then had a Google for the error and found the source control where the "ByHierarchy" functionality was added to linq.js. The download provides a Visual Studio project, so I opened it and searched for "ByHierarchy"; I found references in "jquery.linq.js" and "linq.js". It took me a minute to click: I was using the minified versions to increase performance, but it looks like the new function was never added to the minified version. I double checked this by opening the minified version I had uploaded to SharePoint and searching for "ByHierarchy", but nothing was returned.

I uploaded "linq.js" to the Style Library, updated the reference in my master page and tested the site again. While the JavaScript error was gone, the navigation was not displaying; all I could see was the navigation div being rendered with no content, see figure 5.

Generated Top Nav
<div style="" id="navContainer">
    <div class="noindex ms-core-listMenu-horizontalBox" data-bind="foreach: hierarchy">
        <ul style="padding-left:20px" data-bind="foreach: $data.children" id="menu"></ul>
    </div>
</div>

Figure 5

I checked and there were no JavaScript errors in the console in IE, so I started debugging through the code. I could see items were being returned by the query and NavigationNode objects were getting created, but I couldn't see what was missing. In the end I got a colleague of mine to double check, and he noticed that when building up the NavigationNode objects in the "getNavigationFromDto" function, the Parent property was getting set to the content type ID, not the parent site URL.

After checking the values available, the parent site URL seemed to be stored at index 20, so I updated my version of the code, see figure 6:

Updated JavaScript Function
//Parses a local object from a JSON search result.
function getNavigationFromDto(dto) {
    var item = new SPO.Models.NavigationNode();
    if (dto != undefined) {
        item.Title(dto.Cells.results[3].Value);
        item.Url(dto.Cells.results[6].Value);
        item.Parent(dto.Cells.results[20].Value);
    }
    return item;
}

Figure 6

When I then tested the site the navigation started working. I was very relieved at this point, but I wanted to make sure everything was functioning as expected, so I attached a breakpoint in IE and started to step through the code, and this is where I found my final error. In the sample supplied in the MSDN article, the "NavigationViewModel" function is supposed to get the data from either the localStorage cache or, failing that, execute the search API query; however, in the sample code the check implemented is always false, so it never retrieves from local storage, see line 2 in figure 7 below.

Retrieve Navigation Function
  1. var fromStorage = localStorage["nodesCache"];
  2. if (false) {
  3.     var cachedNodes = JSON.parse(localStorage["nodesCache"]);
  4.     var timeStamp = localStorage["nodesCachedAt"];
  5.     if (cachedNodes && timeStamp) {
  6.         //Check for cache expiration. Currently set to 3 hrs.
  7.         var now = new Date();
  8.         var diff = now.getTime() - timeStamp;
  9.         if (Math.round(diff / (1000 * 60 * 60)) < 3) {
 10.             //return from cache.
 11.             var cacheResults = [];
 12.             $.each(cachedNodes, function (i, item) {
 13.                 var nodeitem = getNavigationFromCache(item, true);
 14.                 cacheResults.push(nodeitem);
 15.             });
 16.             self.buildHierarchy(cacheResults);
 17.             self.toggleView();
 18.             addEventsToElements();
 19.             return;
 20.         }
 21.     }
 22. }
 23. //No cache hit, REST call required.
 24. self.queryRemoteInterface();

Figure 7

I updated this test to check the value of the fromStorage variable, see figure 8, did some testing, and the code started pulling the results from my local storage instead of running the query every time.

Retrieve Navigation Function 2
  1. var fromStorage = localStorage["nodesCache"];
  2. if (fromStorage != null) {
  3.     var cachedNodes = JSON.parse(localStorage["nodesCache"]);
  4.     var timeStamp = localStorage["nodesCachedAt"];
  5.     if (cachedNodes && timeStamp) {
  6.         //Check for cache expiration. Currently set to 3 hrs.
  7.         var now = new Date();
  8.         var diff = now.getTime() - timeStamp;
  9.         if (Math.round(diff / (1000 * 60 * 60)) < 3) {
 10.             //return from cache.
 11.             var cacheResults = [];
 12.             $.each(cachedNodes, function (i, item) {
 13.                 var nodeitem = getNavigationFromCache(item, true);
 14.                 cacheResults.push(nodeitem);
 15.             });
 16.             self.buildHierarchy(cacheResults);
 17.             self.toggleView();
 18.             addEventsToElements();
 19.             return;
 20.         }
 21.     }
 22. }
 23. //No cache hit, REST call required.
 24. self.queryRemoteInterface();

Figure 8
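The expiry arithmetic in the cache check is easy to get wrong, so it can help to see it isolated as a small standalone helper (isCacheFresh is a hypothetical name, not from the original sample):

```javascript
// Returns true when the cached timestamp is less than maxHours old.
// Mirrors the expiry check in the sample above: the difference in
// milliseconds is converted to hours and rounded before comparing.
function isCacheFresh(cachedAtMs, nowMs, maxHours) {
  var diffMs = nowMs - cachedAtMs;
  return Math.round(diffMs / (1000 * 60 * 60)) < maxHours;
}
```

Note that because of the rounding, a cache entry becomes stale slightly before the full limit: with a 3 hour limit, anything 2.5 hours old already rounds up to 3 and triggers a refresh.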

The final issue was noticed by one of my colleagues, Tal Peer. Our development environment was a basic site collection with some team sites; however, when this was deployed to the production site collections, Tal noticed the top navigation was returning additional items. It turns out that the search query will also return the app webs which are created as part of adding apps to the site. These webs are generally hidden from the end user, so they shouldn't appear in the top navigation. Tal was able to exclude them by adding an additional part, "-WebTemplate:APP", to the search query; the full query can be seen in figure 9.
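Purely as an illustrative sketch of where the exclusion sits (the web filter shown is the kind of clause the original MSDN sample queries on; treat the exact query text as an assumption and refer to your own query), the shape is:

```
contentclass:STS_Web -WebTemplate:APP
```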

Hopefully this helps anyone else who is trying to implement this functionality. It's a very good idea to load the navigation from the search index, and the basis of the code is there; I just found these additional steps/issues very frustrating.

Happy SharePointing 🙂


SharePoint 2013 Workflow Instance Size Error

January 6, 2015

While working on a recent project I got the chance to setup and use SharePoint 2013 workflows. This is a major step forward for the product and has a lot of benefits, however it does introduce a few complications.

As the workflows are now all stored and processed outside of SharePoint, all communication is done via WCF services. This is great as it extracts the processing out of SharePoint and thus reduces the load on it.

The downside is that this introduces additional areas where something can go wrong, which leads me to the topic of this blog post.

The workflow I was working on handled the approval of data on a SharePoint list item. In order for the workflow to move on to the next stage, it required 5 groups of users to approve the data; the workflow had to wait until all 5 groups had approved, and once they had it would move on to the next stage and send some emails.

The UI handling the approval locked down the form so only users in certain groups could approve each section, so in most cases a user would come in and approve 1 of the 5 sections, and this worked fine. There was also an admin group who could approve all 5 sections at one time, and it was in this case that an error was being thrown by the workflow, causing it to be terminated, see figure 1:

image

Figure 1

When clicking on the workflow I could see the internal status was ‘Terminated’ and when clicking on the information icon I could see the error was “System.Activities.Statements.WorkflowTerminatedException: The workflow instance was too large to persist (over 5120 kilobytes). The maximum persistable workflow instance size is 5120 kilobytes.”, see figure 2.

image

Figure 2

For SharePoint this is actually a reasonably detailed error, as it supplies a message explaining what the issue is instead of the usual "something went wrong". As this was my first real experience of SharePoint 2013 workflows, I didn't really know where to start debugging, so I had a Google around.

There are a few articles out there but nothing which matched what I was looking for. The closest I came was an article which suggested changing a server configuration property called ‘WorkflowServiceMaxActivityXamlSizeInBytes’.

I had a look into this and discovered the property can be updated using a PowerShell command called 'Set-WFServiceConfiguration', see the MSDN article. I tried increasing the value, but unfortunately it didn't seem to make any difference. I looked over the additional properties which can be set and found one called 'WorkflowServiceMaxInstanceSizeKB' which matched the error I was getting, both in its description and in the fact that its default value on the MSDN article matched the limit in the error message. I tried changing this using the PowerShell command, but again the workflow still failed with the same error.

Given that these properties are exposed, Microsoft obviously expected they might need to be changed in certain circumstances, and the 'WorkflowServiceMaxInstanceSizeKB' property matched the error I was encountering so closely that I did some more investigation. I found another property called 'WorkflowServiceMaxInstanceCompressedSizeKB' which also has the same default value of 5120.

Unfortunately all the articles referring to this had people adding a row directly into a table in one of the workflow databases. Anyone who knows SharePoint knows touching any SharePoint database is not supported, so I was reluctant to do this; however, since it was a development environment I tried it, but I still encountered the same error.

It wasn't until I came across an article describing a slightly different error that I noticed, down at the bottom of the article, that you have to restart the 'Workflow Manager Backend' Windows service in order for settings changes to be picked up, and after I did this the workflow started working.

I then had to do some testing to check which of the properties I had changed was the one which fixed the error. In my case it turns out I only needed the 'WorkflowServiceMaxInstanceSizeKB' property, which can be set via PowerShell, see figure 3 for the exact script, so I didn't have to worry about adding a value directly into the workflow database.

Set-WFServiceConfiguration -ServiceUri http://workflowmanagerurl:12291/ -Name "WorkflowServiceMaxInstanceSizeKB" -Value 30720

Figure 3
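The service restart can be done from an elevated PowerShell prompt. The display name below is what the Workflow Manager installer registers in my environment; if it differs in yours, find it first with Get-Service:

```powershell
# Restart the backend service so Workflow Manager picks up the new configuration values
Restart-Service -DisplayName "Workflow Manager Backend"
```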

Overall the process of changing the workflow settings is very easy using the available PowerShell functions; the key is to remember to restart the 'Workflow Manager Backend' Windows service so the updated values are picked up.

