Export User Profile Information Using PowerShell

April 5, 2013

While working on my most recent project I have spent a lot of time using the User Profile Service, and while it offers some real benefits, I have found it can be difficult to troubleshoot, especially when it comes to synchronizing with external sources such as Active Directory (AD).

After configuring the User Profile Service on the client's live environment I noticed that the email and manager fields were missing for some users. I wanted to get an idea of the extent of the issue, so I went to the Manage User Profiles screen in Central Administration (CA). While it is useful for finding individual users, it doesn't really help when viewing certain properties across all users.

Thinking about the options available, I knew from previous experience that I could write some C# code to get this information, but as I have been making an effort to brush up on my PowerShell (PS) skills I decided this would be a perfect opportunity to use them.

As with all PS scripts, it is essential these are first tested in a development environment before being run on live. I began by creating my PS script on my development environment using PowerGUI, which I think is an excellent tool that makes creating PS scripts much easier.

Before starting I worked out what the rough steps would be to get the user profile information, and I came up with the following:

  1. Get the site the user profile service is associated with
  2. Get the user profile service
  3. Get all user profiles
  4. Loop through all user profiles outputting the required information
  5. Collate steps and parameterise into a function

I then started working on each step and finally put it all together. Each of the steps is detailed below.

Step 1

From previous experience I knew the easiest way to get the user profile service was to get the service context for the site the user profile service is associated with. The first line gets the site, passing in a variable holding the site URL; next I check we actually have the site. I then get the service context associated with the site by calling Get-SPServiceContext, passing in the site.

    $site = Get-SPSite $SiteURL -ErrorAction SilentlyContinue
    if ($site -eq $null)
    {
        Write-Host "Unable to access site " $SiteURL " please ensure the URL is correct and you have access to this"
        return
    }

    $serviceContext = Get-SPServiceContext $site -ErrorAction SilentlyContinue
    if ($serviceContext -eq $null)
    {
        Write-Host "Unable to get the service context for site " $SiteURL
        return
    }

Step 2

Once I had the service context, the next step was to get the user profile manager, which is done by creating a new UserProfileManager instance, passing in the associated service context.

    $userProfileManager = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($serviceContext)
    if ($userProfileManager -eq $null)
    {
        Write-Host "User Profile Manager does NOT exist"
        return
    }

Step 3

With the instance of the user profile manager I then got all users by calling the GetEnumerator method.

    $users = $userProfileManager.GetEnumerator()

Step 4

In my situation there were hundreds of users, so I decided outputting the information to the PS console would be difficult to read, and I chose to output the details to a text file instead.

I looped through each user profile, outputting the user's preferred name. So that I could also get additional user profile property values, I looped through an array containing the names of the user profile properties I wanted, and output a string containing each property name and its value for the current user profile.

    foreach ($user in $users)
    {
        $userName = $user.Item("PreferredName")
        Write-Output "User Profile $userName" | Out-File $FileLocation -Append
        if ($UserProperties.Count -gt 0)
        {
            for ($i = 0; $i -lt $UserProperties.Count; $i++)
            {
                $UserProperty = $UserProperties[$i]
                if ($UserProperty -ne "" -and $UserProperty -ne $null)
                {
                    try
                    {
                        $userPropertyValue = $user.Item($UserProperty)
                        Write-Output "User Profile Property $UserProperty value is $userPropertyValue" | Out-File $FileLocation -Append
                    }
                    catch
                    {
                        Write-Host "Property $UserProperty doesn't exist"
                    }
                }
            }
        }
    }

Step 5

Putting all the above steps together, adding additional logging and variable checks, and ensuring all settings can be passed in, I ended up with the function below.

    function GetAllUserProfiles([string]$SiteURL, [string]$FileLocation, [array]$UserProperties)
    {
        # [string] parameters are never $null, so check for empty strings instead
        if ([string]::IsNullOrEmpty($SiteURL) -or [string]::IsNullOrEmpty($FileLocation))
        {
            Write-Host "Please ensure all parameters are supplied"
            return
        }

        Write-Host "Starting to get all user profiles"

        $site = Get-SPSite $SiteURL -ErrorAction SilentlyContinue
        if ($site -eq $null)
        {
            Write-Host "Unable to access site " $SiteURL " please ensure the URL is correct and you have access to this"
            return
        }

        Write-Host "Got site " $SiteURL

        $serviceContext = Get-SPServiceContext $site -ErrorAction SilentlyContinue
        if ($serviceContext -eq $null)
        {
            Write-Host "Unable to get the service context for site " $SiteURL
            return
        }

        $userProfileManager = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($serviceContext)
        if ($userProfileManager -eq $null)
        {
            Write-Host "User Profile Manager does NOT exist"
            return
        }

        Write-Host "Got User Profile Manager"

        Write-Host "There are " $userProfileManager.Count " user profiles, starting to loop through them. The results will be output to the specified text file"

        $users = $userProfileManager.GetEnumerator()
        foreach ($user in $users)
        {
            $userName = $user.Item("PreferredName")
            Write-Output "User Profile $userName" | Out-File $FileLocation -Append
            if ($UserProperties.Count -gt 0)
            {
                for ($i = 0; $i -lt $UserProperties.Count; $i++)
                {
                    $UserProperty = $UserProperties[$i]
                    if ($UserProperty -ne "" -and $UserProperty -ne $null)
                    {
                        try
                        {
                            $userPropertyValue = $user.Item($UserProperty)
                            Write-Output "User Profile Property $UserProperty value is $userPropertyValue" | Out-File $FileLocation -Append
                        }
                        catch
                        {
                            Write-Host "Property $UserProperty doesn't exist"
                        }
                    }
                }
            }
        }

        Write-Host "Function completed"
    }

Conclusion

I have already used this function several times without any issues and I have found it a very easy and flexible way to pull information from the user profile store.

I hope this is helpful for others too.


Find Items in External List Programmatically

May 15, 2012

While working on a recent project I had to connect to an external DB to pull in some data and then use this in other lists in a SharePoint site. The obvious solution for this was to use BCS and set up an external content type to get the data from my DB. I had never setup BCS before so decided to use SharePoint Designer to set it up as this seemed the easiest way. It took a little trial and error, especially around permissions, but eventually I managed to get my list up and running displaying data. In addition I added a filter to my external content type to allow users to search on the data. I won’t go over how to add a filter as there are numerous articles on the web about how to do this, see examples below.

http://msdn.microsoft.com/en-us/library/ff798274.aspx
http://www.lightningtools.com/blog/archive/2010/01/14/creating-comparison-and-wildcard-filters-for-bcs-in-sharepoint-designer.aspx

With the external list working I then created another list, and within this I created a column which was a lookup to my external list. SharePoint must realise this is a special case, as even though you add the column as a lookup to another list it changes the type to an External Data column. I was then able to add new items to my custom SharePoint list and associate them with data from my external DB.

This was all fine, but I then needed to migrate some of the customer's existing data from an Excel spreadsheet into SharePoint. Adding items into SharePoint is a fairly easy task and something I have done numerous times over the years, but the one new part was how I would query the external list and then set the value of the external data column in my custom list.

After some research I found a lot of people suggesting that you could simply use CAML in the normal manner; however, during my testing the external list always returned 0 items. I then found an MSDN article on Using the BDC Object Model, so I decided to try this.

I copied the code, see full code below, from the Using Filters section of this article and added a reference to the BCS assembly, Microsoft.BusinessData.dll, located in the ISAPI folder under the SharePoint root.

MSDN Example Code
    const string entityName = "Machines";
    const string systemName = "PartsManagement";
    const string nameSpace = "DataModels.ExternalData.PartsManagement";
    BdcService bdcService = SPFarm.Local.Services.GetValue<BdcService>();
    IMetadataCatalog catalog =
        bdcService.GetDatabaseBackedMetadataCatalog(SPServiceContext.Current);
    ILobSystemInstance lobSystemInstance =
        catalog.GetLobSystem(systemName).GetLobSystemInstances()[systemName];
    IEntity entity = catalog.GetEntity(nameSpace, entityName);
    IFilterCollection filters = entity.GetDefaultFinderFilters();

    if (!string.IsNullOrEmpty(modelNumber))
    {
        WildcardFilter filter = (WildcardFilter)filters[0];
        filter.Value = modelNumber;
    }

    IEntityInstanceEnumerator enumerator =
        entity.FindFiltered(filters, lobSystemInstance);

    entity.Catalog.Helper.CreateDataTable(enumerator);

My first issue was that I had no idea what the string variables at the top of the code should be set to. I tried debugging the code and was able to find out the system name, but it took some time. It turns out all the required details are in SharePoint Designer, see below.

  • Entity name is the name at the top of the external content type information
  • System name is the external system at the bottom of the external content type information
  • Namespace is the Namespace in the middle of the external content type information

[Screenshot: external content type details in SharePoint Designer]

Similar to the MSDN example code I had set up one filter, a wildcard filter, so I didn't have to change that section of the code; all I changed was setting the filter value to a value I was interested in. When I ran the code everything seemed to be working as expected, but I noticed that if I searched for 'Test123' and there was an item in the external list which matched exactly then it would return my result; however, if I changed the code to search for 'Test' it didn't find anything. I tried adding '*' in various places to act as a wildcard but it made no difference to the results.

I expected that since the filter was a wildcard filter I could search on only part of the phrase and it would return what I was interested in, but it wasn't working. I checked the filtering functionality by adding a new item in my custom list and using the out of the box BCS search, and there I could search for only part of a phrase such as 'Test' and it would find partial matches like 'Test123'. After confirming the out of the box functionality was working I rechecked my code, but it seemed to match the example provided above. I then checked and rechecked my settings in SharePoint Designer, but everything was as I would expect.

I then turned to a colleague, Ross MacKenzie, and we went through the code together, but even then we were unable to establish what was going wrong. He suggested that since I had tried '*' around my search criteria, why not try '%', as this is the wildcard in SQL. After spending a good couple of hours looking at it, it finally started working with the code below.

Correct Wildcard Syntax
    WildcardFilter companyReferenceFilter = (WildcardFilter)filters[0];
    companyReferenceFilter.Value = String.Format("%{0}%", siteRefCode.ToString());

Conclusion

I hope this helps others and stops them encountering the same issues as me; while the change to the example code was small, it made a massive impact, and from what I could see it wasn't documented very well.

Happy SharePointing!


CAML Query Join Lists

May 11, 2012

I was in a situation where I had a parent list with some details and then a separate list for child items. The child list had a lookup to the parent list to make the association between the two. I needed a query which allowed me to search on some fields in the parent list and some fields in the child list.

Approach

When I was thinking about this I came up with a few different ideas which are each discussed below:

  1. Multiple Caml queries. One option was to run a Caml query against the parent list to get the items which meet the parent criteria, and then run another Caml query looking for items in the child list which are associated with the items returned by the first query and also meet the child list filtering. Obviously this is not ideal, as it would involve a decent amount of code and would also be fairly inefficient.
  2. Linq to SharePoint. This would probably have been the easiest approach, as you can very quickly join two lists using Linq once you have your entity classes generated, but I decided against it. My reasoning was based on a solution design perspective: while Linq to SharePoint would prove beneficial in this situation, it would require adding and maintaining another element in the solution, thus increasing the complexity.
  3. Join Caml query. This was something I had never done before, but I was aware it was one of the new features in SharePoint 2010. I did some research and it seemed like a fairly straightforward approach, so I decided to go with this.

Method

There are various articles out there on how to join lists via a Caml query, so I won't go into a great deal of detail, only the points I found interesting.

The first thing to note is that the Caml query is created as you normally would, but it needs to be run against the child list, not the parent list.

In order to map the lists together you use the Joins property of the SPQuery object. As I mentioned there is a lot of information out there for more complicated joins but in my situation it was simply joining parent and child lists together, see example below.

Join Lists
    StringBuilder joinDetails = new StringBuilder();
    joinDetails.Append("<Join Type='INNER' ListAlias='ParentListName'><Eq><FieldRef Name='ChildListLookupColumnName' RefType='Id'/><FieldRef List='ParentListName' Name='ID'/></Eq></Join>");

With the join created, the next step is to set up the projected fields: these are the fields from the parent list which you want to display or filter against in the where clause, see below. The Name element is any name you want to give the field, and this is what you will use in the view fields and query sections of the SPQuery object. As far as I can see the Type is always Lookup. The ShowField element is the name of the field in the parent list you want to map to this new field.

Projected Fields
    StringBuilder projectedFields = new StringBuilder();
    projectedFields.Append("<Field Name='ParentListField1' Type='Lookup' List='ParentListName' ShowField='Field1'/>");
    projectedFields.Append("<Field Name='ParentListField2' Type='Lookup' List='ParentListName' ShowField='Field2'/>");
    projectedFields.Append("<Field Name='ParentListField3' Type='Lookup' List='ParentListName' ShowField='Field3'/>");
    projectedFields.Append("<Field Name='ParentListField4' Type='Lookup' List='ParentListName' ShowField='Field4'/>");
    projectedFields.Append("<Field Name='ParentListField5' Type='Lookup' List='ParentListName' ShowField='Field5'/>");
    projectedFields.Append("<Field Name='ParentListField6' Type='Lookup' List='ParentListName' ShowField='Field6'/>");

You must ensure all projected fields are listed in the ViewFields property of the SPQuery. Anyone who has written basic Caml queries will have seen this before; the only consideration is that when using fields from the parent list you have to use the name set in the projected fields, not the name of the column in the parent list, see below.

View Fields
    StringBuilder viewFields = new StringBuilder();
    viewFields.Append("<FieldRef Name='ParentListField1'/><FieldRef Name='ParentListField2'/><FieldRef Name='ParentListField3'/>");
    viewFields.Append("<FieldRef Name='ParentListField4'/><FieldRef Name='ParentListField5'/><FieldRef Name='ParentListField6'/>");

With the join between the lists done and the fields in the parent list mapped, you can then write your Caml query as normal and filter against details in the parent list; see below for a simple example.

Example Caml Query
    sb.Append("<Where><IsNull><FieldRef Name='ParentListField1' /></IsNull></Where>");

The final element is to associate all these with your SPQuery object and then pass this to the list GetItems method.

Associated with SPQuery
    var query = new SPQuery();
    query.Joins = joinDetails.ToString();
    query.ProjectedFields = projectedFields.ToString();
    query.ViewFields = viewFields.ToString();
    query.Query = sb.ToString();

Getting to this point did take some tweaking of the various Caml query properties, which was slightly frustrating, but I don't think this is particularly related to joins; it is more of a general issue with Caml.

Issues

There was one issue which does seem particularly related to joins, and that was when trying to get a DataTable of the results instead of an SPListItemCollection, see example below. This throws a NullReferenceException, see stack trace below, and it seems the only way around it is to work with the SPListItemCollection and not a DataTable.

Get DataTable
    var queryResults = list.GetItems(query).GetDataTable();

Stack Trace

The error was:

    System.NullReferenceException: Object reference not set to an instance of an object.
       at Microsoft.SharePoint.SPFieldMap.EnsureFieldArray()
       at Microsoft.SharePoint.SPFieldMap.GetFieldObject(Int32 columnNumber)
       at Microsoft.SharePoint.SPListItemCollection.GetVisibleFieldIndices(Boolean isJsGrid, Int32[]& arrVisibleFieldIndices, Int32& iVisibleFieldCount)
       at Microsoft.SharePoint.SPListItemCollection.GetDataTableCore(Boolean isJsGrid)

Conclusion

I think the join functionality in SharePoint 2010 is a useful feature; however, given the fiddly nature of it, along with how easy it is to accomplish the same thing using Linq to SharePoint, I suspect a lot of people won't use this approach. In my circumstances it works very well and, with a few extra lines of code compared to a normal Caml query, gives me a lot of extra functionality.


Incorrect date format when using SharePoint DateTimeControl in application page launched via ModalDialog

May 2, 2012

I was working on a project where I needed to create a custom UI for a customer in order to do some complex logic and form formatting. The form was developed as an application page which was deployed into the 14 hive and presented to the user as a popup using the JavaScript modal dialog functionality. The modal dialog was being launched from a custom web part which was in a subsite below the main site.

As part of the form I needed to prompt the user with a field which would capture a date, so I used the SharePoint DateTimeControl from the Microsoft.SharePoint.WebControls namespace. I have used this several times before so I didn't think anything of it, until when testing I noticed the date format was US instead of UK. I have had this before, so I added some code in the page load of my application page to set the control's regional settings to match the current web's regional settings, see below.

Set DateTimeControl Region
    DtDueDate.LocaleId = Convert.ToInt32(SPContext.Current.Web.RegionalSettings.LocaleId);

This normally works; however, when I started testing it was still not setting the region to UK (2057), instead it remained set to US (1033). While debugging I noticed the SPContext details were indicating that the current web was the top level web, not the subsite from which the modal dialog was being launched.

This explained why the DateTimeControl was displaying in the wrong format, as the regional settings at the top level site were set to US. I changed the regional settings on the top level web and the control started displaying in UK format.

While I was happy this was working, I was slightly confused as to why the context was showing the current web to be the root web and not the web for the subsite. After some digging around I realised that although I was on the subsite in the main browser window, when I was calling the popup I was passing in a relative URL, see examples below, and this meant the context in the popup used this URL rather than the location from which it was being called.

Original URL of popup window where context is the root site

"/_layouts/ApplicationPages/Page.aspx"

Adjusted URL of popup window where context is the subsite

"/subsite/_layouts/ApplicationPages/Page.aspx"
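To avoid this, the URL passed to the modal dialog can be built from the current web's server-relative URL so the popup's context matches the web it was launched from. The helper below is only a minimal sketch of that idea: buildDialogUrl is a hypothetical name, and webUrl is assumed to hold something like _spPageContextInfo.webServerRelativeUrl ("/" for the root web, "/subsite" for a subsite).

```javascript
// Hypothetical helper: build the application page URL relative to the
// current web, so SPContext in the popup resolves to that web.
function buildDialogUrl(webUrl, layoutsPage) {
    // Trim any trailing slashes so the root web ("/") and subsites
    // ("/subsite/") both produce a clean path.
    var base = webUrl.replace(/\/+$/, '');
    return base + '/_layouts/' + layoutsPage;
}
```

For example, buildDialogUrl('/subsite', 'ApplicationPages/Page.aspx') produces the adjusted subsite URL shown above.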

This caused me some issues, but once I got my head around it, it made perfect sense, so hopefully this will help others, or help me in the future when I forget all about it.


Generate WSP and copy to another location on post build event in Visual Studio 2010

April 19, 2012

When working on SharePoint 2010 projects, deployments are generally done via PowerShell scripts. I usually have a standard PowerShell script which I copy and alter, updating items such as the site URL, solution name and feature ID, and I save this in a deployment folder under where the solution file is located. I then create a solution folder in Visual Studio called 'Deployment files' and add my PowerShell scripts to this. This way the deployment files are part of the solution and get added into our code repository as well.

In my PowerShell scripts I don't hard-code the WSP location; I find the current directory from which the script is being executed and then append the WSP name, see below. This means the WSP has to be in the same folder as the scripts, which isn't generally an issue, but it means I have to package the project in Visual Studio and then copy the WSP from the build folder to my scripts folder. While this only takes a few minutes, you need to do it each time, which can get a bit repetitive.

Find Solution Location
    $scriptpath = $MyInvocation.MyCommand.Path
    $dir = Split-Path $scriptpath
    $solutionName = "WSPNAME.wsp"
    $solutionPath = $dir + "\" + $solutionName

I decided to look into using post-build commands to see if I could generate the WSP and then copy it to the folder where my scripts are located. As I had done it several times before, I first looked at copying the WSP file to another location. This can be done as follows:

  1. Right click on the project and select Properties

[Screenshot: project properties window]

  2. Select the Build Events option
  3. In the post-build event command line window enter the script below:

     copy $(TargetDir)$(TargetName).wsp $(SolutionDir)DeploymentFiles

     • TargetDir is the full path to the build folder where the WSP will be created
     • TargetName is the same name as the project, so in my case if the project name was TestProject I would be looking for a file called TestProject.wsp
     • SolutionDir is the full path to the folder where the solution file is located
     • DeploymentFiles is simply the name of the folder located under the main solution folder where I keep my PowerShell scripts

  4. Save the file and build the project

If it has worked you should get a message in the Output screen indicating the file has been copied.

This was fine, except it simply copies the WSP from the bin folder to my deployment scripts folder; what I need is to ensure I am getting the latest version of the WSP, so I need to generate the WSP before copying the file. As this was not something I had done before, I did some research, and it seems the only way to do this is to edit the project file.

I opened my project file in Notepad++ (it might be worth taking a backup of this first), found the section towards the end of the file shown below, and added the suggested XML after it.

Section after which I need to add my additional XML
<Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\SharePointTools\Microsoft.VisualStudio.SharePoint.targets" />

Required XML
<PropertyGroup>
  <BuildDependsOn>$(BuildDependsOn);CreatePackage</BuildDependsOn>
</PropertyGroup>

What I found was that, as I had already added my post-build event to copy the WSP, there was already a PropertyGroup section in the project file, see below.

[Screenshot: project file XML with the post-build copy event]

At this point I wasn't sure whether I was supposed to add a new PropertyGroup element or add my BuildDependsOn section to the existing property group. I looked around on Google but there wasn't much detail on this, so I added the section to generate the WSP as another PropertyGroup element, placing it before my copy build event in case the order was important. When I opened Visual Studio (if you already had it open you will need to reload the project) and built the project, I found it did copy the file and it did build the WSP, but it did them in the wrong order: it first copied the WSP and then built a new version of it.

Obviously this isn't much use, as I would always end up with an old version of the WSP. I tried adding the post-build code which copies the file into the same property group, see below, but I still ended up with the same outcome, where the file was copied first and then a new version created.

XML with generate new WSP and move file in one property group in project file
[Screenshot: project file XML combining the post-build event and WSP generation]

I spent some time researching this issue on Google and found an article on how to generate a WSP in a post-build command, and noticed that I had <BuildDependsOn> in my version, whereas in the article it was <PostBuildEventDependsOn>. As soon as I changed this, see the new project XML below, things executed in the correct order: it first built my WSP and then copied it.

Final version of project file XML

<PropertyGroup>
  <PostBuildEventDependsOn>$(PostBuildEventDependsOn);CreatePackage</PostBuildEventDependsOn>
  <PostBuildEvent>copy "$(TargetDir)$(TargetName).wsp" "$(SolutionDir)DeploymentFiles"</PostBuildEvent>
</PropertyGroup>

I hope this will help others as it took me a while to get this right. As always please be careful with making changes to the project file as this can cause issues.


jQuery to see if the user is on the display, edit or new list item form

April 18, 2012

There have been a few situations where I have wanted to run some jQuery on a display, edit or new item form to change the layout of the page or hide certain fields. Typically I did this by checking the URL to see if I was on the list in question, see example below, and then running my jQuery.

    if (window.location.href.indexOf('/Lists/TestListName/') != -1) {
        alert('Am in list');
    }
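As a small refinement, this check can be wrapped in a function and made case insensitive, since SharePoint URLs are not always consistently cased. This is just a sketch; isInList is a hypothetical helper name.

```javascript
// Returns true if the given URL points somewhere inside the named list.
// Both sides are lower-cased so casing differences in the URL don't
// break the match.
function isInList(href, listName) {
    return href.toLowerCase().indexOf('/lists/' + listName.toLowerCase() + '/') !== -1;
}
```

It can then be called as isInList(window.location.href, 'TestListName').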

 

This works fine, but the condition is generally true whenever you are interacting with the list, e.g. looking at a particular view. I took a look around to see if there was anything out there which would allow me to check whether I was on one of the list forms, but I didn't find anything.

I spent some time looking at the HTML of the list forms to see if there was any way I could accurately check which page I was on. In the end I decided to use the ribbon, along with the breadcrumb, to target the type of form. When checking the HTML I noticed the display form had an element for the ribbon with an ID of 'Ribbon.ListForm.Display'. From my previous experience of working with the ribbon I knew this was a relatively safe check, as the element wouldn't appear on any other pages because it is specifically targeted at the List Forms ribbon options, see function below.

Check if display form
    function IsDisplayForm()
    {
        var isDisplayForm = false;

        var ribbonLiElement = document.getElementById("Ribbon.ListForm.Display");
        if ($(ribbonLiElement).length > 0) {
            isDisplayForm = true;
        }

        return isDisplayForm;
    }

While this worked for display forms, the solution for the edit and new forms was slightly more complex: although both of these have a ribbon element, it is 'Ribbon.ListForm.Edit' in both cases. This meant I needed another way to distinguish between a new and an edit form. After comparing the HTML I decided to use the current breadcrumb node, as the text for the two types of form is slightly different: for new forms it is 'New Item' and for the edit form it is 'Edit Item', see code below.

Check if edit form
    function IsEditForm() {
        var isEditForm = false;

        var ribbonLiElement = document.getElementById("Ribbon.ListForm.Edit");
        var currentBreadcrumbElement = $("span.s4-breadcrumbCurrentNode");
        if (currentBreadcrumbElement.length > 0 && currentBreadcrumbElement.text().toLowerCase() == "edit item"
            && $(ribbonLiElement).length > 0) {
            isEditForm = true;
        }

        return isEditForm;
    }

Check if new form
    function IsNewForm() {
        var isNewForm = false;

        var ribbonLiElement = document.getElementById("Ribbon.ListForm.Edit");
        var currentBreadcrumbElement = $("span.s4-breadcrumbCurrentNode");
        if (currentBreadcrumbElement.length > 0 && currentBreadcrumbElement.text().toLowerCase() == "new item"
            && $(ribbonLiElement).length > 0) {
            isNewForm = true;
        }

        return isNewForm;
    }
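The three functions above could also be collapsed into a single helper that works on the raw values, which keeps the DOM and jQuery lookups in one place and makes the logic easy to test. This is only a sketch of that idea: getFormType is a hypothetical name, and in the page you would pass in the ID of whichever ribbon element exists plus the breadcrumb text from $("span.s4-breadcrumbCurrentNode").text().

```javascript
// Classify the current page based on the ribbon element ID and the
// current breadcrumb text, following the same checks as the
// functions above.
function getFormType(ribbonId, breadcrumbText) {
    if (ribbonId === 'Ribbon.ListForm.Display') {
        return 'display';
    }
    if (ribbonId === 'Ribbon.ListForm.Edit') {
        var text = (breadcrumbText || '').toLowerCase();
        if (text === 'new item') {
            return 'new';
        }
        if (text === 'edit item') {
            return 'edit';
        }
    }
    return 'unknown';
}
```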

Conclusion

The easy option for this may have been to simply check the URL for NewForm.aspx, DispForm.aspx or EditForm.aspx, but I decided it would be better not to use this approach, as it is possible to create custom versions of these forms which would have different names.

The one area I have tested where this will not work is if you create a custom InfoPath form using the option in the list ribbon, so this is worth keeping in mind.

The other limitation is that I have hardcoded the text I'm testing against, so this will only work in English, but it could be changed for other languages.

As always, please check this works in a development environment and test it thoroughly before using it. Fingers crossed this will help some people.

