In my previous posts about SharePoint I focused on how to improve read access to items stored in SharePoint lists. Now it's time to discuss how best to update existing items or add new ones. As the SharePoint Object Model provides a wide variety of interfaces, we can again choose between multiple approaches.

Scenario 5: Updating a large number of SharePoint list items

The most obvious way of adding or updating items in a SharePoint list is to call SPListItem.Update. You can obtain the list item either by querying for an existing item or by adding a new one via SPListItemCollection.Add.

Let’s have a look at the following sample:

// items is the SPListItemCollection of the list we want to fill
for (int itemIx = 0; itemIx < 100; itemIx++) {
  SPListItem newItem = items.Add();
  // fill all the individual fields with values
  newItem.Update();
}

Analyzing this code with dynaTrace shows us that EVERY call to the Update method actually calls the internal method SPListItem.AddOrUpdateItem which in fact calls a stored procedure to perform the task:

Each Update call needs a roundtrip to the database in order to add/update the list item

We can see that adding 100 items to my list took a total of 4.5 seconds.

Using batch updates instead of individual updates

If you have to update a larger number of items, it's highly recommended not to call the Update method on every item. Instead, use the batch update method ProcessBatchData provided by SPWeb.

ProcessBatchData takes batch method definitions in XML format. There is a nice article explaining how to use batch update. Implementing the sample from above by making use of batch updates looks like this:

StringBuilder query = new StringBuilder();
for (int itemIx = 0; itemIx < 100; itemIx++) {
  query.AppendFormat("<Method ID=\"{0}\">" +
          "<SetList>{1}</SetList>" +
          "<SetVar Name=\"ID\">New</SetVar>" +
          "<SetVar Name=\"Cmd\">Save</SetVar>" +
          "<SetVar Name=\"{3}Title\">{2}</SetVar>" +
       "</Method>", itemIx, listGuid, someValue, "urn:schemas-microsoft-com:office:office#");
}
SPContext.Current.Web.ProcessBatchData(
    string.Format("<?xml version=\"1.0\" encoding=\"UTF-8\"?>" +
        "<ows:Batch OnError=\"Return\">{0}</ows:Batch>", query.ToString()));

Adding the same 100 items via ProcessBatchData analyzed again with dynaTrace:

SPWeb.ProcessBatchData processes batch updates significantly faster than individual updates

Comparing both update approaches shows that we have a significant performance improvement with the batch update:

Performance Difference between ProcessBatchData and individual updates for 100 list items

Caution

Batch updates are highly recommended for larger updates, but please consider the overhead of creating the batch update XML:

  • Make sure you use a StringBuilder and not individual string objects that get concatenated
  • Split up batch update calls to keep the generated XML small enough to avoid out-of-memory exceptions. I ran into an OOM exception when executing the above example with 50,000 batch updates
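The chunking recommendation above can be sketched as follows. This is a minimal sketch, not the exact code I used: the chunk size of 1,000 is an arbitrary assumption you should tune for your environment, and BuildMethodXml is a hypothetical helper that returns a single <Method> element like the one built in the sample above.

```csharp
// Sketch: flush the batch every chunkSize methods so the XML string stays small.
// chunkSize = 1000 is an assumed value; tune it for your item size and memory budget.
const int chunkSize = 1000;
StringBuilder query = new StringBuilder();
int methodsInChunk = 0;

for (int itemIx = 0; itemIx < totalItems; itemIx++) {
    query.Append(BuildMethodXml(itemIx)); // hypothetical helper returning one <Method> element
    methodsInChunk++;

    // flush when the chunk is full or we reached the last item
    if (methodsInChunk == chunkSize || itemIx == totalItems - 1) {
        web.ProcessBatchData(
            string.Format("<?xml version=\"1.0\" encoding=\"UTF-8\"?>" +
                "<ows:Batch OnError=\"Return\">{0}</ows:Batch>", query.ToString()));
        query.Length = 0; // reset the builder for the next chunk (works on .NET 2.0/3.5)
        methodsInChunk = 0;
    }
}
```

Each flush still costs one ProcessBatchData roundtrip, so the chunk size is a trade-off between memory usage and the number of batch calls.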

As an alternative you can also use the UpdateListItems method of the Lists Web Service.
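For completeness, here is a rough sketch of what a call to UpdateListItems looks like. It assumes you have generated a web reference to http://<server>/_vti_bin/Lists.asmx under the (assumed) namespace ListsService; the server URL and the list name "MyList" are placeholders for your own values.

```csharp
// Sketch, assuming a generated web reference ListsService.Lists for Lists.asmx.
ListsService.Lists listsProxy = new ListsService.Lists();
listsProxy.Url = "http://yourserver/_vti_bin/Lists.asmx";           // placeholder URL
listsProxy.Credentials = System.Net.CredentialCache.DefaultCredentials;

// UpdateListItems takes the same kind of Batch/Method XML as ProcessBatchData,
// wrapped in an XmlNode instead of a string.
System.Xml.XmlDocument doc = new System.Xml.XmlDocument();
System.Xml.XmlElement batch = doc.CreateElement("Batch");
batch.SetAttribute("OnError", "Continue");
batch.InnerXml =
    "<Method ID=\"1\" Cmd=\"New\">" +
      "<Field Name=\"Title\">Item added via web service</Field>" +
    "</Method>";

System.Xml.XmlNode result = listsProxy.UpdateListItems("MyList", batch);
```

This is useful when your code runs on a machine that does not host SharePoint, since the web service works remotely while the Object Model does not.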

Conclusion

Use batch processing when updating a larger number of items. It will allow the database to process all your update requests at once and will significantly improve the overall performance of your SharePoint application.

Update: Read the summary post of my blog series about The Top SharePoint Performance Problems. As SharePoint is built on the .NET platform, you might also be interested in my latest white papers about Continuous Application Performance for Enterprise .NET Systems.