Tuesday, 15 July 2014

Optimizing your SharePoint Development VM for Solid State Drives (SSDs)

Once you upgrade to a Solid State Drive, there is no going back. Your VMs boot faster, your package deployments take mere seconds, and everything feels the way it should. Unfortunately, there is a major drawback to an SSD: its lifespan.
Traditional hard drives are measured by “Mean Time Between Failures” (MTBF), indicating the average lifespan of the drive in hours. As a reference point, the Western Digital Red NAS drive has an MTBF of 1 million hours.
Solid State Drives have a maximum number of times that each block can be written to. Once a block has reached its limit, the drive often fails, with data recovery being next to impossible. To mitigate this problem, SSD controllers use a technique called Wear Leveling to distribute which blocks are written to on any given I/O operation. From my personal experience, I have had SSDs die after approximately 12-13 months.
To help reduce unnecessary writes and improve your SSD’s lifespan, you can do the following. Please note that these recommendations are strictly for personal development VMs.
Disable Disk Defragmenter

Because SSDs can access data directly (unlike HDDs, which require seeking), defragmenting an SSD will not provide any performance improvement. However, it will cause a lot of unnecessary writes to your drive. Windows automatically disables defragmentation when it detects an SSD, but virtual machines don’t always know what physical storage their virtual hard disks are running on. Turn off scheduled defragmentation to play it safe:


  • Windows 2008 R2
    1. Win Key + R
    2. %windir%\system32\dfrgui.exe
    3. Ensure that Scheduled defragmentation is turned off
  • Windows 2012
    1. Win Key + R
    2. %windir%\system32\dfrgui.exe
    3. Change settings
    4. Uncheck Run on a schedule
    5. Save & Close
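If you prefer to script this, the scheduled defragmentation task can also be disabled from an elevated prompt. A sketch, assuming the default task path on Windows 2008 R2/2012:

```powershell
# Disable the built-in scheduled defragmentation task (run elevated).
schtasks /Change /TN "\Microsoft\Windows\Defrag\ScheduledDefrag" /Disable

# To re-enable it later:
# schtasks /Change /TN "\Microsoft\Windows\Defrag\ScheduledDefrag" /Enable
```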
Turn off SharePoint Logging
SharePoint ULS Logging is invaluable when troubleshooting a problem or observing a particular behavior, but otherwise the logs aren’t checked daily. I recommend turning on logging in your development environment only when needed. To disable diagnostic logging:

  1. Open Central Administration
  2. Navigate to Monitoring in the left navigation
  3. Click on Configure diagnostic logging
  4. Click on the checkbox next to All Categories
  5. Select None from both of the dropdowns at the bottom of the section
  6. Save and close
If you need to restore your logging, you can follow the same steps and select Reset to default from the dropdowns.
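The same toggle can be scripted from the SharePoint Management Shell on the VM itself; a minimal sketch:

```powershell
# Suppress ULS trace and Windows event log output for all categories.
Set-SPLogLevel -TraceSeverity None -EventSeverity None

# When you need logging back, reset every category to its defaults:
Clear-SPLogLevel
```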
Disable Search Indexing
Independent of SharePoint’s search engine, Windows has its own search index. Disabling Windows indexing can save some I/O, but it really comes down to personal preference. I like to keep my development environment well organized by client and project, and do not have files scattered all over the place. Disabling the index makes searches slightly slower, which isn’t a problem since my searches are scoped to a specific folder. To disable Windows Search indexing:

  1. Open Windows Explorer and navigate to Computer (Win 2008) or This PC (Win 2012)
  2. Right click on your Local Disk (C:) drive and click on Properties
  3. Uncheck Allow files on this drive to have contents indexed in addition to file properties
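Alternatively, you can stop the Windows Search service itself. Note this goes a step further than unchecking the drive attribute, as it disables indexing entirely:

```powershell
# Stop the Windows Search service and prevent it from starting on boot.
Stop-Service -Name WSearch
Set-Service -Name WSearch -StartupType Disabled
```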
If you have any other tips & tricks that have worked well for you, please let me know!

Monday, 14 July 2014

Workaround: A Hybrid Approach to Creating and Updating SharePoint 2013 Lists from Excel Sheets for Large Volumes of Data

Introduction


In many organizations, departments use Excel to maintain their records, data, and history, and nowadays SharePoint-based applications are often used to share and collaborate on these files. Many organizations use SharePoint sites as portals for sharing information and data within a department or across the organization.


Problem


So now let’s say your manager or client has an Excel sheet with valuable records, wants to put it on SharePoint as a list, and has asked you to do it. Now what will you do? You cannot use Excel Services or upload the Excel sheet as-is (even though you could), because the manager wants a SharePoint list only. As a SharePoint developer you might think: first create a SharePoint list, then write a tool that reads the Excel records one by one and adds them to the list. Right? But is that really needed when SharePoint and Office provide this out of the box?


Solutions


1. From Excel Sheet


Yes, you can quickly create a SharePoint list out of an Excel sheet without writing a single line of code. Let's see how. Note: I have performed these steps on SharePoint 2013 with Microsoft Office 2010.
Steps:
  1. Open the Excel 2010 application.
  2. Insert/create the table.
  3. On the Table Tools Design tab, click Export > Export Table to SharePoint List.
  4. In the export wizard, enter the URL of a SharePoint web application running in your environment, a name for the list, and an optional description.
  5. Press Next; a login screen may appear.
  6. Press OK. The next screen shows the columns, along with the data types, that are going to be created in the SharePoint list.
  7. Click Finish and wait until the operation completes. You will see that the list gets created in the SharePoint site with the records.

That's it.

   

2. Using the Import Spreadsheet App


SharePoint 2013 provides the 'Import Spreadsheet' app to do the same thing. Follow the steps below to import a spreadsheet.

  1. Go to Site Contents -> add an app, and search for the 'Import Spreadsheet' app.
  2. Add the 'Import Spreadsheet' app and provide the requested information: a list name, a description, and the Excel file to import. (Before that, save the Excel file with the data formatted as a table, as shown in Approach 1.)
  3. Click Import, wait a few seconds, and your spreadsheet gets imported into a list.

That also works fine.

But the problem is that in both of the above cases, the generated list is completely new, containing the data from the Excel sheet. How do you handle the scenario where there is an existing list and you need to update it from an external Excel sheet without losing the previously stored data? For this, we can programmatically read the newly created list and insert its items into our target list, which can be done with a simple visual web part and a button click. The click event handler and the function written for this purpose are given below; this is the workaround I applied to solve the problem.

protected void btnUpload_Click(object sender, EventArgs e)
{
    SPSecurity.RunWithElevatedPrivileges(delegate
    {
        try
        {
            LoadData();
        }
        catch (Exception ex)
        {
            Logger.Current.Log("Error: " + ex.Message);
        }
    });
}

public void LoadData()
{
    try
    {
        SPWeb spWeb = SPContext.Current.Site.RootWeb;

        // Source: the list that was generated from the Excel sheet.
        SPList spList = spWeb.Lists.TryGetList("MyList");

        SPQuery qry = new SPQuery();
        qry.Query = "<OrderBy><FieldRef Name='Created' /></OrderBy>";
        DataTable listItemsTable = spList.GetItems(qry).GetDataTable();

        // Target: the existing list that must keep its previously stored data.
        SPList oList = spWeb.Lists["Employee"];

        foreach (DataRow dr in listItemsTable.Rows)
        {
            SPListItem oSPListItem = oList.Items.Add();

            oSPListItem["Title"] = dr["Title"].ToString();
            oSPListItem["Designation"] = dr["Designation"].ToString();
            oSPListItem["Address"] = dr["Address"].ToString();
            oSPListItem["Salary"] = Convert.ToInt32(dr["Salary"].ToString());
            oSPListItem["Email"] = dr["Email"].ToString();
            oSPListItem["PhoneNumber"] = dr["PhoneNumber"].ToString();
            oSPListItem["Organization"] = dr["Organization"].ToString();

            oSPListItem.Update();
        }

        lblMessage.Text = "Data inserted successfully";
    }
    catch (Exception ex)
    {
        lblMessage.Text = ex.ToString();
    }
}




Finally, create a page in your web application and add the visual web part, giving you a visual interface to click and update the SharePoint list from Excel sheets dynamically.


Happy Share Pointing...!!!






Monday, 9 June 2014

Configuring Search in SharePoint Server 2013



Configuring Search in SharePoint Server 2013 is not a simple task. There are some critical options that need to be taken care of while configuring. Most of them are illustrated below:

From the Central Admin Home page click Manage service applications under Application Management…



In the Ribbon click New and select Search Service Application.



Name your Search Service Application and Select a service account





Next you’ll select or create an application pool for Search.  I’m just going to run on an existing app pool to conserve resources.  If this were production, I’d likely create a new app pool for both the Search Admin Web Service and the Search Query and Site Settings Web Service.






Click OK and wait…




Upon completion you’ll be presented with the following:





Note: the second time I configured the service app, I accessed Central Admin from a computer that wasn’t part of the farm and it appeared to hang.  I then browsed to the service applications screen; Search showed up as expected and everything worked as it should.
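If you prefer scripting, the same service application can be provisioned from the SharePoint Management Shell. A minimal sketch; the names, app pool, and database name are assumptions from my environment, and a full production topology needs more components:

```powershell
# Start the Search service instance on this server.
Start-SPEnterpriseSearchServiceInstance -Identity $env:COMPUTERNAME

# Reuse an existing application pool (name is an example).
$appPool = Get-SPServiceApplicationPool -Identity "SharePoint Web Services Default"

# Create the Search Service Application and its proxy.
$ssa = New-SPEnterpriseSearchServiceApplication -Name "Search Service Application" `
    -ApplicationPool $appPool -DatabaseName "SP2013_Search"
New-SPEnterpriseSearchServiceApplicationProxy -Name "Search Service Application Proxy" `
    -SearchApplication $ssa
```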





Now we shall focus on Content Sources. On the Search Administration page there are several links broken into titled categories.  The second group is titled Crawling.


“A content source is a set of options that you use to specify what, when and how to crawl.”

When Search is initially configured the content source “Local SharePoint sites” is created, and as the name implies this includes all SharePoint sites in your farm.  As you create additional web apps they are automatically added to this content source.  Another thing to note is that changing your default AAM will result in that URL being added to your content source in addition to whatever the original URL of your site was, so there may be need for cleanup. 

This is also good to know

“Changing a content source requires a full crawl for that content source”

To read more, see the TechNet article on adding, editing, or deleting content sources in SharePoint Server 2013.
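Content sources can also be managed from PowerShell; a sketch, where the content source name and start address are examples:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication

# Create a new SharePoint-type content source with an example start address.
New-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa `
    -Name "Intranet Sites" -Type SharePoint `
    -StartAddresses "http://intranet.contoso.local"
```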


Clicking on Content Sources will bring you to the Manage Content Sources page…


Clicking on the dropdown will result in the following menu appearing…



  
Clicking Edit or on the Name will bring you to the edit content source page.






From this page your initial options are to name your content source, view content source details and add or remove start addresses.  Keep in mind that the Edit and Add pages are basically the same.  Obviously, you are going to need to click the New Content Source button to get to the Add Content Source page etc…   A start address is the point from which the crawler will begin to crawl your site.  Typically Local SharePoint sites is going to have all of your web apps listed by default.
Crawl Settings really only applies when you are creating a content source, because once you have selected a setting you can’t change it.  When creating a content source you have the following options.




Switching between the first 4 of these really just changes the path requirement, as illustrated in the screen shot below.



However, Line of Business data and Custom Repository require significantly different information…  Line of Business Data requires you to select a BCS Service application which of course requires that you have BCS provisioned and a Service Application is connected to some LOB System. 
More information on both source types can be found in the TechNet documentation.






A Custom Repository requires that you have a Custom Connector registered.




Your only edit options are,

“Crawl everything under the hostname for each start address”

Or

“Only crawl the Site Collection of each start address”. 


The second option would be used if you want to crawl some site collections in a web app less or more often than others.  There are several factors that would go into a decision like this.  For instance, varying content change frequency between site collections.

The next section deals with Crawl Schedules, which has a new option: Enable Continuous Crawls.

“Enable continuous crawls is a crawl schedule option that is new in SharePoint 2013. It is available only for content sources that use the SharePoint sites content source type. A continuous crawl starts at set intervals. The default interval is 15 minutes, but you can set continuous crawls to occur at shorter intervals by using Windows PowerShell.”
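As the quote notes, the interval is adjustable only through PowerShell. A sketch; the 5-minute value is just an example:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication

# Turn on continuous crawls for the default content source.
$cs = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa `
    -Identity "Local SharePoint sites"
$cs.EnableContinuousCrawls = $true
$cs.Update()

# Shorten the continuous crawl interval from the default 15 minutes to 5.
$ssa.SetProperty("ContinuousCrawlInterval", 5)
```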



The familiar Incremental Crawl and Full Crawl scheduling options are next, both of which allow you to create a schedule.  Crawl schedules require a good bit of planning and are very much dependent on the specific needs of the environment.
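Schedules can likewise be set from PowerShell. A sketch of a daily incremental schedule; all the values are examples, not recommendations:

```powershell
$ssa = Get-SPEnterpriseSearchServiceApplication

# Incremental crawl every day, repeating every 60 minutes for 24 hours.
Set-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa `
    -Identity "Local SharePoint sites" -ScheduleType Incremental `
    -DailyCrawlSchedule -CrawlScheduleRunEveryInterval 1 `
    -CrawlScheduleRepeatInterval 60 -CrawlScheduleRepeatDuration 1440
```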

Last we have Content Source Priority.  Your options here are High and Normal.  The Crawl system uses this to prioritize crawling resources with High content sources being top priority.


From a Content Source’s drop down menu the View Crawl Log options is available.



Clicking on this will bring you to the crawl log.


This screen provides you with Average Crawl Duration and summary information.  This is where you go to assess the health of your content source crawls, and it is going to be your first stop when you need to troubleshoot content source issues.

Clicking on the number of errors will bring you to this page…



You will be taken to this same page, filtered appropriately, if you click on Warnings or Successes.  URL View allows you to search for crawled documents.  Databases provides a list of your crawl store databases and the number of items in each.  For more information I suggest reading this TechNet article…