Tuesday, March 10, 2015

Office 365 headaches

Jumping into a new technology is tough.  I remember the first time I used PHP or C#, I couldn't fathom how it all worked together until I had run through several tutorials and Hello Worlds.

Office 365 is no different, probably worse: it's the same technology that I'm used to (SharePoint) utilized in an entirely different way than I'm used to.  Further, it's in constant flux.  Where I could use master pages to customize sites in the past, now I have to do it all with client-side JavaScript (admittedly more fun, but more work).  It was a tremendous pain, however, when I ran into this error trying to use a content editor web part to add JavaScript to the page:

This HTML cannot be inserted because you don’t have access to add scriptable Web Parts in this site. Please contact your administrator if you think you should have rights to do so.

There was nothing in the site settings or site collection settings to modify this. I could only find one spot in my searches that had the solution: in the admin web application for your tenancy there is a spot buried in the settings that lets you allow scripting on site collections.  

You also have to wait up to 24 hours for the change to take effect.

*sigh*

To get there from your site collection:

  • Click the squares in the top left, then click Admin.
  • At the bottom under the apps menu, click SharePoint.  (This takes you to "https://yoursite-admin.sharepoint.com/_layouts/15/online/TenantSettings.aspx"; you can also go there directly if you would like.)
  • Click Settings at the bottom of the navigation menu on the left.
  • On this page there is a Custom Scripts section about halfway down.  Enable "Allow users to run custom script on self-service created sites".
  • Click save and wait a day.
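If you would rather script this than click through the admin UI, the SharePoint Online Management Shell exposes the same switch per site collection.  This is only a sketch: the URLs are placeholders for your tenancy, and it assumes you have the SPO Management Shell installed and tenant admin rights.

```powershell
# Assumes the SharePoint Online Management Shell is installed;
# both URLs below are placeholders for your own tenancy.
Connect-SPOService -Url "https://yoursite-admin.sharepoint.com"

# Allow custom script on a specific site collection
# ($false = do NOT deny customization, i.e. scripting is allowed)
Set-SPOSite -Identity "https://yoursite.sharepoint.com/sites/yoursite" `
    -DenyAddAndCustomizePages $false
```

I haven't verified whether the scripted route is subject to the same propagation delay as the tenant-wide setting.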


It's always the simple things that seem to cause me the most trouble at first.

    ETA: at first I didn't have the web designer galleries available to me.  I gave myself full rights to my site.

    LESSON LEARNED: Everything takes forever to provision in O365

    Friday, January 2, 2015

    TROUBLESHOOTING RULE #1: Check HBSS

    McAfee's Host Based Security Software's Host-based Intrusion Protection System (HIPS) is the cause of more problems than I care to think about.

    When troubleshooting something strange, always check your HIPS (or equivalent virus-scanning) logs.

    Especially if that something weird throws permissions errors or no errors at all.

    Tuesday, December 30, 2014

    How to troubleshoot SharePoint Usage Reports

    We got our farm set up and running and after a few days went to check on the popularity trends.  The reports were coming up empty.

    Through my searching I discovered the following troubleshooting steps to diagnose the issue:


    1. Make sure the Usage and Health Data Collection was configured:
      1. Go to Central Admin - Monitoring - Configure Usage and Health Data Collection.
      2. Validate that Enable Usage Data Collection was checked.  
      3. Validate that Events were selected to log.
      4. Validate the Log File Location exists (SAVE THIS FOR THE NEXT STEP), it's usually "c:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\LOGS\" 
      5. Validate the Database Server and Name of database are set and correct
    2. Verify that the WSS_Admin_WPG and WSS_WPG groups both have full access to the Log File Location (FROM ABOVE) on all the servers.
    3. Make sure both the Microsoft SharePoint Foundation Usage Data Import and Microsoft SharePoint Foundation Usage Data Processing timer jobs are enabled and able to run successfully.
      1. The easiest way to get there is to go to Central Admin - Monitoring - Configure Usage and Health Data Collection and click the Log Collection Schedule link.
      2. This will take you to the Timer job definitions.  
      3. Make sure both definitions are enabled; the data import job should run every 5 minutes and the data processing job daily.
      4. Validate that they run by clicking Run Now and monitoring the running jobs and job history.
    4. Check the WSS_Logging database's permissions (WSS_Logging being the database name configured in the usage service application); verify that the web application service account and the farm account both have ownership rights over it.
    5. Check the contents of the WSS_Logging database; verify the request tables are populated with data.
    6. Validate that there are receivers configured per this blog post: http://geekswithblogs.net/bjackett/archive/2013/08/26/powershell-script-to-workaround-no-data-in-sharepoint-2013-usage.aspx.  The post includes a fix if they don't exist.
    7. If none of this works, you can try adding a SQL server account to the database authentication, but every time I set this it reverted...you may only be able to set this on initial configuration.
    8. Check to make sure that data is being generated in the Log File Location in the RequestUsage folder.  There should be a .tmp file, and occasionally a .usage file will appear.  After a few minutes the .usage files will disappear as they are imported into the Event Store (every time the data import job runs).
    9. Check the EventStore and verify information is being created there.  It's found on the analytics server (check the topology in Search Administration), usually in C:\Program Files\Microsoft Office Servers\15.0\Data\Office Server in a folder named Analytics with a GUID; you can also try looking at the file shares on the active analytics server.  It won't be identical to what is in the usage folder, because some of that information gets filtered out.
      1. Check to see if there's anything there
      2. Make sure file sharing is enabled between the servers to the analytics processing server and that the WSS groups have write access in both file sharing and NTFS permissions to the EventStore.
    A note for all of these changes: nothing will show in the reports until the processing job runs, and often you'll have to wait overnight to see if your changes worked.
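    The timer-job and receiver checks above can be done from the SharePoint Management Shell instead of clicking through Central Admin.  A rough sketch (the job name is the standard one, but verify it against your own farm):

    ```powershell
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    # List the usage-related timer jobs and whether they're enabled
    Get-SPTimerJob | Where-Object { $_.Name -like "*usage*" } |
        Select-Object Name, IsDisabled, LastRunTime

    # Check which usage definitions have receivers configured (step 6 above)
    Get-SPUsageDefinition | Select-Object Name, Enabled, Receivers

    # Kick off the import job on demand instead of waiting for its schedule
    $job = Get-SPTimerJob "job-usage-log-file-import"
    $job.RunNow()
    ```

    Remember that even after forcing the import, the reports themselves won't change until the processing job has run.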

    In the end, I noticed the EventStore hadn't generated anything in a couple of weeks.  I added permissions as described above to the log folder and the EventStore, enabled file sharing (port 445) and its permissions, and made sure the database permissions were set.  After all that, still nothing worked.

    After setting all of that, I deleted the service application from Manage Service Applications and recreated it by going to Monitoring - Configure Usage and Health Data Collection and checking the Enable usage data collection box.  The next day, we had results.

    When in doubt, kick it.

    For more info, here's an interesting blog post on how the data collection works with the EventStore: http://blogs.msdn.com/b/spblog/archive/2014/04/03/sharepoint-2013-usage-analytics-the-story.aspx 

    Monday, December 8, 2014

    More SharePoint 2013 UserProfileService Oddities....

    We are using some of the organizational AD attributes slightly differently from how they were intended, based on our user base.  For example, we used the "company" attribute for what our client calls "department," and the "office" attribute for what they wanted called "organization."  We mapped these attributes in the user profile properties to the appropriate names (changing "office" to "Organization" -- noting that the internal SharePoint name is still "Office").  The "Profile Info" web part does some interesting things with this information, however.  By default, it displays:

    [Title]
    [Company]

    Email [email]
    Phone [work phone]
    Office [Office]

    As far as I can tell, you cannot change this behavior.  The labels appear to be hard-coded in the web part, so if you change the properties' display names it will still show "Email," "Phone," and "Office."  If I go into the "Office" property and change the display name, the change is not reflected in this web part.  By default, this property has "Show in users' profile info" un-checked.  If you check it, the web part adds a link below the information displayed above that says "display more."  Clicking that link displays the "Office" property using the name you edited; for instance, it will look like:

    CIO
    Contoso

    Email bart.loesley@contoso.org
    Phone 555-555-5555
    Office IT

    Organization IT

    Why even let us change the property name?

    Friday, December 5, 2014

    SharePoint 2013 userdisp.aspx issue

    Our farm is set up so that internally our web applications use http with NTLM, mainly for the search crawler. 

    Externally, http isn't allowed into the environment, everyone has to connect through https. They also authenticate using smart cards via a trusted identity provider.

    Our client is ancient...they were still using IE 8.  Every time a form made a call to userdisp.aspx via JavaScript to populate some fields, I'd get a pop-up saying I was trying to access insecure content.  I did some drilling down, and it turned out that the picture being displayed through userdisp.aspx was being accessed over http (the URL displayed is http://mysiteurl:80/User Photos/Profile Pictures/[mysite identifier]_MThumb.jpg?t=[picture timestamp]).

    In IE 10 this isn't quite as big a deal, at least to us: we're currently not trying to pull the picture through AJAX for anything, and IE 10 doesn't pop up that insecure-content error anymore.

    I've duplicated this issue in another environment where the authentication was all NTLM. I've set the https for mysites to be the default AAM, as well as the default portal URL to be https. I've tried setting the mysite url in our UPS to be https. Nothing I've done has been able to get the userdisp.aspx page to display the picture using https.

    From looking at the userdisp.aspx page in the hive, it looks like it's pulling the data from the UPS and displaying it programmatically (you don't see exactly what is being pulled; it looks like it just enumerates and dumps what it finds).  However, when I go into the UPS via PowerShell to see what is set for my picture, it comes up as https.  It's almost as though it's actively replacing what's in the profile with http.
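    For anyone who wants to repeat the PowerShell check against the UPS, this is roughly what I mean.  The site URL and the claims-encoded account name are placeholders; swap in your own:

    ```powershell
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
    [Void][Reflection.Assembly]::LoadWithPartialName("Microsoft.Office.Server")

    # Any site in the farm works for resolving the service context
    $site = Get-SPSite "https://yoursiteurl"        # placeholder
    $ctx = Get-SPServiceContext $site
    $upm = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($ctx)

    # Claims-encoded account name is a placeholder
    $profile = $upm.GetUserProfile("i:0#.w|domain\user")

    # In my farm this comes back as an https URL,
    # even though userdisp.aspx renders it as http
    $profile["PictureURL"].Value
    $site.Dispose()
    ```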

     Has anyone else run into this before? Is there a configuration I've missed somewhere?

    Tuesday, November 4, 2014

    PowerShell Script to remove all permissions for SharePoint user

    Threw this PowerShell script together to wipe out a user from your farm...I have it running in conjunction with other scripts that disable the user in Active Directory.


    Add-PSSnapin Microsoft.SharePoint.Powershell -ErrorAction SilentlyContinue
    [Void][Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
    
    $claimsPrefix = "[Prefix for the user you're removing from the farm]"
    $site = New-Object Microsoft.SharePoint.SPSite("[a site]");
    $service = Get-SPServiceContext $site;
    $site.Dispose();
    
    $upm = New-Object Microsoft.Office.Server.UserProfiles.UserProfileManager($service);
    
    $user = Get-ADUser -Identity [THE USER]
    
    #I use email as the identity in my farm, thus this line
    
    $mail = Get-ADUser $user -Properties mail | Select -ExpandProperty mail
    $profile = $upm.GetUserProfile($claimsPrefix+$mail);
    
    #Delete MySite
    
    Remove-SPSite $profile.PersonalSite -Confirm:$false;
    
    #Delete Profile
    
    $upm.RemoveUserProfile($claimsPrefix+$mail);
    
    $webApps = Get-SPWebApplication -IncludeCentralAdministration
    foreach ($webApp in $webApps){
    
        #Remove explicit rights given in web application policies
        $webApp.Policies.Remove($claimsPrefix+$mail);
    
        $siteCollections = Get-SPSite -WebApplication $webApp -Limit All
    
        foreach($site in $siteCollections){
            $spuser = Get-SPUser -Web $site.RootWeb -Identity ($claimsPrefix+$mail);
    
            #Remove from site collection admins
            $site.RootWeb.SiteAdministrators.Remove($spuser.LoginName);
            foreach($web in $site.AllWebs){
                $spuser = Get-SPUser -Web $web -Identity ($claimsPrefix+$mail);
                foreach($group in $web.Groups){
                    
                    #Remove from all groups in website
                    $group.RemoveUser($spuser);
                }
                if ($web.HasUniqueRoleAssignments -eq $true){
    
                    #Remove any explicit role assignments
                    $web.RoleAssignments.Remove($spuser);
                }
                foreach($list in $web.Lists){
                    if ($list.HasUniqueRoleAssignments -eq $true){
    
                        #Remove any explicit list role assignments
                        $list.RoleAssignments.Remove($spuser); 
                    }#list
                }#lists
            }#webs
        }#sites
    }#webapps
        
    

    SQL Backup Job

    In the past I've made jobs that iterate through every user database and run a backup query.  The problem with this is it doesn't give much feedback about what's going on.  I decided to expand on this concept and have the job create new steps for every new database that's added.  The job history will give more detail about which step failed while allowing every step to complete.  Anyway, here's what I've come up with (note: you'll have to create the job first, then get the job id; you should also add a terminating step before running the first time):


    DECLARE @strDBName VARCHAR(200)
    DECLARE @strSQL NVARCHAR(1000);
    
    DECLARE myCursor CURSOR FOR
    SELECT NAME AS DB
    FROM sys.databases
    WHERE state_desc='ONLINE'
        AND name NOT IN ('master', 'model', 'msdb', 'tempdb');
    
    OPEN myCursor
    FETCH NEXT FROM myCursor INTO @strDBName
    WHILE @@FETCH_STATUS = 0
    BEGIN
    IF NOT EXISTS
        (
        SELECT 1
        FROM msdb.dbo.sysjobs j
        JOIN msdb.dbo.sysjobsteps js ON js.job_id = j.job_id
        JOIN master.dbo.sysservers s ON s.srvid=j.originating_server_id
        WHERE js.command LIKE '%' + @strDBName + '%'
        )
        BEGIN
            DECLARE @stepname NVARCHAR(100);
            SET @stepname = N'Backup ' + @strDBName;
            DECLARE @commandtext NVARCHAR(2000);
            SET @commandtext = N'DECLARE @date VARCHAR(10);
    DECLARE @path NVARCHAR(200);
    SET @date=format(getdate(),''yyyyMMdd'');
    SET @path=N''[YOUR BACKUP PATH]' + @strDBName + ''' + @date + ''.bak'';
    BACKUP Database [' + @strDBName + ']
        TO DISK = @path
        WITH FORMAT';
            EXEC msdb.dbo.sp_add_jobstep @job_id='[THE ID OF THIS JOB]',
                @step_name=@stepname,
                @step_id=2,
                @subsystem='TSQL',
                @command=@commandtext,
                @database_name=@strDBName,
                @on_success_action=3,
                @on_fail_action=3
        END
        FETCH NEXT FROM myCursor INTO @strDBName
    END
    CLOSE myCursor
    DEALLOCATE myCursor
    DECLARE @laststep int;
    SELECT @laststep = count(*)
    FROM msdb.dbo.sysjobs j
    JOIN msdb.dbo.sysjobsteps js ON js.job_id = j.job_id
    JOIN master.dbo.sysservers s ON s.srvid=j.originating_server_id
    WHERE j.job_id = '[THE ID OF THIS JOB]'
    
    EXEC msdb.dbo.sp_update_jobstep @job_id='[THE ID OF THIS JOB]',
        @step_id=@laststep,
        @on_fail_action = 2,
        @on_success_action = 1;
    

    One note: the first time you run this it will create the steps, but it will only run the steps that existed when the job started running, so the next time the job runs it will use all of the steps.

    One thing I still want to work out: if a DB goes offline or is removed, I should remove that step...but that's for later.
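    A sketch of what that cleanup might look like, under the same assumptions as the script above (the job id is still a placeholder).  Since sp_delete_jobstep renumbers the steps that follow a deleted one, walking the candidates in descending step_id order keeps the remaining ids valid:

    ```sql
    DECLARE @stepId INT;
    DECLARE @strDBName SYSNAME;

    -- Find backup steps whose target database is gone or no longer online.
    -- Descending order so earlier deletes don't shift the ids we still hold.
    DECLARE cleanupCursor CURSOR FOR
    SELECT js.step_id, js.database_name
    FROM msdb.dbo.sysjobsteps js
    WHERE js.job_id = '[THE ID OF THIS JOB]'
        AND js.step_name LIKE 'Backup %'
        AND js.database_name NOT IN
            (SELECT name FROM sys.databases WHERE state_desc = 'ONLINE')
    ORDER BY js.step_id DESC;

    OPEN cleanupCursor
    FETCH NEXT FROM cleanupCursor INTO @stepId, @strDBName
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC msdb.dbo.sp_delete_jobstep
            @job_id = '[THE ID OF THIS JOB]',
            @step_id = @stepId;
        FETCH NEXT FROM cleanupCursor INTO @stepId, @strDBName
    END
    CLOSE cleanupCursor
    DEALLOCATE cleanupCursor
    ```

    I haven't run this in anger yet; check the job's on-success/on-fail chain after deleting steps, since sp_delete_jobstep adjusts the surrounding step references.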