Follow me on Twitter @AntonioMaio2

Monday, October 12, 2015

Securing Office 365 with Activity Monitoring

Thanks to everyone who attended my webinar last week on Securing Office 365 with Activity Monitoring.  We had a great turnout, and the slides presented can be found here:

Securing information systems is a very broad topic.  Monitoring and auditing these systems, and in particular the activities of users, is just one important aspect of securing our corporate IT environments. 

In July of this year, Microsoft announced new Activity Monitoring and Reporting capabilities within Office 365.  These capabilities are designed to help organizations that are continually facing challenges with security, privacy and compliance.  In running and supporting the Office 365 service themselves, Microsoft has found that they're capturing large amounts of data about which activities end users and administrators are performing.  They typically refer to this data as telemetry, and they've built great mechanisms into Office 365 to allow them to efficiently capture (and now share) this telemetry data.

These new capabilities provide greater visibility for administrators, and ultimately compliance and risk officers, into the actions taken by users on corporate content. They also allow us to apply greater access control over data, and if needed, they give us the capability to now investigate (at a very detailed level) user actions that might be against corporate or regulatory policies.

Why is Monitoring Activity and Auditing our Systems Important?

Monitoring user activity and auditing our information systems is important for many reasons.  

Regulatory Compliance

Regulatory compliance requirements are one key driver.  For example, many financial institutions deal with MNPI, or Material Non-Public Information. Generally, this is information that's not distributed to the public that an investor would likely consider important in making an investment decision.  Many institutions must put up compliance walls to ensure that specific parts of the business don't communicate with each other - this helps to avoid conflicts of interest and helps to ensure that they don't inappropriately exchange MNPI.

In particular, this is required in institutions which have both a corporate-advisory unit and a brokering unit, in order to separate those people giving corporate advice on takeovers from those advising clients about buying shares.  The wall is thrown up to prevent leaks of internal corporate information, which could influence the advice given to clients making investments.

Detailed monitoring and auditing of user activity gives us a detailed view into which users are accessing sensitive content and who they're sharing it with, and it provides assurance that our regulatory compliance obligations in these business scenarios are being met.

Investigating Data Breaches
We've heard a lot about data breaches in recent years.  Data breaches can be small or they can be very large. They can be malicious or they can be accidental.  As well, data breaches can be caused by external actors like cyber criminals, or by insiders like system administrators or employees with broad levels of access.  Generally, we tend to see data breaches caused more often by external actors, but breaches caused by insiders tend to involve larger quantities of data or more significant data. When data breaches do occur, it's important for organizations to investigate and find the root cause so that they can both measure the scale of the breach (i.e. how much data was leaked) and put in place measures to prevent these breaches in the future.

When data breaches occur as a result of an insider threat, monitoring user activity at a detailed level allows us to perform investigations and root cause analysis to determine exactly who accessed data, when it was accessed and which actions were taken on that data - like who it was shared with.

Audit Access to Sensitive Information
In many organizations it’s important to audit the current access controls in place for sensitive content. This is sometimes referred to as re-certifying permissions, or getting data owners to review and sign off that permissions are accurately set for data that they are responsible for.  In large organizations with diverse information systems it can be really difficult to identify who is responsible for different data repositories.

Monitoring user activity at a detailed level allows us to gain insight into who is accessing data on a regular basis, along with the level of access that they have.  This can greatly help us in identifying data owners to ultimately review and re-certify permissions.


Office 365 Activity Monitoring and Reporting

The new activity monitoring and reporting capabilities include:
  • Office 365 Activity Report (built into the Office 365 experience)
  • Comprehensive Event Logging
  • Search PowerShell Cmdlet
  • Management Activity API (in preview)

1. Office 365 Activity Report

You can access and run the Activity Report by:

  • Logging into your Office 365 tenant
  • Navigating to Admin in the App Launcher > Compliance Center > Reports > Office 365


[Activity Report Screen Shot]

You can use the Office 365 activity report to view detailed user and administrator activity in your tenant.  It contains data across SharePoint Online, OneDrive for Business, Exchange Online and Azure Active Directory.  You can use this report to search and investigate user activities by searching for a user, a file or folder, or even a site.  You can filter based on a date range or type of activity.  And within the report window you can view details of each activity in the Details Pane. The report is available to run on demand as needed.

When you find what you're looking for, you can either review activities and details right within this window or you can download the list of activities to a CSV file.

With each event captured there are up to 37 different properties logged.  Not all properties apply to all Office 365 services.  Some only apply to SharePoint Online and OneDrive for Business, whereas others only apply to Exchange.  The list of properties captured is shown here, with my favorites highlighted in red.  My favorites include data like:

  • Actor - The user that performed the action; can be a service principal
  • ClientIP - The IP address of the device that was used when the activity was logged. The IP address can be either IPv4 or IPv6.
  • EventSource – Identifies that an event occurred in SharePoint, OneDrive for Business or the ObjectModel.
  • LogonType – Applies to Exchange only; this is the type of user who accessed an Exchange mailbox: mailbox owner, administrator, delegate, the Exchange Transport Service, a service account or a delegated administrator.
  • Subject – Applies to Exchange only; this is the subject line of the message that was accessed.
  • UserSharedWith – The user that a resource was shared with.
  • UserType - The type of user that performed the operation: a regular user, an administrator in your Office 365 tenant or a Microsoft data center administrator.

You can see documentation on the full list of properties here:

2. Comprehensive Event Logging
In order to enable the Activity Report and make it really useful, events related to user and administrator activities are logged as users work across SharePoint Online, OneDrive for Business, Exchange Online and Azure Active Directory.

Currently there are over 150 events that are logged, and these are divided into 9 categories:

  • Exchange admin events
  • Exchange mailbox events
  • File and folder events (SharePoint and OneDrive for Business)
  • Invitation and access request events (SharePoint and OneDrive for Business)
  • Sharing events (SharePoint and OneDrive for Business)
  • Site administration events (SharePoint and OneDrive for Business)
  • Synchronization events (SharePoint and OneDrive for Business)
  • Azure Active Directory events (Admin Activity and User Login)


You can view documentation on the full list of events here:

The events logged are diverse and very comprehensive, with Microsoft continually working to log more events.  When it comes to investigating data leaks, this gives administrators very detailed investigation capabilities to determine how leaks occur and how to prevent them.

3. Search PowerShell Cmdlet

You can also use PowerShell to search the activity logs that we’ve been looking at.  There is a new PowerShell cmdlet to search all the event logs based on date range, the user who performed an action, the type of action, or the target object.

Examples of using this cmdlet are:

Search-UnifiedAuditLog -StartDate "September 1, 2015" -EndDate "September 30, 2015"

Search-UnifiedAuditLog -StartDate 9/1/2015 -EndDate 9/30/2015 -RecordType SharePointFileOperation -Operations FileViewed -ObjectIds docx

With this capability we can script our searches of the event logs.  We can also have these searches output the results to a file.  And ultimately, this can allow us to schedule our reports to occur automatically on a regular basis so that administrators or infosec people can get insight into specific activities either every morning, every week or whenever the business schedule demands.
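
As an example, here is a minimal sketch of a search that could be saved as a .ps1 file and run on a schedule with Windows Task Scheduler; the date window, record type, result size and output path are just placeholder values:

# Sketch only: export the last 7 days of SharePoint file activity to a CSV report.
# Assumes a remote PowerShell session to Exchange Online is already established.
$end   = Get-Date
$start = $end.AddDays(-7)
$results = Search-UnifiedAuditLog -StartDate $start -EndDate $end -RecordType SharePointFileOperation -ResultSize 5000
$results | Export-Csv -Path "C:\Reports\SPO-FileActivity.csv" -NoTypeInformation
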


4. Management Activity API (in limited preview)
The final capability provided with this release is a new Management Activity API, which allows developers to integrate Office 365 activity and event data with either internal tools or with 3rd party monitoring and reporting solutions.

Full documentation on the Management Activity API can be found here:

There are a couple of important points about the API:

  • This API is in limited preview now, and during the preview anyone can use the API, but only those registered with Microsoft will be able to actually retrieve data from Office 365.
  • Actions and events are stored in content blobs in a database, and they are gathered across multiple servers and datacenters. As a result of this distributed collection process, the actions and events contained in the content blobs will not necessarily appear in the order in which they occurred. One content blob could contain actions and events that occurred prior to the actions and events contained in an earlier content blob.
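
Putting those points together, here is a conceptual sketch of polling the API from PowerShell.  This assumes you've registered with Microsoft for the preview, already have an OAuth access token in $token and your tenant ID in $tenantId, and have started a subscription for the content type you want; the endpoint and contentType values are taken from the preview documentation and may change:

# Conceptual sketch only: list available content blobs and pull their events.
$headers = @{ Authorization = "Bearer $token" }
$uri = "https://manage.office.com/api/v1.0/$tenantId/activity/feed/subscriptions/content?contentType=Audit.SharePoint"
$blobs = Invoke-RestMethod -Uri $uri -Headers $headers -Method Get

# Each entry points to a content blob; because blobs are not returned in event order,
# sort the retrieved events by their creation time before processing them.
$events = $blobs | ForEach-Object { Invoke-RestMethod -Uri $_.contentUri -Headers $headers } | Sort-Object CreationTime
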


Enjoy.
   -Antonio

Friday, October 9, 2015

Data Visualization Options in SharePoint and Office 365

A big thank you to everyone who attended my presentation last week in Houston on Data Visualization Options in SharePoint 2013 and Office 365.  We had a great turnout for our round table presentation with a lot of great dialog and questions.

You can view the presentation deck from our session here:


To summarize a few points from the session, Microsoft has several great data visualization tools available for SharePoint, including:

  • Excel Services
  • Power Pivot
  • SSRS
  • Power View
  • Power BI
  • Custom code with JavaScript
  • Datazen
  • PerformancePoint Services
  • Visio Services

Our presentation did not cover PerformancePoint Services or Visio Services due to the relatively low usage we see of those components.

Knowing which tool to use in which scenario can be really challenging.  So as part of the presentation we talked about the 3 questions you need to ask yourself when choosing a tool, and we went through the following use cases to help you decide the best tool for the job.  To recap that information...

The 3 questions to ask yourself when selecting a SharePoint data visualization tool:

  • What do you want to do with your data? Do you want to create reports, build dashboards, perform data analysis, or do data discovery?
  • Which devices are users consuming data on?  Are they using desktops, tablets, smart phones?
  • Where is your data located?  Is your data on-premises or in the cloud?


The various use cases we went through were the following, with our recommended data visualization tool.


Use Case 1

I am an Excel pro.  I have a lot of data.  I have SharePoint on-prem… and I need to provide and share info with many users on the intranet.

Recommended tool: Power Pivot


Use Case 2

I have SharePoint on-prem.  I want users to do data analysis and discovery on the intranet (SharePoint 2010/2013) on their own.

Recommended tool: Power View


Use Case 3

I have SharePoint on-prem, and I need to provide reports to business users on my intranet which they will print.  I would like power users to be able to create reports.

Recommended tool: SQL Server Reporting Services (SSRS)


Use Case 4

I have both Office 365 and SharePoint on-prem, do not have Power View, cannot use my on-prem data in the cloud, but still need to do some kind of visualization.

Recommended tool: JavaScript code using standard JavaScript libraries (D3.js, Chart.js)


Use Case 5

I am using Office 365, have my data on-prem and want users to be able to use different devices to do data discovery on the data.  I want the users to create these “reports” themselves.

Recommended tool: Power BI


Use Case 6

I have on-prem SharePoint, the data is in lists, and I need to create responsive dashboards that work on many different devices.  I want my users to create and consume these.

Recommended tool: Datazen


Please let me know if you have any questions.
   -Antonio

Wednesday, October 7, 2015

How does Microsoft Protect Our Data in Office 365?

I’ve received this question many times over the last year – clients who are considering Office 365 to store their corporate data asking:

How does Microsoft really protect our data as it sits within their data center?

Given the nature of my past security work, this is always a question that I’m happy to share the details about.  I often start by telling people that Microsoft has implemented an extremely robust, multi-layered security strategy for protecting data at rest in Office 365.  That sounds great, but what does that really mean?

Well, specific to SharePoint, OneDrive for Business and other solutions, Microsoft uses a multi-level encryption strategy with keys that are rotated (i.e. regenerated) on a regular basis.  Actually the strategy is broader than that – it uses a combination of multiple levels of encryption, automatic key rotation, random distribution of data, drive-level encryption and data spread across multiple systems, each with their own network, OS, malware and physical protection.

In the on-premises world, SharePoint data sits within content databases inside SQL Server.  You can certainly configure SSL communication between clients and SharePoint and between SharePoint and SQL to secure data in motion.  You can even enable Transparent Data Encryption (TDE) within SQL Server to secure your SharePoint data while at rest within the database (a minimal sketch of that on-premises option follows below).  However, in the online world, how exactly does Microsoft use encryption and other techniques to protect our corporate information?
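
As referenced above, here is a minimal sketch of enabling TDE on an on-premises SharePoint content database.  The server, certificate, password and database names are placeholders, and you should follow Microsoft's full TDE guidance (including backing up the certificate and its private key) before trying this outside of a test environment:

# Sketch only: enable TDE on a SharePoint content database using the SQL Server PowerShell tools.
Invoke-Sqlcmd -ServerInstance "SQL01" -Query @"
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPasswordHere>';
CREATE CERTIFICATE TdeCert WITH SUBJECT = 'TDE certificate for SharePoint content databases';
USE [WSS_Content];
CREATE DATABASE ENCRYPTION KEY WITH ALGORITHM = AES_256 ENCRYPTION BY SERVER CERTIFICATE TdeCert;
ALTER DATABASE [WSS_Content] SET ENCRYPTION ON;
"@
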

Let’s look at how the strategy is applied in detail to your content within SharePoint and OneDrive for Business:

  • Files within SharePoint Online and OneDrive for Business are shredded into fragments and each fragment is encrypted with a different key, using AES 256 bit crypto.  When files are modified, each delta is encrypted with a different key.
  • Encrypted fragments are randomly distributed and stored across multiple Azure storage accounts.  These storage accounts are generated on demand and stored in separate systems.
  • The keys used to encrypt fragments are regenerated once per day (key rotation).
  • These keys are themselves encrypted using a master key that is specific to the customer.
  • The master key is stored in a highly secured and monitored “key store” which is completely separate from SharePoint content databases.  The key store is the most secured asset in the Microsoft data center.
  • The keys used to encrypt fragments, which are themselves encrypted with the master key, are stored in the SharePoint & OneDrive content databases along with a map to the fragments.
  • Microsoft also uses BitLocker to encrypt all of the disks on all systems.

So let’s consider scenarios where the data center is attacked:

  • If a content database is attacked, the attacker only gets access to a bunch of keys, which are encrypted and therefore unusable, and a map to the encrypted chunks that are stored in a different system with its own protections (the Azure storage accounts).  
  • If the Azure Storage Accounts are attacked, the attacker only gets access to a bunch of random fragments, which are encrypted… and did I mention that the distribution of those fragments is random.  Even if they could decrypt the fragments, they will not be able to put a file back together due to the random distribution.
  • Again, the key store is the most secure asset in the Microsoft data center.  Even if an attacker could get to the key store and attack it, at most they can get a key.  
  • If the physical environment is attacked, and drives are physically removed and stolen, none of the data on the disk will be accessible due to BitLocker drive encryption.

Keep in mind that physical protections, network-level protections, malware protections and internal procedures that strictly limit data center access to internal employees are also in place.  In addition, Microsoft works every day to improve the security of their Office 365 offering by attacking and defending their own environments:

  • They have a dedicated RED team, which sits outside of the Office 365 environment, whose job is to constantly attack the Office 365 environment looking for vulnerabilities and holes.
  • They have a dedicated BLUE team, which sits within the Office 365 environment, whose job is to constantly defend the Office 365 environment, looking for ways to better protect our data from would-be attackers.


Don’t those sound like the coolest jobs in the world?!

In The Future…

The next thing that Microsoft is working on to further enhance this strategy is to allow customers to bring their own master key, so that even if an insider wanted to access your data or a government request is made to Microsoft to access your data, Microsoft will not be able to retrieve it themselves.

You can find a great video on this topic here:  http://www.microsofttrends.com/2014/05/26/technical-details-on-office-365-fort-knox-encrypted-storage/.

As well, there was a great session at the Microsoft Ignite conference on this topic here:  https://channel9.msdn.com/Events/Ignite/2015/BRK3182.

Enjoy.
   -Antonio

Friday, October 2, 2015

UPDATED: Changing the SharePoint 2013 Farm Administrator Password

I wrote this post earlier this year regarding how to change the SharePoint 2013 farm admin password, and today I found an interesting situation that required a couple of extra steps.  You can find these extra steps below in red.  Big thanks to Doug Hemminger (@DougHemminger) for his assistance in finding a solution.

When setting up a new SharePoint 2013 farm, as a best practice we typically create service accounts for very specific purposes. The idea here is that we deploy SharePoint 2013 using a least privileged model, where very specific service accounts are created for very specific purposes and those accounts are only granted the permissions required to fulfill that purpose. That way, if such a service account is compromised by a malicious user, that user does not gain access to the entire farm. One such account is the SharePoint 2013 farm account.

When creating these service accounts, for various reasons, we typically create a domain account in Active Directory and configure it such that the passwords do not expire. As well, we find that the passwords for these service accounts typically are not changed often. However, there are circumstances in which the password for the SharePoint 2013 farm account must be changed.
  • One example of such a circumstance is if we suspect that the farm account has been compromised by a malicious user.
  • Another example is when consultants, such as myself, are brought in to deploy new SharePoint 2013 environments. Once that deployment process is complete and the client is happy with the environment, rightfully so, the client typically wants to take complete control of the environment and restrict farm admin level access to only a small set of internal employees - essentially they want to prevent the consultants that deployed the environment from continuing to have farm administrative level access.

Changing the SharePoint 2013 farm account password is a manual process.  It's not something that is done often, so people often aren't sure which steps are required to ensure that it has been changed in all required locations.  Always be sure to test this process in a TEST SharePoint 2013 environment and monitor that environment for a period of time before performing this process in a PRODUCTION environment.  Your SharePoint 2013 farm may be configured differently than other standard configurations and your process may require extra steps.


For a standard SharePoint 2013 farm, the following are the steps required for changing the SharePoint 2013 farm account password:


1. Navigate to SharePoint 2013 Central Administration interface, click Security in the left hand menu, and click ‘Configure Managed Accounts’.  Select the farm administrators account in the account list shown, click the Edit icon and change the password.

I recently found that if the Central Administration application is running on a server in your farm which is not hosting the Distributed Cache service, this step will fail.  Luckily it fails early in the process and you get a "Sorry, something went wrong" message with a correlation ID.  If you look into the ULS logs you'll find an Unexpected Error for the Distributed Cache saying that the SPDistributedCacheServiceInstance is not valid.  To resolve this, you can do the following:

  • Launch a SharePoint Command Shell window as an administrator
  • Run the following PowerShell command to temporarily enable the Distributed Cache service on this server:  Add-SPDistributedCacheServiceInstance.  The command should take a few seconds to run and complete successfully without any feedback at the command prompt.  Once the overall process below is complete, you'll remove the Distributed Cache service from this server.
  • Repeat step 1.  This step should now complete successfully.  Leave the SharePoint management console running.
  • You may find that this step stops the User Profile Synchronization Service on the server on which it is running.  At this point, check whether this service is still running in the 'Manage Services on Server' page and if not, start it now on the same server on which it was previously running.
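
For reference, a minimal sketch of that temporary Distributed Cache workaround, assuming you run it from an elevated SharePoint 2013 Management Shell on the Central Administration server:

# Temporarily add a Distributed Cache service instance to this server (it gets removed again
# after step 3 with Remove-SPDistributedCacheServiceInstance).
Add-SPDistributedCacheServiceInstance

# Optional: confirm the Distributed Cache instance is now online on this server.
Get-SPServiceInstance | Where-Object { $_.TypeName -like "*Distributed Cache*" }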


2. Manually change the User Profile Synchronization Service password.  As required by SharePoint, this service runs under the farm administrator account; however, SharePoint 2013 does not treat the account as managed for this service, so its password must be changed manually.
  • The farm administrator account must be made a local administrator on the server hosting the user profile service during the password change. 
  • Once that step is complete, launch SharePoint Central Admin, navigate to System Settings and click ‘Manage Services on Server’.  This page is used to start and stop services on each machine in the farm.  Select the machine hosting the user profile service and find that service.  It should say started. 
  • Stop the service.
  • Start the service again – when starting the service you’ll be asked for the new password.
  • Monitor the user profile service to ensure that it starts correctly.
  • Once started, you may remove the farm administrator account as a local administrator.  However, we often recommend leaving it as a local admin on the server for simplicity of making such changes in the future.


3. Check whether any applications in the Secure Store service use the farm administrator account, and if they do, change the password there.
  • Launch SharePoint Central Admin, click Application Management in the left hand menu, click Manage Service Applications, click the Secure Store Application and click Manage Target Applications.
  • Select a single Target Application from the list.
  • In the Credentials group on the ribbon, click Set. This opens the Set Credentials for Secure Store Target Application dialog box.  If any target application uses the farm administrator account, change the password here.
  • Repeat this process for all secure store applications.
  • Note: Be cautious when entering the password. If a password is entered incorrectly, no message will be displayed about the error. Instead, you'll be able to continue with configuration. However, errors can occur later, when you attempt to access data through the BCS.  If the password for the external data source is updated, you have to return to this page to manually update the password credentials.

At this point, if you added the Distributed Cache service as part of step 1, you should now remove it.  In the SharePoint management console we previously opened, run the following command:  Remove-SPDistributedCacheServiceInstance.

4. Reboot all the servers in the SharePoint farm, except for SQL server.  SQL Server does not need to be restarted.


Please let me know if you have any questions or comments about this process.  There may be other services that have been configured with the farm administrator account, so your process may vary somewhat, but typically the farm administrator account is reserved for specific purposes.  As a best practice, due to its high level of access, the farm administrator account should not be used widely other than for the purposes for which it was designed.

   -Antonio