Follow me on Twitter @AntonioMaio2

Thursday, May 19, 2016

Overcoming Threats and Vulnerabilities in Your SharePoint Environments

Thank you to everyone who came out to the Atlanta SharePoint User Group meeting on May 16th!  We had a great turnout and it was really nice to talk with everyone.

You can find my presentation here:



During the presentation I did a demonstration of the DLP capabilities within Office 365 SharePoint Online, and I discussed the DLP capabilities within SharePoint 2016 server.

We saw SharePoint Online DLP policies applied to documents containing sensitive data (credit card numbers in this case), providing policy tips for some documents and blocking access to others.  Policy tips were applied to documents containing between 1 and 4 credit card numbers, and access was blocked to documents containing 5 or more credit card numbers.  This worked very well - as discussed, in my tests it took between 15 and 30 minutes for SharePoint Online policies to be applied to new documents uploaded to a small library. For SharePoint Server 2016, the same test took approximately 14 hours to discover the sensitive content and apply the same DLP policies.  If you're wondering, I did have a default continuous crawl configuration in place for the RTM version of SharePoint Server 2016 during these tests.
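
To make that threshold behaviour concrete, here is a minimal Python sketch of the decision being made once the credit card numbers in a document have been counted. The thresholds mirror the demo policies described above; the function name and return values are mine for illustration and are not anything SharePoint exposes.

```python
def dlp_action(credit_card_matches: int) -> str:
    """Map the number of credit card numbers detected in a document to a DLP action.

    Illustrative only, not SharePoint's API: 1-4 matches show a policy tip,
    5 or more block access to the document.
    """
    if credit_card_matches >= 5:
        return "block-access"
    if credit_card_matches >= 1:
        return "policy-tip"
    return "no-action"

assert dlp_action(3) == "policy-tip"
assert dlp_action(5) == "block-access"
```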

A related question that came up was whether SharePoint Online DLP policies also apply to list items which contain sensitive data.  After running a couple of tests over the last few days, unfortunately the DLP policies are not applied to list items containing credit card numbers.  I will continue to run tests and report back here with any findings I have.

Although I didn't demo the new DLP policies within SharePoint Server 2016, you can find my presentation on DLP within SharePoint Server 2016 on my blog here:



...and you can find my webcast with a demonstration of DLP within SharePoint Server 2016 here:



Enjoy.
   -Antonio

Thursday, May 12, 2016

Don't Count Content Out of Your Security Audit:
ECM Must Be In!

ECM, or Enterprise Content Management, refers to the systems in our enterprise which store and manage corporate content. We often think of these systems as applications like SharePoint, Documentum or FileNet, but they can also include network file shares, NAS drives and custom internal web sites or applications. ECM systems can now exist on premise within data centers that we manage; through a cloud provider like Microsoft Office 365 or Amazon Web Services (AWS); or through a hybrid combination of both. They have grown within most organizations to store sensitive data and to represent critical infrastructure that employees rely on to accomplish day to day work.

We rely on cyber security audits to evaluate the safety of our corporate environments. Cyber security audits give us an indication of our security posture and identify areas of improvement for cyber defense. As part of an audit, we typically look at things like network security, firewall configuration, communication protocols and intrusion detection -- systems which protect us from external threats, email phishing, malware and URLs leading to malicious websites. A security audit certainly must include these functions; however, ECM systems are often overlooked due to the specific domain knowledge required to properly evaluate all of the systems which make up the corporate ECM. Given the criticality and often large quantity of data stored in our ECM, it's important to understand why that is and how we can leverage what we already know to include ECM systems in a cyber security audit.

An ECM is typically made up of multiple enterprise applications working together to efficiently store and provide access to content.  Together, they surface a robust set of capabilities that bring additional business value to the organization.

Microsoft SharePoint is a great example – it is a web application with a large set of built-in document management features, sitting on top of SQL Server for content storage, surfaced through IIS for web access, with responsive pages for mobile access, deployed to a farm of servers with firewalls, proxies or a combination of both. It can be connected to other systems for authentication, retrieving business data and integration with reporting or business intelligence tools. It integrates with Active Directory for identity management, people search and user profiles, which surface presence data and user attributes. Custom solutions can also be deployed to SharePoint to fulfill specific business needs through the robust APIs it makes available. It provides a forms and workflow engine, allowing organizations to gain efficiency through business process automation. It can be configured with an enterprise class search farm to efficiently index content and provide lightning fast search. Search can include content within SharePoint and outside of SharePoint, like file shares. These features may be used to provide an intranet for collaboration, an extranet to interact with partners, a public facing web site or any combination of these. Finally, enterprises typically don't have just one such SharePoint environment – you often see development, staging and production environments, along with a separate environment for disaster recovery.

This is really just a small sample of the capabilities provided by SharePoint, but it represents what general ECM systems look like and what we hope to get out of them. With this in mind, and considering all of the systems involved in providing such a robust set of services, a security audit can seem daunting.

An ECM security audit does require some domain-specific knowledge of many of these systems; however, we find that the security review process often comes down to many of the same questions or areas of investigation that are used in reviewing other systems, such as:
  • Can we identify all repositories that store enterprise content?
  • Do we know what types of data are sensitive and do we know where it resides? Do we need to scan repositories for data that is sensitive from a compliance or risk perspective? Essentially, this is data that puts the organization at risk should a data breach occur, either inadvertently or maliciously. This can be data such as PII, PCI, PHI, MNPI (material non-public information), CPNI (customer proprietary network information), etc.
  • Are data owners for all repositories defined, in particular those storing sensitive data? Are data owner responsibilities clearly defined and are data owners aware of those responsibilities? Do those responsibilities include approving and denying requests for access?
  • Does the organization have record retention policies and schedules? Does the organization have a classification policy? Are information handling and acceptable use policies clearly defined for each type of sensitive data, and are end users educated about these policies on a regular basis? Is it clear to end users when they are working with sensitive data and how to handle it? Are these policies enforced or automated?
  • Is the process for requesting access to data clearly defined and are end users aware of the process they must use? Are all access requests logged? Are access reviews performed on a regular basis, in particular for privileged or administrative users?
  • Does the organization have an information governance plan and a governance committee? Does the governance plan cover specific practices related to the ECM system?
  • Do you have the right team in place to manage the ECM? Does the team have enough people and do they have the right skill sets or certifications? Your ECM administrative team needs the appropriate skills to manage it from strategic and tactical perspectives, from a security perspective and from the point of view of the business users.
  • Is an activity monitoring and reporting system in place? Do those systems interact appropriately with the various components making up the ECM environment?
  • Are the servers making up the corporate ECM environment security hardened?
These questions typically make up the core aspects of an ECM security audit. The only questions that require specific domain knowledge are the last one, and perhaps the second to last. All of the others apply to ECM environments regardless of the systems involved or integrated. These questions apply to many different corporate systems -- they provide us with insight into which data is sensitive to the business, where it resides, who is responsible, how access is controlled, how policies are enforced, and finally how the system is secured and monitored.

Due to the criticality of the data stored within ECMs, and the fact that a majority of employees in the enterprise typically access and rely on the ECM to accomplish daily work, including the corporate ECM in a cyber security audit is not only recommended but required. As well, we can often leverage what we already know to ask the right questions, helping us determine the security posture of our environment and where improvements may be necessary.

   -Antonio

Tuesday, May 10, 2016

Data Loss Prevention in SharePoint Server 2016

Thank you to everyone who attended my webinar on April 28th on Data Loss Prevention in SharePoint 2016.  We had a great turnout.  As I said in the webcast, Microsoft is bringing great features from SharePoint Online, like the Compliance Center, to the on premise world with SharePoint Server 2016.  This is an excellent advancement for our on premise SharePoint deployments.  My webinar was a review, or tour, of the Data Loss Prevention features we now have for our on premise deployments with SharePoint 2016.

Introduction

Today we're constantly hearing about or experiencing threats to our business, and in particular to our business data.  To protect the business and its reputation, or to comply with business standards or industry regulations, organizations need to protect their sensitive corporate data and put in place measures to prevent its disclosure.  Sometimes that disclosure is inadvertent or accidental; sometimes it's intentional or malicious.  In either case, a data loss prevention (DLP) solution is one of the necessary solutions to this problem.  SharePoint 2016 now includes a robust Data Loss Prevention capability that can help us protect that data.

More specifically, when we talk about data loss prevention we're talking about automated systems which scan our data for keywords, regular expressions and patterns, looking for specific types of data and then either reporting on or enforcing policies on that data (a minimal pattern-matching sketch follows the list below).  This covers many different types of data, including:
  • Personally Identifiable Information (PII): passport numbers, social security numbers, tax identification numbers or even driver's licenses
  • Payment Card Industry data (PCI, as defined by PCI DSS): credit card numbers
  • Financial Data: debit card numbers, bank account numbers, SWIFT codes or routing numbers
  • Health Insurance Data: medical record numbers, policy numbers, patient information
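
To give a rough idea of what that pattern matching looks like, here is a small Python sketch that scans text for two of these data types using regular expressions, with a Luhn checksum to reduce credit card false positives. These patterns are purely illustrative; the sensitive information types that ship with Office 365 and SharePoint 2016 use much richer patterns, keyword evidence and confidence levels.

```python
import re

# Illustrative patterns only; real DLP sensitive information types are far richer.
PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){12,18}\d\b"),
    "us_ssn":      re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def luhn_valid(number: str) -> bool:
    """Validate a candidate card number with the Luhn checksum to cut false positives."""
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = sum(d if i % 2 == 0 else (d * 2 - 9 if d * 2 > 9 else d * 2)
                for i, d in enumerate(digits))
    return total % 10 == 0

def scan(text: str) -> dict:
    """Return counts of each sensitive data type detected in the text."""
    cards = [m.group() for m in PATTERNS["credit_card"].finditer(text)
             if luhn_valid(m.group())]
    return {
        "credit_card": len(cards),
        "us_ssn": len(PATTERNS["us_ssn"].findall(text)),
    }

print(scan("Card: 4111 1111 1111 1111, SSN: 123-45-6789"))
# {'credit_card': 1, 'us_ssn': 1}
```
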
If we look back, data loss prevention has been a long-time feature of Office 365.  However, until recently it was only available in Exchange Online, with policy tips and policy enforcement.  In mid-2015, Microsoft announced that the Office 365 Compliance Center would include DLP for SharePoint Online and OneDrive for Business, and that solution was released to the various Office 365 tenants through the last part of 2015 and into early 2016.  Now, with the release of SharePoint Server 2016, we also have this capability for our SharePoint on premise deployments.

Prerequisites

You need the following prerequisites in place before configuring the SharePoint 2016 DLP within the new Compliance Center:

  • Create a Search Service Application (mandatory)
    • Start the search service, define a crawl schedule and perform a full crawl
    • You must have a healthy search index and crawl for DLP policies to be effectively applied
  • Configure outgoing email (recommended)
  • Turn on Usage Reports (recommended)
  • Create the eDiscovery or Compliance Center site collections (mandatory)
    • eDiscovery Center - for DLP queries to identify where sensitive data exists
    • Compliance Policy Center - for DLP policies to monitor and enforce policies
    • One or the other is mandatory - you can create both, but both are not required
  • Grant the organization's Compliance Team permissions to access the eDiscovery and/or Compliance Center site collection(s) through the Site Collection Members group

Issue Follow-up: DLP Policy Enforcement Time & List Items

Some of you will remember that there was an issue with the enforcement of the DLP policies in the library I was demoing during the presentation.  I was showing a library with a document that contained 5 credit card numbers, and the blocking policy was enforced on that document.  However, there were 5 other documents with 1 credit card number each on which the 'policy tip' policy (the monitoring and warning policy) was not being enforced - even after waiting 12 hours.

Well, after waiting 14 hours the 'policy tip' policy finally got applied to those 5 additional documents.  Considering those documents contained a credit card number, and my understanding is that the policy templates which check for credit cards are treated as high priority internally, I would have expected them to be enforced sooner.

Please note: I was manually launching full crawls, and the timer jobs which are supposed to enforce DLP policies, over and over, to no avail.

What I have read, that it can take up to 24 hours for DLP policies to be enforced, is evidently true.  If I had thousands of users, with millions of documents, and users constantly adding or removing sensitive information in documents, I could understand the 24 hour wait time.  But I added 5 documents to 1 library, in a farm of 20 documents total, with only 1 user using the farm - in this case I would have expected faster performance.  Enforcing DLP policies is often time sensitive, and in some situations waiting 24 hours can mean that sensitive information is already exposed.

As well, one of the tests I was running in my farm was whether DLP policies get enforced on list items.  Even after waiting 5+ days, and ensuring that full crawls and all associated timer jobs ran successfully, the DLP policies were not applied to the list items containing sensitive information (credit card numbers) in this list:


Presentation and Recording

The presentation slide deck can be found here:



The recording of last Thursday's webinar can be found here:


Final Thoughts

Data loss prevention is just one critical part of securing our sensitive data.  This includes identifying sensitive data, monitoring its usage and enforcing policies which control its use and disclosure.

  • The fact that SharePoint 2016's DLP uses the search index is both a blessing and a potential source of issues.  Traditional SharePoint DLP systems have had challenges accessing and scanning content from outside of SharePoint, and using the search index allows the DLP capability to operate as quickly and efficiently as possible.
  • If something is not in the search index, it will not be found by your DLP policies.  So the freshness of your search index and the health of your search crawls will affect how effectively your DLP policies are applied.


The DLP capability within SharePoint 2016 is a great start!  However, I would like to see it evolve, particularly in the following areas:

  • Apply DLP policies to list items, in addition to documents.  I have seen several incidents in the field where clients store sensitive information in list item metadata columns.
  • More policy templates
    • Exchange Online has 80 templates and SharePoint Online has 51; SharePoint 2016 on premise has only 10.
    • In particular, health-related policy templates looking for HIPAA and other medical sensitive information.
  • Customizable sensitive data types - you can do this in many other DLP systems now.
  • Today policies are location based (which site collection the content resides in) and condition based (what sensitive data types are included and the number of instances).  I would like to see event based policies included as well - like enforcing policies upon upload, upon adding an item, upon deleting an item, upon editing an item, etc.
  • More actions available for policies, like encrypting content with IRM upon download.
  • One compliance center for all site collections, including OneDrive for Business (MySites), across all web applications.  We can do this today with the eDiscovery Center in SharePoint 2016, and in SharePoint Online with OneDrive for Business in Office 365 - why can we not do this in the SharePoint 2016 Compliance Center?
  • More control over when policies are run - the ability to say "Evaluate all policies now" as opposed to running timer jobs, guessing which is the right timer job to run and getting no results when you run them all.
  • Document matching capability - allow us to specify a library of documents which are compared to documents in the system as part of a DLP policy, along with a required percentage match for each document for a policy to fire.  This would help us prevent disclosure of executive communications, intellectual property, documents that follow a common format, etc. (see the sketch after this list).
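
As a rough illustration of the document matching idea, a similarity-based rule could look something like the Python sketch below, using the standard library's difflib. The names, threshold and sample text are all hypothetical; this is a stand-in for the concept, not how Microsoft would implement it.

```python
from difflib import SequenceMatcher

def matches_protected_template(candidate_text: str, template_texts: list[str],
                               threshold: float = 0.8) -> bool:
    """Return True if the candidate document is at least `threshold` similar
    to any document in the protected library - a rough stand-in for the
    document-matching DLP feature described above."""
    return any(
        SequenceMatcher(None, template, candidate_text).ratio() >= threshold
        for template in template_texts
    )

# Hypothetical protected document text.
templates = ["CONFIDENTIAL - Q3 earnings pre-announcement draft ..."]
print(matches_protected_template(
    "CONFIDENTIAL - Q3 earnings pre-announcement draft v2 ...", templates))  # True
```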

Finally, I encourage everyone to start learning about and testing the SharePoint 2016 eDiscovery DLP queries and the Compliance Center's DLP policies.  You do need to test DLP policies in a TEST or STAGING farm (or a TEST site collection if you don't have a separate farm) before deploying them to a PRODUCTION environment, because they often need tweaking and you don't want to do that while production users are trying to access data they need.  And one more time... ensure you have a healthy search index and crawl: if something isn't in the search index, it will not be found by the DLP policies you put in place.

Enjoy.
   -Antonio

Wednesday, May 4, 2016

The Future of SharePoint Security and Governance

May the 4th Be With You

Today is typically set aside to celebrate Star Wars movies and culture that many of us have enjoyed for years.  I'm hoping to take in one of the Star Wars movies with my family this evening, once the kids finish their homework of course.

In other important news...
Today Microsoft announced the general availability of SharePoint Server 2016!
Microsoft also announced major new directions that we're going to see for SharePoint over the coming year!  People who read my ramblings know that I focus much of my work on security, so I'd like to share some of the security related capabilities that are included in Microsoft's roadmap for SharePoint.  Microsoft has reaffirmed its commitment to security, privacy and compliance with some significant new capabilities in this roadmap.

Dynamic Conditional Access Policies

One major new feature we're going to see is dynamic conditional access policies that administrators can define, allowing them to control the content that users can access based on the user's identity, the application or device they're using, and their network location.

For example, administrators will be able to prevent users from accessing high-security files in SharePoint from a mobile device or home network that the organization doesn't control, while still allowing them to access those files from a corporate laptop.
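
Microsoft hasn't published the exact policy model yet, but conceptually a conditional access rule combines those three signals. Here's a hypothetical Python sketch of that kind of evaluation; the network ranges, labels and the rule itself are assumptions for illustration only.

```python
from dataclasses import dataclass
from ipaddress import ip_address, ip_network

# Hypothetical corporate network ranges.
CORPORATE_NETWORKS = [ip_network("10.0.0.0/8"), ip_network("192.168.1.0/24")]

@dataclass
class AccessRequest:
    user: str
    device_managed: bool   # is the device enrolled/managed by the organization?
    client_ip: str
    content_label: str     # e.g. "general" or "high-security"

def allow_access(req: AccessRequest) -> bool:
    """Hypothetical conditional-access rule: high-security content is only
    reachable from a managed device on a corporate network."""
    if req.content_label != "high-security":
        return True
    on_corp_network = any(ip_address(req.client_ip) in net for net in CORPORATE_NETWORKS)
    return req.device_managed and on_corp_network

print(allow_access(AccessRequest("alice", device_managed=False,
                                 client_ip="203.0.113.7",
                                 content_label="high-security")))  # False
```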

Microsoft Windows Server 2012 has had a similar capability for years with Dynamic Access Control (DAC), where you can define policies based on attributes in a user's identity (e.g. security clearance) and metadata associated with a document (e.g. its classification) and have those policies automatically enforced on Windows file servers.  It typically required use of the Windows File Classification Infrastructure (FCI).  As well, some third party security tools have layered these types of security policies on top of on premise SharePoint deployments in the past.

From a security perspective, it will be fantastic to see this or a similar capability finally making its way to SharePoint.

Site Classification

In an update later this year, customers will be able to classify SharePoint sites so that security policies are scoped and enforced on all content in the site. When creating a new site, whether a team site or a publishing site, you'll be able to select the classification of the site. A site's classification is typically related to the sensitivity of the content you plan to store or present within the site, and it will be displayed right below the site's name by default. This will really help users understand when they are accessing sites with sensitive corporate data.

This feature sounds simple, but it's extremely significant because it allows customers to identify where sensitive data exists in their environment. Identifying where sensitive data lives is traditionally the first battle you fight when trying to protect your sensitive corporate data. This is a great advancement in improving the governance of our SharePoint environments.

Hybrid SharePoint Insights - Hybrid Activity Monitoring and Reporting

Between the fall of 2015 and early 2016, Microsoft released the activity monitoring and reporting features within SharePoint Online and OneDrive for Business. This is a great capability for either monitoring user activity within your tenant or performing forensic analysis of data breaches. I wrote an article about this capability here: Securing Office 365 with Activity Monitoring.

By the end of 2016, Microsoft will release a preview of Hybrid SharePoint Insights, which will aggregate data from both your on premise SharePoint 2016 environment and your SharePoint Online/OneDrive for Business tenant. This will allow you to monitor and report on user activity from both your on premise and Office 365 environments through one easy to use interface.

Bring Your Own Encryption Keys

We've heard over the last year about how Microsoft encrypts all content stored within SharePoint Online and OneDrive for Business with a complex system that partitions data, uniquely encrypts each partition with a different key, randomly distributes and stores those encrypted partitions in Azure Storage Blobs, encrypts the keys themselves and stores those in a master key store, and rotates all keys every 24 hours.  I've written about this myself here:  How Does Microsoft Protect Our Data in Office 365.  This is already happening in Office 365 and it's completely transparent to customers.
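
For readers curious about the underlying pattern, this is essentially envelope encryption: each partition gets its own data key, and the data keys are themselves encrypted (wrapped) with a master key, which is what key rotation and BYOK operate on. Below is a minimal Python sketch of that idea using the third-party cryptography package; it only illustrates the concept and is not Microsoft's actual implementation.

```python
from cryptography.fernet import Fernet

# Master key: in a BYOK scenario this is the key the customer controls.
master_key = Fernet.generate_key()
master = Fernet(master_key)

def encrypt_partition(partition_bytes: bytes) -> tuple[bytes, bytes]:
    """Encrypt one content partition with its own data key, then wrap (encrypt)
    that data key with the master key. Returns (ciphertext, wrapped_key)."""
    data_key = Fernet.generate_key()
    ciphertext = Fernet(data_key).encrypt(partition_bytes)
    wrapped_key = master.encrypt(data_key)
    return ciphertext, wrapped_key

def decrypt_partition(ciphertext: bytes, wrapped_key: bytes) -> bytes:
    """Unwrap the data key with the master key, then decrypt the partition."""
    data_key = master.decrypt(wrapped_key)
    return Fernet(data_key).decrypt(ciphertext)

ct, wk = encrypt_partition(b"chunk of a SharePoint document")
assert decrypt_partition(ct, wk) == b"chunk of a SharePoint document"
```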

What's new is that later this year customers will be able to bring their own encryption keys to further lock down their data, preventing even the Microsoft technical staff running the Office 365 service from accessing it.  These efforts continue to help protect our data and our privacy.

Data Loss Prevention Improvements

I recently gave a webinar on the new SharePoint 2016 data loss prevention feature, which can be found here: Data Loss Prevention in SharePoint 2016. Later this year you will be able to apply data loss prevention policies down to the site level; today you can only apply those policies at the site collection level. This will give us more granular control over where, and to which content, data loss prevention policies are applied.

External Sharing Improvements

In Office 365, you can now whitelist and blacklist specific domains for external sharing. As well, later this year we'll be able to set an expiry period for external sharing, so content is only shared externally for a specific period of time.
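
Conceptually, those two controls amount to a domain allow/deny check plus an expiry check at access time. Here's a small Python sketch of that logic; the domains and the 30-day expiry period are hypothetical, and this is not the Office 365 API.

```python
from datetime import datetime, timedelta, timezone

ALLOWED_DOMAINS = {"partner.example.com"}      # hypothetical whitelist
BLOCKED_DOMAINS = {"competitor.example.net"}   # hypothetical blacklist
SHARING_EXPIRY = timedelta(days=30)            # hypothetical expiry period

def sharing_allowed(recipient_email: str, shared_on: datetime) -> bool:
    """Apply the two controls described above: domain allow/deny lists
    and an expiry period on external shares."""
    domain = recipient_email.rsplit("@", 1)[-1].lower()
    if domain in BLOCKED_DOMAINS:
        return False
    if ALLOWED_DOMAINS and domain not in ALLOWED_DOMAINS:
        return False
    return datetime.now(timezone.utc) - shared_on <= SHARING_EXPIRY

print(sharing_allowed("bob@partner.example.com",
                      datetime.now(timezone.utc) - timedelta(days=10)))  # True
```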

Other Exciting Additions - New Mobile Experience, Team Sites...

There are many other welcome additions planned over the coming year, including a new SharePoint mobile app experience allowing users to easily access news from across the company, the sites they use and access most, quick links to important pages and a list of the coworkers they collaborate with most.  This new app apparently uses Microsoft's investments in machine learning and the Office Graph to help surface the most relevant content and people for you, and present that ahead of less relevant information.  This sounds a lot like a mobile version of Delve, doesn't it?!  The new mobile app will be available towards the end of June 2016 for iOS, with Android and Windows versions coming later this year.  The OneDrive mobile app will also be getting enhancements through machine learning to provide users with suggestions of useful content through both OneDrive for Business and SharePoint.

As well, team sites will get a new home page which gives users a quick look at the team sites they are part of, along with updates that have recently been made to those sites.  The idea is that users can more quickly get to the work and sites that are most relevant to them at that moment.

There are tons of other exciting updates... within Office 365, we hear that SharePoint team sites will be coming together with Office 365 Groups as well - whenever a new Office 365 group is created, a new team site will be created as well. As a result, you'll be able to share team sites within Office 365 groups with external users through the Office 365 external sharing feature. This is a nice addition, but it can also create security issues if you don't have appropriate governance in place.

We will likely see these updates come out through Microsoft's new SharePoint 2016 Feature Packs planned over the next year.

May the 4th be with you!
   -Antonio