Follow me on Twitter @AntonioMaio2

Wednesday, February 7, 2018

Step by Step: How to Fine Tune Sensitive Data Types in Office 365

Office 365 provides the Data Loss Prevention (DLP) feature, which allows you to automatically identify sensitive data across workloads in Office 365. This means that you can have one set of policies that applies to SharePoint Online sites, OneDrive for Business sites and Exchange email, or different policies that apply to each workload.

Office 365 DLP policies currently support many sensitive data types, which represent the kinds of information that an individual or business may want to protect: US social security numbers, various European passport or identity card numbers, credit card numbers and so on. At the time this post was written, 82 sensitive data types were supported. In many cases these sensitive data types include a regular expression that is matched, an extensive list of keywords that are searched for within a certain proximity to the sensitive data, and a checksum that is calculated (for example, running the Luhn algorithm on a suspected credit card number). An inventory of the sensitive data types supported, along with exactly what each looks for, can be found in Microsoft's documentation.
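For illustration, the Luhn checksum mentioned above can be sketched in a few lines of PowerShell (the function name and implementation are my own sketch, not the DLP engine's internals):

```powershell
# A sketch of a Luhn checksum validation, the kind of check DLP runs on candidate card numbers
function Test-LuhnChecksum {
    param([string]$Number)
    $digits = $Number -replace '\D', ''        # strip spaces, dashes and other separators
    $sum = 0
    $double = $false
    for ($i = $digits.Length - 1; $i -ge 0; $i--) {
        $d = [int][string]$digits[$i]
        if ($double) { $d *= 2; if ($d -gt 9) { $d -= 9 } }   # double every second digit from the right
        $sum += $d
        $double = -not $double
    }
    return ($sum % 10) -eq 0
}

Test-LuhnChecksum "4111 1111 1111 1111"   # a well-known test number that passes the check
```

A number that matches the 16-digit pattern but fails this checksum would not be reported as a match, which is how the checksum cuts down on false positives.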


You are also able to modify existing sensitive data types or create custom sensitive data types, which are used not only by Office 365 DLP but also by features like Office 365 Labels and Office 365 Advanced Data Governance. You may want to create a custom sensitive data type if your organization has a custom piece of data that follows a particular well-defined pattern and that you need to look for within documents or emails; a custom employee number format is one example. Another case is if you live in a country whose identity card number or driver's license format is not represented by the built-in Office 365 sensitive data types.

There are a few sites and articles out there that share steps on how to do this, but I have found many of them to be incomplete. Recently we had to create some custom sensitive data types for a GDPR project, and I wanted to share my experience creating custom sensitive data types.

In our walk-through, the example of a custom sensitive data type I'm going to use is EU Debit Card Number. We're going to envision a scenario where we're looking for this type of sensitive data as part of a GDPR project, and we're getting enough false positives that we want to make the matching more accurate by introducing new keywords, adjusting the proximity parameter and modifying the confidence level.

Most Practical Approach

The most practical approach when creating or customizing a sensitive data type is to create a new sensitive data type based on an existing one, giving it a unique name and identifiers. For example, if you wish to adjust the parameters of the “EU Debit Card Number” sensitive data type, you could name your copy of that rule “EU Debit Card Enhanced” to distinguish it from the original. In your new sensitive data type, simply modify the values you wish to change to improve its accuracy. Once complete, you will upload your new sensitive data type and create a new DLP rule (or modify an existing one) to use the new sensitive data type you just added. Modifying the accuracy of sensitive data types could require some trial and error, so maintaining a copy of the original type allows you to fall back to it if required in the future.

Customize the Sensitive Data Type

The following is the detailed step by step process that is necessary to create a custom sensitive data type or modify an existing one.

1. Export the existing Rule Package of built in sensitive data types that are available in Office 365

a. At the PowerShell command prompt create a connection to Exchange Online:
$UserCredential = Get-Credential
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session

Note: At the start of this example we have to use the Exchange Online PowerShell module and at the end of the example we're using the newer Security and Compliance Center PowerShell module. This is intentional.

b. Export the current rule collection to an XML file:
$ruleCollection = Get-ClassificationRuleCollection
Set-Content -path "C:\exportedSensitiveTypes.xml" -Encoding Byte -Value $ruleCollection.SerializedClassificationRuleCollection
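Before editing, it's worth confirming the exported file round-trips cleanly. A quick sanity check (my own addition, not part of the original steps):

```powershell
# Confirm the exported rule collection parses as valid XML before you start editing it
try {
    [xml](Get-Content -Path "C:\exportedSensitiveTypes.xml") | Out-Null
    Write-Host "Export parsed as valid XML"
}
catch {
    Write-Host "Export is not valid XML: $_"
}
```

Running the same check against your edited copy before uploading it later will catch malformed XML early.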

c. Copy the XML file you just exported and give it a different file name – for example: MyNewDLPRule.xml. Keep the original exported rule xml file for reference while you’re constructing your new file.

2. Open the XML file you just copied/renamed in your favorite XML editor

3. You will need to generate three GUIDs and replace the existing ones in the sensitive data type that you are modifying. At the PowerShell command prompt, type the following and record the new GUID. Repeat until you have three GUIDs recorded.

New-Guid
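If you prefer to capture the three GUIDs in variables for later substitution into the XML, a small sketch (the variable names are just for illustration):

```powershell
# Generate and record the three GUIDs you will substitute into the XML
$rulePackGuid  = (New-Guid).Guid   # for the <RulePack> id
$publisherGuid = (New-Guid).Guid   # for the <Publisher> id
$entityGuid    = (New-Guid).Guid   # for the <Entity> id and its matching <Resource> idRef
$rulePackGuid; $publisherGuid; $entityGuid
```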

4. The XML file is large, so for simplicity we’re going to start with an existing sensitive data type and isolate it in our new rule xml file by removing all other sensitive data types. In this case, we’re going to modify the “EU Debit Card Number” sensitive data type.

Note: the following details are important when modifying the rule’s XML structure.

a. The <RulePack> element of the file contains information about the publisher, in this case Microsoft. It also contains localized versions of all the publisher strings. The <RulePack> definition contains an id property, which is the first place where you’ll need to replace the existing GUID with a new one you just generated.

b. The <Rules> element contains the definition of your sensitive data type.

c. The <Rules> element is made up of the <Entity> element, which defines the pattern to use; <Keyword> elements, which define the lists of keywords to match; <Regex> elements, which define the regular expressions used in the pattern; and a <LocalizedStrings> element, which defines the default and localized strings displayed in the DLP rule UI.

d. The <Entity> element contains an id property which is the second place where you’ll need to replace the existing GUID with a new one you just generated. This GUID is referenced further down within the <LocalizedStrings> element, as the idRef of a <Resource> element. This <Resource> element identifies both the default and localized strings to use when displaying your sensitive data type in the DLP UI. If you are deleting <Entity> and <Resource> elements in order to simplify the XML file you are working with, ensure that you do not delete the <Resource> element that your sensitive data type’s <Entity> element references by that GUID.

e. The <Entity> element also contains several <Match> elements, which reference by name the keyword lists and functions used by that sensitive data type. These appear as follows:

<Any minMatches="1">
  <Match idRef="Keyword_eu_debit_card" />
  <Match idRef="Keyword_card_terms_dict" />
  <Match idRef="Keyword_card_security_terms_dict" />
  <Match idRef="Keyword_card_expiration_terms_dict" />
  <Match idRef="Func_expiration_date" />
</Any>

f. The keyword list named "Keyword_card_terms_dict" is defined further down in the XML file within a <Keyword> element. In this case, the <Keyword> element contains a <Group> element and multiple <Term> values, each of which defines a keyword that Office 365 DLP uses to identify sensitive information. Keyword lists are often referred to as “corroborative evidence” when defining DLP rules. Keyword lists defined within the XML structure can be used by multiple sensitive data types; for example, the Credit Card Number and EU Debit Card Number sensitive data types share some of the same keyword lists. The same is true for <Regex> definitions. Keyword lists defined in this file can be modified to add additional, more specific keywords or to remove keywords. Once again, if you are deleting <Keyword> lists throughout the original XML file, ensure that you do not delete a <Keyword> element that is referenced within your sensitive data type’s <Entity> element.

g. The <Entity> element contains three element types used to define a pattern that the sensitive data type must match: <IdMatch>, <Match> and <Any>. To promote reusability of definitions across multiple patterns, the <IdMatch> and <Match> elements do not define the details of the content to be matched; instead they reference it through the idRef attribute.
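Putting 4a through 4g together, the overall shape of an <Entity> definition looks roughly like this (the idRef values here are abbreviated placeholders for illustration, not the exact built-in names):

```xml
<Entity id="..." patternsProximity="300" recommendedConfidence="85">
  <Pattern confidenceLevel="85">
    <!-- The primary pattern the engine must find, referenced by idRef -->
    <IdMatch idRef="Func_eu_debit_card" />
    <!-- Corroborative evidence: at least one of these must appear within the proximity window -->
    <Any minMatches="1">
      <Match idRef="Keyword_eu_debit_card" />
      <Match idRef="Func_expiration_date" />
    </Any>
  </Pattern>
</Entity>
```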

5. Now that we have our sensitive data type XML file ready to edit, we’ll make some basic modifications in order to identify our sensitive data type as a new unique type when configuring a DLP rule.

a. Modify the id property of the <RulePack> element. This id should be replaced with the first of the new GUIDs we created.
<RulePack id="bd2568b4-b331-4387-b399-7e46065f6994">

b. Within the <RulePack> element, find the <Publisher> element and replace its id property with the second of the new GUIDs we created.
<Publisher id="ac9a7b29-870f-4810-a96f-6b4080c67e5d" />

c. Modify the id property of the <Entity> element which represents our sensitive data type. This id should be replaced with third of the new GUIDs we created.
<Entity id="48da7072-821e-4804-9fab-72ffb48f6f78" patternsProximity="300" recommendedConfidence="85">

d. Within the <RulePack><Details><LocalizedDetails> element, find the <PublisherName>, <Name> and <Description> elements. Modify the value of these elements to unique values.
<Details defaultLangCode="en">
  <LocalizedDetails langcode="en">
    <PublisherName>Contoso</PublisherName>
    <Name>Contoso Rule Package</Name>
    <Description>Defines the set of classification rules for Contoso</Description>
  </LocalizedDetails>
</Details>

e. Within the <LocalizedStrings> element, find the <Resource> element whose idRef matched the original <Entity> element’s id, and modify that idRef to match the new GUID we assigned to the id of the <Entity> element.

f. Within the <LocalizedStrings><Resource> element we just modified, find the <Name> and <Description> elements and modify their values so that our sensitive data type has a unique name and description. This will help us better select the correct sensitive data type when we configure a DLP rule.
<Resource idRef="48da7072-821e-4804-9fab-72ffb48f6f78">
   <Name default="true" langcode="en-us">EU Debit Card Number Enhanced</Name>
   <Description default="true" langcode="en-us">Detects European Union debit card number with enhanced accuracy.</Description>
</Resource>

g. If you start with an existing sensitive data type whose definition contains existing keyword lists, it is important to note that some keyword lists are in use by Microsoft’s built-in sensitive data types, and the names of those lists may not be reused in other sensitive data type definitions. In the case of the “EU Debit Card Number” example, it makes use of the following Microsoft keyword lists:
<Keyword id="Keyword_card_expiration_terms_dict">
<Keyword id="Keyword_card_security_terms_dict">
<Keyword id="Keyword_card_terms_dict">

Those lists are currently in use by multiple built-in sensitive data types, and we are not permitted to reuse their names. Therefore, we must modify those names in the <Keyword> elements so that they are unique, as follows:
<Keyword id="Keyword_card_expiration_terms_dict_enhanced">
<Keyword id="Keyword_card_security_terms_dict_enhanced">
<Keyword id="Keyword_card_terms_dict_enhanced">

Those names must also be modified within the <Entity> element where they are referenced:
<Entity id="48da7072-821e-4804-9fab-72ffb48f6f78" patternsProximity="300" recommendedConfidence="85">
      <Pattern confidenceLevel="85">
        …
        <Any minMatches="1">
          …
          <Match idRef="Keyword_card_terms_dict_enhanced" />
          <Match idRef="Keyword_card_security_terms_dict_enhanced" />
          <Match idRef="Keyword_card_expiration_terms_dict_enhanced" />
          …
        </Any>
      </Pattern>
</Entity>

Fine Tune a Sensitive Data Type to Avoid False Positives or Look for Organization-Specific Information


Now we’ll make some modifications to our custom sensitive data type in an attempt to improve its accuracy. Improving the accuracy of DLP rules in any system requires testing against a sample data set, and may require fine tuning through repeated modifications and tests. In our example, an EU Debit Card Number is strictly defined as 16 digits matching a complex pattern and subject to checksum validation. We cannot alter this pattern due to the strict definition of this sensitive data type. However, we can make the following adjustments to improve the accuracy with which Office 365 DLP finds this sensitive data type in our content:

1. Proximity Modifications
We can modify the character pattern proximity to expand or shrink the window in which keywords must be found around the sensitive data type. In our case we’ll shrink the window by modifying the patternsProximity value in our <Entity> element from 300 to 150 characters. This means that our corroborative evidence (our keywords) must be closer to our sensitive data type in order to signal a match on this rule.
<Entity id="48da7072-821e-4804-9fab-72ffb48f6f78" patternsProximity="150" recommendedConfidence="85">

2. Keyword Modifications
We can add keywords to one of our <Keyword> elements in order to give our sensitive data type more specific corroborative evidence to search for. These could be organization-specific keywords or language-specific keywords. Alternatively, we might find that some keywords are causing false positives, in which case we may want to remove them. Keywords are added by navigating to the <Keyword> element for one of the three keyword lists provided with the “EU Debit Card Number” definition and adding additional <Term> elements with the keywords as values (one <Term> per additional keyword).
<Keyword id="Keyword_card_terms_dict_enhanced">
      <Group>
        <Term>corporate card</Term>
        <Term>organization card</Term>
        <Term>acct nbr</Term>
        <Term>acct num</Term>
        <Term>acct no</Term>
        …
     </Group>
</Keyword>

3. Confidence Modifications
We can modify the confidence with which the sensitive data type must match the criteria specified in its definition before a match is signaled and reported. This is done by modifying the confidenceLevel property on the <Entity><Pattern> element. The more evidence a pattern requires, the more confidence you have that an actual entity (such as an employee ID) has been identified when the pattern is matched. If you remove keywords from the definition, you would typically lower the confidence from its default level (85 in the case of the EU Debit Card Number type) to reflect the reduced evidence.
<Entity id="48da7072-821e-4804-9fab-72ffb48f6f78" patternsProximity="150" recommendedConfidence="85">
      <Pattern confidenceLevel="85">
      …
       </Pattern>
</Entity>

The following is a screenshot of our final sensitive data type definition file (with some elements collapsed):


Upload a New Sensitive Data Type

Now that we've defined our sensitive data type in the XML file structure, we are going to upload it to our Office 365 tenant.

1. At the PowerShell command prompt create a connection to the Office 365 Security & Compliance Center:
$UserCredential = Get-Credential
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.compliance.protection.outlook.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session

Note: At the start of this example we have to use the Exchange Online PowerShell module and at the end of the example we're using the newer Security and Compliance Center PowerShell module. This is intentional.

2. Create a new Classification Rule in Office 365 and upload your sensitive data type XML definition file:
New-DlpSensitiveInformationTypeRulePackage -FileData (Get-Content -Path "C:\EUDebitCardNumberEnhanced.xml" -Encoding Byte)
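Once uploaded, you can confirm the tenant sees the new type before wiring it into a DLP rule. A short check, run in the same Security & Compliance Center session:

```powershell
# List sensitive information types whose names contain "Enhanced"
Get-DlpSensitiveInformationType | Where-Object { $_.Name -like "*Enhanced*" }

# If you later revise the XML, update the existing package rather than creating a new one
Set-DlpSensitiveInformationTypeRulePackage -FileData (Get-Content -Path "C:\EUDebitCardNumberEnhanced.xml" -Encoding Byte)
```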

When the upload has completed successfully, the following output will appear in the PowerShell console:


3. Trigger a re-crawl of the content within the site collections that potentially contain the new sensitive data type.

4. Login to Office 365 as an administrator, navigate to the Security & Compliance Center, create a new Data Loss Prevention policy and select the new sensitive data type you just created.


Full Disclosure

You may find that some of this information is reprinted from a Microsoft article titled Office 365 Information Protection for GDPR. This is because I worked with Microsoft to help write and produce that content, and I wanted to re-use some of that material to highlight another scenario. Of course, please refer to both this article and the Microsoft content to get your custom sensitive data type working correctly.

More Information

For more information, refer to the following helpful articles:

Enjoy.
-Antonio

Tuesday, November 14, 2017

A Beginner's Guide to Administering Office 365 with PowerShell

With Office 365 PowerShell, you can manage Office 365 for your organization using commands and scripts that streamline your day to day work. Microsoft provides several easy to use admin centers to help manage Office 365. However, whether you’re an Office 365 administrator yourself or a service owner for Office 365 in your organization (working with other administrators), you’ll quickly find that you need to go beyond the capabilities that these admin centers provide. PowerShell can help you automate tasks so that they are easily repeatable, it can help you script management tasks so that they are automatically performed on a schedule and it can help you quickly output large amounts of data about your Office 365 environment. As well, some Office 365 settings are only manageable using PowerShell, with no UX provided. In this session, you’ll learn how to get started with Office 365 PowerShell and how to quickly become productive with it, making you more productive and empowered as you manage your Office 365 environment.
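As a taste of what the session covers, a first useful report is only a few lines (this sketch assumes the MSOnline module of that era is installed; the output path is just an example):

```powershell
# Connect to Office 365 and export a quick user/licensing report
Connect-MsolService
Get-MsolUser -All |
    Select-Object DisplayName, UserPrincipalName, IsLicensed |
    Export-Csv -Path "C:\O365Users.csv" -NoTypeInformation
```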

Thank you to those that attended my session today on this topic at SPTechCon DC! You were a great crowd with lots of great questions.

My updated slides can be found here:

Enjoy.
-Antonio

Monday, November 6, 2017

Set a SharePoint Online Managed Metadata (Taxonomy) Field Value on a Document or Item

Just last week I had an interesting little project I was helping a client with, where we needed to use PowerShell to set two metadata field values for a document in a SharePoint Online Document Library. I've done this before many times. However, the challenge here was that the metadata fields were managed metadata fields, configured as part of a corporate taxonomy using the Managed Metadata Service.

My preferred method to accomplish this would have been SharePoint PnP PowerShell module using Set-PnPTaxonomyFieldValue (reference: https://msdn.microsoft.com/en-us/pnp_powershell/setpnptaxonomyfieldvalue). Unfortunately, for various reasons that wasn't possible in my client's environment so we had to resort to writing the PowerShell script ourselves.
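For reference, here is roughly what the PnP route would have looked like, had it been available to us (cmdlet names are from the PnP module; the list, field and term names are the sample values used later in this post, and "Field1"/"Field2" stand in for the fields' internal names):

```powershell
# The PnP route: one cmdlet call per managed metadata field
Connect-PnPOnline -Url "https://mytenant.sharepoint.com/sites/mySiteCol/mySubsite"
$item = Get-PnPListItem -List "MyList" |
    Where-Object { $_["FileRef"] -eq "/sites/mySiteCol/mySubsite/MyList/MyDocument.docx" }

# TermPath format is "Term Group|Term Set|Term"
Set-PnPTaxonomyFieldValue -ListItem $item -InternalFieldName "Field1" -TermPath "My Term Group|Field 1|Value 1"
Set-PnPTaxonomyFieldValue -ListItem $item -InternalFieldName "Field2" -TermPath "My Term Group|Field 2|Value 2"
```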

As many of you know, you cannot set managed metadata field values the same way you would a regular metadata field. In searching the web for some guidance on which methods to use, I found that out of the blogs and documentation available, much of it was either incomplete or incorrect. So, with the help of a few kind folks on Twitter, namely Erwin Van Hunen (@erwinvanhunen) and Chris Kent (@thechriskent), I was able to work out a solution. I'd like to share here how that was accomplished so that others might benefit from Erwin's and Chris' assistance and from my experience. I'll try to be as complete as possible here, so that you have a full solution.

An important step is to download and install the latest SharePoint Online Client Components SDK. At the time of publishing, Microsoft had recently released a new version for September 2017, which can be downloaded from here: https://www.microsoft.com/en-ca/download/details.aspx?id=42038. This must be installed on the computer that will be running this script. Now onto our PowerShell code:

First, we'll set some basic variables:
$UserCredentials = Get-Credential
$webUrl = "https://mytenant.sharepoint.com/sites/mySiteCol/mySubsite/"
$listName = "MyList"
$itemToEdit = "/sites/mySiteCol/mySubsite/MyList/MyDocument.docx"
$termGroupName = "My Term Group"
$targetField1Name = "Field 1"
$targetField2Name = "Field 2"
$targetField1Value = "Value 1"
$targetField2Value = "Value 2"

Now, we set up our paths to the SharePoint Online Client Components SDK. Specifically, notice that we are using Microsoft.SharePoint.Client.Taxonomy.dll. These are the default paths where these DLLs should be installed on your computer.
$sCSOMPath = "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI"
$sCSOMRuntimePath=$sCSOMPath +  "\Microsoft.SharePoint.Client.Runtime.dll"
$sCSOMTaxonomyPath=$sCSOMPath +  "\Microsoft.SharePoint.Client.Taxonomy.dll"
$sCSOMPath=$sCSOMPath +  "\Microsoft.SharePoint.Client.dll"
Add-Type -Path $sCSOMPath
Add-Type -Path $sCSOMRuntimePath
Add-Type -Path $sCSOMTaxonomyPath

Next, we create our context and authenticate:
$context = New-Object Microsoft.SharePoint.Client.ClientContext($WebUrl)
$context.AuthenticationMode = [Microsoft.SharePoint.Client.ClientAuthenticationMode]::Default
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserCredentials.UserName, $UserCredentials.Password)
$context.Credentials = $credentials

Now, we retrieve our managed metadata fields from our Document Library and create our Taxonomy fields:
#get the collection of lists in our site and find our list by name
$lists = $context.web.Lists
$context.Load($lists)
$list = $lists.GetByTitle($listName)

#get our target field objects
$field1 = $list.Fields.GetByInternalNameOrTitle($targetField1Name)
$field2 = $list.Fields.GetByInternalNameOrTitle($targetField2Name)
$context.Load($field1)
$context.Load($field2)
$Context.ExecuteQuery()

$txField1 =[Microsoft.SharePoint.Client.ClientContext].GetMethod("CastTo").MakeGenericMethod([Microsoft.SharePoint.Client.Taxonomy.TaxonomyField]).Invoke($Context, $field1)
$txField2 = [Microsoft.SharePoint.Client.ClientContext].GetMethod("CastTo").MakeGenericMethod([Microsoft.SharePoint.Client.Taxonomy.TaxonomyField]).Invoke($Context, $field2)

Next, we create a session with the Managed Metadata Service and retrieve the terms we wish to set. This allows us to validate that the term values we're trying to set actually do exist in the term store.
$session = [Microsoft.SharePoint.Client.Taxonomy.TaxonomySession]::GetTaxonomySession($context)
$session.UpdateCache();
$context.Load($session)

$termStores = $session.TermStores
$context.Load($termStores)
$Context.ExecuteQuery()

$termStore = $TermStores[0]
$context.Load($termStore)
$Context.ExecuteQuery()

$groups = $termStore.Groups
$context.Load($groups)
$Context.ExecuteQuery()

$groupReports = $groups.GetByName($termGroupName)
$context.Load($groupReports)
$context.ExecuteQuery()

$termSetField1 = $groupReports.TermSets.GetByName($targetField1Name)
$termSetField2 = $groupReports.TermSets.GetByName($targetField2Name)
$context.Load($termSetField1)
$context.Load($termSetField2)
$context.ExecuteQuery()

$termsField1 = $termSetField1.GetAllTerms()
$termsField2 = $termSetField2.GetAllTerms()
$context.Load($termsField1)
$context.Load($termsField2)
$context.ExecuteQuery()

foreach($term1 in $termsField1)
{
    if($term1.Name -eq $targetField1Value)
    {
        Write-Host "Found our term in the termset: $($term1.Name) with id: $($term1.id)"
        break
    }
}

foreach($term2 in $termsField2)
{
    if($term2.Name -eq $targetField2Value)
    {
        Write-Host "Found our term in the termset: $($term2.Name) with id: $($term2.id)"
        break
    }
}

if(($term1.Name -ne $targetField1Value) -or ($term2.Name -ne $targetField2Value))
{
    Write-Host "Missing term set values. Double check that the values you are trying to set exist in the term store. Exiting."
    exit
}

Finally, find the item we want to edit, check it out (if file checkout is required), update the taxonomy metadata fields and check the document back in:

#get all items in our list
$listItems = $list.GetItems([Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery())
$context.Load($listitems)
$context.ExecuteQuery()  
$itemFound = $false

foreach($item in $listItems)  
{        
    If($($item["FileRef"]) -eq $ItemToEdit)
    {
        Write-Host "Found our item..."
        $itemFound = $true

        $context.Load($item.File) 
        $context.ExecuteQuery();

        #Checkout the item and assign the content type to the document
        if($item.File.CheckOutType -eq "None")
        {
            $item.File.CheckOut();
            Write-Host "Item has been checked out"
        }      

        #create a brand new field and set our Year managed metadata field
        #you cannot simply reuse the term object you found above in the term store
        $txField1value = New-Object Microsoft.SharePoint.Client.Taxonomy.TaxonomyFieldValue 
        $txField1value.Label = $term1.Name           # the label of your term 
        $txField1value.TermGuid = $term1.Id          # the guid of your term 
        $txField1value.WssId = -1                    # the default value

        $txField1.SetFieldValueByValue($item,$txField1value)

        #create a brand new field and set our Period managed metadata field
        #you cannot simply reuse the term object you found above in the term store
        $txField2value = New-Object Microsoft.SharePoint.Client.Taxonomy.TaxonomyFieldValue 
        $txField2value.Label = $term2.Name           # the label of your term 
        $txField2value.TermGuid = $term2.Id          # the guid of your term 
        $txField2value.WssId = -1                    # the default value

        $txField2.SetFieldValueByValue($item,$txField2value)

        #update the item with all changes
        $item.Update()
        $item.File.CheckIn("Item metadata values have been updated", 1)  

        Write-Host "Item has been checked in with updated metadata values"

        $context.ExecuteQuery();

        break;
    }
}

if($itemFound)
{
    Write-Host "Updating the item's metadata is complete."
}
else
{
    Write-Host "Item not found: $itemToEdit"
}

I'm sure there are ways to make some of this more efficient, and of course the most efficient method of all would be to use the SharePoint PnP PowerShell module's Set-PnPTaxonomyFieldValue (reference: https://msdn.microsoft.com/en-us/pnp_powershell/setpnptaxonomyfieldvalue) if your environment allows it. However, I wanted to paint a complete picture of what's involved in building such a script from scratch in PowerShell yourself.

Enjoy.
-Antonio

Monday, August 14, 2017

Raise Your Office 365 Secure Score

Thanks to everyone that attended our webinar last week on Office 365 Secure Score.

For many organizations moving to Office 365 or other Cloud services, the concepts of security, compliance and risk are complex. They require learning about how these security concepts have changed and how they’re now implemented in a Cloud first, Mobile friendly world. They often require working with security experts to evaluate the current state of the security for the Cloud application that you’re concerned about… and determining which security capabilities and features you are and are not yet making use of.

When we worked in on premise server environments, things seemed almost easier in some ways because our server farms which hosted Exchange, SharePoint, Skype for Business and so on were all within our corporate networks. They were more under our control, and we felt some level of comfort from being able to stop internet traffic at the network boundary, usually through our firewalls or gateways.
Regardless of how truly secure or not our networks and applications in fact were, we often gained some comfort from this boundary.

With the advent of Cloud computing, with the desire to do work on a whole range of Mobile devices, even our own personal devices, and with the desire to access our services for work from anywhere in the world, moving to services which are hosted on servers and in data centers that are not under our control often feels like we’ve lost that comfort… that assurance that we’re controlling the security of our critical IT services, or it feels that we’ve given the management of our security over to someone else (that we can’t see, that we can’t talk to and that we don’t know).

When, in actuality, often services like Office 365 are more secure than we could have ever hoped to deploy in our own environments… often we have more control over how our services are secured than we’ve ever had. We often just aren’t aware yet of the security benefits that come out of box with Office 365, and we’re not aware of the security capabilities that are available for us to use.

Office 365 Secure Score is a security analytics tool from Microsoft that comes with your Office 365 subscription. It's built to help us understand and navigate all of the security options that are available to us in Office 365.

It’s a relatively new feature from Microsoft, released early this year. Its purpose is really to:
  • Help us understand our current security posture
  • Help us understand which security features we are using and not yet using
  • Help us understand the impact of rolling out new security features to our end users and administrators, and what the security benefits are to us
  • Help us understand how we can improve our security posture, and it even tracks our progress over time

My presentation slides are available here:


Please reach out and let me know if you have any questions.

Enjoy.
-Antonio

Monday, July 31, 2017

SPSNYC: Office 365 Security - MacGyver, Ninja or SWAT Team

Thanks to everyone that attended my session at SharePoint Saturday NYC this past weekend. We had a great group in the room and some really good questions.

This presentation was designed to address 3 different roles that may be charged with the responsibility of managing and securing their organization's Office 365 environment:
  • MacGyver - or the IT Team Member that's self-trained, has been handed Office 365 and told to manage and secure it for the organization
  • Ninja - or the Security Expert who is formally trained, knows their stuff when it comes to information security and was given responsibility for securing their organization's Office 365 environment
  • SWAT Team - or the Information Security Team comprised of multiple security experts, with distributed roles and responsibilities

You can find the slides from my presentation here:

Please feel free to reach out to me if you have any questions at all.

Enjoy.
-Antonio

Tuesday, May 16, 2017

SharePoint Virtual Summit 2017 - Share with Confidence! #SPSummit


Today Microsoft hosted one of the most highly anticipated SharePoint events:
SharePoint Virtual Summit!

Many of us have been looking forward to this event for weeks and today's event did not disappoint. I tend to focus on the security and governance capabilities when it comes to SharePoint and Office 365, and one of the lines in today's #SPSummit that struck me most was the phrase 'Share with Confidence'! Those of us that work with information every day, even those whose job it is to secure information or oversee the security of information systems, we want to share information with others. Information sharing is a key principle of any collaboration solutions like SharePoint Online. However, we want to be confident that we're sharing with the right people, under the right conditions, and that the information we share is still being protected. Some of today's SharePoint Online announcements really do help improve the Sharing experience in Office 365 so that we can Share with Confidence!


Here are some of my favorite announcements from today that I believe help us better secure our content and share it confidently with others...

Monday, April 10, 2017

Office 365 Audit Log Data - How long are my logs retained for?

I'm a big fan of the Unified Audit Log in Office 365. It's a fantastic tool for monitoring user activity for suspicious behavior, getting automated alerts when particular activities occur, and investigating data breaches. I'm talking about the central logging facility within Office 365 that collects log data from many Office 365 workloads, and can be searched in the Office 365 Security and Compliance Center: Go to https://protection.office.com > Click Search & Investigate > Click Audit Log Search.
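Beyond the Security and Compliance Center UI, the Unified Audit Log can also be queried programmatically, for example through the Office 365 Management Activity API. As a rough sketch (not a complete client), here's how a request URL for that API's "list available content" call might be assembled in Python; the tenant ID, date range, and helper name are illustrative assumptions, and a real call would also need an Azure AD OAuth bearer token:

```python
from datetime import datetime
from urllib.parse import urlencode

def build_audit_content_url(tenant_id: str, content_type: str,
                            start: datetime, end: datetime) -> str:
    """Build the URL for the Management Activity API's
    'list available content' endpoint for one content type
    (e.g. Audit.SharePoint, Audit.Exchange) and time window."""
    base = (f"https://manage.office.com/api/v1.0/{tenant_id}"
            "/activity/feed/subscriptions/content")
    params = {
        "contentType": content_type,
        # The API expects UTC timestamps in ISO 8601 format
        "startTime": start.strftime("%Y-%m-%dT%H:%M:%S"),
        "endTime": end.strftime("%Y-%m-%dT%H:%M:%S"),
    }
    return f"{base}?{urlencode(params)}"

url = build_audit_content_url("contoso-tenant-id", "Audit.SharePoint",
                              datetime(2017, 4, 1), datetime(2017, 4, 2))
print(url)
```

Each content blob returned by this endpoint would then be fetched separately to get the individual audit records. PowerShell users can get similar results with the Search-UnifiedAuditLog cmdlet in Exchange Online PowerShell.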

I often get asked the question: how long are Office 365 audit log entries stored or retained? There are several answers...

Monday, April 3, 2017

Security Controls in the OneDrive for Business Admin Center

Microsoft recently added a new and extremely helpful Admin Center to Office 365 specifically for OneDrive for Business.

In terms of additional security controls, this is a great addition because it allows us to more easily control access and sharing specifically in OneDrive for Business, and not just SharePoint Online. Many of the external sharing settings overlap with those already available for SharePoint Online sites. However, this is a very good start, and we look forward to seeing more capabilities added over time to help us control and manage how our users share content with those outside of our organizations.

For now, let's take a closer look at the security controls now available for OneDrive for Business...

Friday, February 10, 2017

A Practical Overview of Office 365 Advanced Security Management - Part 3
Security Policies

Microsoft Office 365 Advanced Security Management is a capability within the Office 365 platform that allows organizations to go above and beyond the typical security management features, helping them to better secure users, permissions, content and apps. This multi-part blog series will look at how to use the features that make up Advanced Security Management (ASM) and share technical details that will help you to understand the benefits of these robust tools.

In part 1, we provided an Introduction to Advanced Security Management and shared technical information about how it works with the Office 365 Unified Audit Log: A Practical Overview of Office 365 Advanced Security Management - Part 1.

In part 2, we reviewed ASM's Productivity App Discovery Dashboard in depth to see how log files can be imported, how to create reports & interpret the analysis results and how you can try it with built-in sample logs: A Practical Overview of Office 365 Advanced Security Management - Part 2.

In part 3, we review the Security Policies that may be configured to control, monitor and alert on specific user behaviors.

Wednesday, January 25, 2017

A Practical Overview of Office 365 Advanced Security Management - Part 2
Productivity App Discovery Dashboard

In the middle of 2016, Microsoft released the first version of Office 365 Advanced Security Management, a new capability within the Office 365 platform that allows organizations to go above and beyond the typical security management features, helping them to better secure users, permissions, content and apps. This multi-part blog series will look at how to use the features that make up Advanced Security Management (ASM) and share technical details that will help you to understand the benefits of these robust tools.

In part 1, we introduced Advanced Security Management and shared technical information about how it works with the Office 365 Unified Audit Log:
A Practical Overview of Office 365 Advanced Security Management - Part 1.


In part 2, we review the Productivity App Discovery Dashboard capability of ASM to see how log files are imported, how to create reports and review the results of ASM's analysis of those logs, and how you can try it out with some built-in sample logs.


Monday, December 19, 2016

A Practical Overview of Office 365 Advanced Security Management - Part 1
Introduction & Audit Logs

In June 2016, Microsoft released its first iteration of Office 365 Advanced Security Management, a new capability within the Office 365 platform that allows organizations to go above and beyond the typical security management features, helping them to better secure users, permissions, content and apps. In September the team added the Productivity App Discovery feature, and in October the solution continued to progress with additional capabilities to manage app permissions.

This multi-part blog series will look at how to use the features that make up Advanced Security Management and share some technical details that you will hopefully find helpful.

In part 1 of this series we introduce Advanced Security Management and share technical details about how it works with the Office 365 Unified Audit Log.
Let's jump in...

SharePoint 2010 Security Patches - How Vulnerable Are You?
UPDATED: December 2016

YES, this blog post is about SharePoint 2010! 

YES, SharePoint 2010 is old, over 6 years old actually. 
YES, it's no longer officially supported by Microsoft, without very specific Premier Support that is.
YES, we still see a lot of it out there!
YES, if you're going to continue to stick with SharePoint 2010 for now, you must keep current with security patches!

One of the most common security issues we see with SharePoint 2010 farms is that administrators have not kept up with security patches and updates.  This not only makes it difficult to support and maintain the environment, but it also opens your farm up to security vulnerabilities - security vulnerabilities that have already been fixed! 

This article reviews all SharePoint 2010 security updates that have been released in the last 5+ years since Service Pack 1, and discusses the importance of keeping up to date with those patches.