Follow me on Twitter @AntonioMaio2

Wednesday, January 1, 2020

Introducing My New Blog: AntoniO365

After 8 years of using this blogging platform (Blogger), I've decided to move my writings to a new platform with a new name.

At AntoniO365, I'll write about more than SharePoint security and identity.  I'll touch on many different aspects of Microsoft 365, including the different services available, how to manage your environment and how to architect solutions within the Microsoft Cloud platform.  My work in recent years has focused heavily on cloud architecture, compliance, data governance, records management and information protection.  I want to share information about all of those topics, and I felt that called for a new platform.

I hope you have enjoyed TrustSharePoint!  It has been a pleasure to write.  I'll keep it live for a few more years so you can keep coming back to it if you need to.

Please join me at AntoniO365 for new articles and information on these and other topics.


Monday, April 22, 2019

SPSTC: An Introduction to Enterprise Mobility + Security

[I posted this to the blog back on Apr 7, 2019, but just realized blogger had not actually published it]

Thanks to everyone that attended my session at SharePoint Saturday Twin Cities in Minneapolis on April 6, 2019!  You can find my presentation slides below, and you should be able to download them from SlideShare.  This session had a lot of information in it, walking through the various components that make up the Microsoft Enterprise Mobility + Security offering and the licensing options around those offerings.

The licensing in particular is complex, but ultimately your options for this offering are:

Enterprise Mobility + Security E3

This subscription includes:

  • Azure Active Directory Premium P1
  • Microsoft Intune
  • Azure Information Protection P1
  • Microsoft Advanced Threat Analytics
  • Azure Rights Management (part of Azure Information Protection) and the Windows Server CAL rights.

Enterprise Mobility + Security E5  

This subscription includes all the capabilities of Enterprise Mobility + Security E3 plus:

  • Azure Active Directory Premium P2
  • Azure Active Directory Identity Protection (as a feature of AAD Premium P2)
  • Azure AD Privileged Identity Management (as a feature of AAD Premium P2)
  • Azure Information Protection P2
  • Microsoft Cloud App Security
  • Azure Advanced Threat Protection

You can find more information about Microsoft Enterprise Mobility + Security licensing on Microsoft's licensing pages.  There are other licensing options as well, but these are the primary ones that organizations consider when they look to increase the security and compliance features their organization is leveraging.

Table of Enterprise Mobility + Security

You can learn a lot more about each feature included across all Microsoft Security and Compliance tools from our Table of Enterprise Mobility + Security.  The features and tools are grouped together to identify the tools that help you to accomplish specific related tasks, and clicking on each tile in the table will take you to the Microsoft documentation which is specific to that service or feature.


Thursday, April 4, 2019

The Table of Microsoft Enterprise Mobility & Security!

I'm very happy to announce that I’ve teamed up with jumpto365’s Matt Wade and Niels Gregers Johansen to publish The Table of Microsoft Enterprise Mobility & Security, which is a new addition to the Microsoft Periodic Table series!

Niels, Matt, and I decided at Microsoft Ignite 2018 to work together on a tool that’s been one of the top requested additions to the Periodic Table of Office 365.  That is, an overview of Microsoft’s Cloud capabilities related to security, compliance and information protection.  Similar to the Office 365 periodic table, the Table of EM&S categorizes similar services together to make the overall offering easier to navigate, and easier to determine which tools are available to you.

 Table of Enterprise Mobility & Security

Considering the breadth of tools available with the EM&S offering, which is maintained by many teams across Microsoft, it can be hard to find central resources providing an overview of the entire suite, which groups and describes the tools with respect to each other. This work aims to bring everything together in one spot and make jumpto365 your entry point to understanding the Microsoft Cloud tools that are available to you.


Each tile represents a Microsoft service, feature or tool which is related to information protection, security, compliance, and enterprise mobility. Some features are provided as part of the Microsoft Enterprise Mobility + Security offering.  Some come with Office 365 enterprise licenses, some are just built-in protections that are critical for people to understand, and some go beyond the Microsoft Enterprise Mobility + Security offering, helping you to understand some of the advanced options available for security and compliance.

We're highlighting the features and capabilities that are important when considering the security of your Microsoft 365 environment and the tools available to you to help with regulatory compliance.


I've worked in the security and compliance space for a very long time, and there are many great solutions built into the Microsoft Cloud which help customers protect their information, secure their tenant, and comply with the regulations that are important to them.  I truly love working with these tools!  In working with many customers though, I find that they often don't know that these tools exist, and learning which one is best for which task is one of the hardest tasks in moving to a more secure and compliant state in Microsoft 365.

The security and compliance tool landscape is vast in Microsoft 365, with a lot of great services, features and tools!  One thing that excites me most about this table is sharing that knowledge with people and giving them an easy way to explore the many security and compliance features available to them.

Links to Documentation and Product Pages

You can jump to the product pages and documentation for each tile in the Table for the particular service or feature, giving you both an overview and access to the in-depth details about how to make use of the service or feature. All of those product pages offer links to the technical documentation, pricing, getting started guides, and live demos.  This lets you check if the service offers what you are looking for before spending money and time on the idea.


All information in the EM+S Table can be found across Microsoft’s Enterprise Mobility + Security service websites. If any changes are made by Microsoft to the EM+S services, we will update the Table as well, so that it stays up to date.

More to Come...

This is the beginning of the Table of Microsoft EM&S. We will continuously update the Table with more features and functions to make it better over time. If you have anything you would want to see on the next version, please let me know in a comment below.

To learn more about my work and what I do, please visit my other blog and follow me on Twitter.

Friday, January 25, 2019

A Practical Introduction to Microsoft Forms & Microsoft PowerApps

As an enterprise architect working primarily in the Microsoft Cloud, I often get asked which form solution a client should move forward with in their enterprise. It usually starts with one form that a business stakeholder has requested or suggested, or with one team that wishes to publish a few forms. The request and the questions quickly spread to multiple teams that want to do something similar in the spirit of "Going Digital"! Do we continue to use InfoPath like we used to? Do we use SharePoint Designer to create a list form? Are those things still supported, because we've heard they're not? Do we create a custom form on a SharePoint site page with a custom web part... maybe a full page web part? Do we use Microsoft PowerApps, Microsoft Forms, or a third party solution?

So, I thought I'd share my practical thoughts here to hopefully benefit the many people wondering about the same question. Microsoft has a long history of form solutions which have come and gone, especially in the case of SharePoint. The SharePoint and Microsoft technology stack for building and hosting online forms has gone through significant flux in recent years. It started with the announcement that Microsoft would discontinue InfoPath back in January 2014 (anyone remember the InfoPath funeral at the SharePoint Conference?). After several years of flux, we finally have a clear path forward for online forms in SharePoint and in the Microsoft Cloud.

Let's Be Clear on InfoPath and SharePoint Designer
First of all, let's be clear on InfoPath and SharePoint Designer - Microsoft has clarified in recent years that InfoPath and SharePoint Designer will in fact be supported in their last versions, InfoPath 2013 (the client application, as a separate download, and not included with Office 2016 or later) and SharePoint Designer 2013, until July 2026. This means that current and recently released versions of SharePoint, so SharePoint 2016 and SharePoint 2019, will support artifacts created in InfoPath 2013 and SharePoint Designer 2013. As will SharePoint Online, until further notice. However, Microsoft has also been clear that no new work, not features, not updates, not patches, will be put into InfoPath 2013 or SharePoint Designer 2013. Those are the last versions of those applications.
This effectively means that InfoPath and SharePoint Designer are on life support: they remain supported, for both on-premises and online solutions, for Microsoft customers that have a large investment in InfoPath and SharePoint Designer and cannot yet move to the new modern capabilities.

This also means that new modern capabilities added to SharePoint Online and the Microsoft Cloud will likely not work or integrate with InfoPath or SharePoint Designer. Effectively, the real use cases in which InfoPath and SharePoint Designer may be used in conjunction with SharePoint Online sites to fulfill a business need will get more and more narrow, over a long period of time, until 2026 in fact.

What you've built in the past is still supported, and you could still likely use the tools for something simple, but it's highly recommended that you don't look to these technologies to build anything modern, supported on mobile, or integrated across the Microsoft Cloud. You will have a long uphill battle!

Microsoft's Go Forward Online Form Solutions: PowerApps & Forms
As many will tell you, Microsoft's go forward solutions for Online Forms in the Cloud include both Microsoft PowerApps and Microsoft Forms.

Microsoft Forms was released to general availability on April 27, 2018, so it's not even a year old in general availability (it had a long preview program before that, which many of us participated in). The solution is still fairly young, but its intended use cases and purposes are fairly narrow and focused, so it does what it does very well.

Microsoft PowerApps was released to general availability on October 31, 2016, so it is only a little over 2 years old. It's important to keep that in mind, because it tells you the technology is young and still evolving. That said, the technology has come a long way in just 2 short years.

Microsoft Forms
Microsoft Forms is essentially a lightweight, very basic tool for creating surveys, quizzes and polls that are intended to quickly collect information. Some general use cases in which we have seen Microsoft Forms used are:
  • Surveys to collect end user feedback
  • Short forms asking users to register for an event or to gauge interest in an event
  • Simple forms requesting contact information from users
  • Polls to gather employee or customer satisfaction
  • Educational environments where teachers wish to publish a quiz to students to measure information retention, or to test knowledge of a topic and evaluate progress

With Microsoft Forms, you really can create a simple online form in minutes which fulfills these use cases. Microsoft Forms does not replace InfoPath or SharePoint Designer list forms, due to the simple nature of the forms it can create. But it does very quickly fill one particular need. It allows you to very quickly and easily:
  • Create forms for surveys, polls or quizzes with a simple set of varied controls, and using simple conditions
  • Publish those forms to the internet for users to fill out anywhere, any time, on any device (desktop, laptop, tablet or mobile)
  • Collect submitted data in a central place, which can be aggregated, summarized and analysed by other tools
  • Automatically trigger workflows created in Microsoft Flow which can integrate the collected data from Microsoft Forms into other systems

One feature that Microsoft Forms has over PowerApps is that forms can optionally be published so that they can be accessed anonymously. That's correct: if you need to publish a form to the internet that you want people to access and fill out without requiring them to log in (because maybe they don't have a user account in your Office 365 tenant), you cannot do that with Microsoft PowerApps, but you can do it with Microsoft Forms. This is not the default configuration, but when you publish a form in Microsoft Forms you can choose to publish it anonymously and not require users to log in. When you do this, any person on the internet with a link to the form can respond to it.

Access to Microsoft Forms
Access is controlled through your Office 365 license, and all Microsoft Office 365 enterprise licenses include one flavor or another of the Microsoft Forms SKU, including Office 365 Enterprise E1, Enterprise E3 and Enterprise E5.

There are numerous flavors of the Microsoft Forms license itself, including those focused on the enterprise (Microsoft Forms Plan E1, Plan E3, Plan E5), those focused on kiosks or unattended applications (Plan K), and those focused on education (Plan 2 and Plan 3). You can control whether a user has access to Microsoft Forms for the purpose of creating and publishing a form by turning ON or OFF the Microsoft Forms SKU in their Office 365 license.

You can learn more about which licenses include Microsoft Forms in Microsoft's licensing documentation.

Microsoft Forms is also available for free to Hotmail and Outlook/Live Microsoft accounts, with some limitations.

Controls Available in Microsoft Forms and Other Options
There are many common control options available when you're designing your forms, including:
  • Choice fields where you only select 1 answer (radio buttons or dropdown)
  • Choice fields where you select multiple answers (check boxes)
  • Text Fields
  • Ratings
  • Dates
  • Net Promoter Score Fields (announced at Ignite 2018; for example, "How likely are you to recommend this to a friend?" with a choice from 1 to 10)

Other options include:
  • Options to make fields required
  • Options to order fields as desired
  • Options to shuffle the options presented to users
  • Options for titles and subtitles on form questions
  • Branding options in the form title
  • Suggested questions based on how you start your form
  • Creative ideas presented to you as you are developing your form

Important Technical Notes about Microsoft Forms
The following are other important technical notes and limitations related to Microsoft Forms:
  • All data submitted through Microsoft Forms is stored on servers in the United States, or in Europe if your Office 365 tenant is hosted in Europe. So, if your Office 365 tenant was created and is hosted in a data center outside of the United States or Europe, your form, its configuration and any data submitted through your form are stored and hosted on servers within a US data center. This may or may not fit with your data residency requirements, so please consider the use of Microsoft Forms carefully with this in mind.
  • If a user who created and published a form using Microsoft Forms leaves the organization, and their account is disabled and/or their Microsoft Forms license is removed, then all of their Microsoft Forms configuration and data, including submitted form responses, will be deleted 30 days after their user account is deleted from your Azure AD instance.
  •  Conditional Access does integrate with Microsoft Forms. You can select Microsoft Forms as a Cloud App in the Cloud Apps assignment.

There are some limitations as to how many forms a user account may create, and how many responses they can receive. For forms created using enterprise or commercial accounts:
  • A single user account may create up to 200 forms
  • A single form may have up to 100 questions
  • A single form may receive up to 50,000 responses

Finally, surveys and quizzes allow you to collaborate with others during the creation process by creating and sharing a link to the form with other users. You can use this same method to save forms as templates and reuse them over and over again.

Microsoft PowerApps
Microsoft PowerApps is a cloud-based technology, available only in the Microsoft Cloud, which allows business analysts as well as software developers to build custom business applications.  It is Microsoft’s go-forward solution for online forms, and is the intended replacement technology for InfoPath forms, as well as all previous form technologies.

That said, the solution is not targeted only at software developers.  It is also targeted at business analysts or technical specialists within a business function (as opposed to business users), as some technical ability is typically required to build even simple PowerApps solutions.  Often business users can easily start a PowerApps solution, but very quickly they find that some knowledge of JSON or expressions/formulas is required to achieve the business functionality they wish.

Therefore, PowerApps is typically viewed by most enterprises as a “low-code” and “rapid application development” solution for building custom business applications in the Microsoft Cloud.  When developing a PowerApps application, there are two (2) types of applications that may be created:

Canvas App
A canvas app allows the app developer to layout supported controls wherever they wish on the page and construct multi-page applications.

Model Driven App

A model driven app is created and designed for the most part based on the data fields you select for the app.  They are tightly integrated with the Common Data Service (CDS) which is the common data model used within Dynamics 365.  As you develop a model driven app, you create entities and fields within the CDS, and the controls are automatically laid out on your form to support reading and writing of data from and to those data structures.

All Office 365 enterprise licenses include a PowerApps for Office 365 license.  This provides all Office 365 users with standard PowerApps designer capabilities, in effect enabling all users to create their own PowerApps.  The PowerApps for Office 365 license provides access to Canvas Apps, and it provides access to the Common Data Service (CDS), however only in the default environment.

Any user that will run a PowerApp, meaning they will fill out an online form built on PowerApps, will run it under the context of their own user account and therefore requires a PowerApps license.

The default PowerApps for Office 365 license has limitations in the capabilities which are available to users.  PowerApps also provides higher level licenses, named Plan 1 and Plan 2:
  • PowerApps Plan 1 provides access to the Common Data Service for Apps to store and manage data in additional environments. Users can run canvas apps that are built on the Common Data Service for Apps, use premium connectors, access data in custom applications or on-premises data.
  • PowerApps Plan 2 allows users to run model-driven apps with code plug-ins and real-time workflows.

For more information on PowerApps license plans, please refer to Microsoft's PowerApps licensing documentation.

Friday, December 28, 2018

SharePoint Conference North America 2019: Discover End to End Records Management

I am very pleased to announce that I will be co-presenting at the SharePoint Conference North America 2019 (May 21 to 23 in Las Vegas) with Erica Toelle. Our session is titled Discover end-to-end records management in Microsoft 365 and we're very excited to share with you all that we know about records management in the Microsoft 365 platform.

Erica and I have been friends for several years, and we often discuss how organizations are managing their records and we talk about the capabilities available from Microsoft in the cloud platform. These capabilities have evolved significantly over the last year, and we're really excited to share what's now available.

To give you a quick preview of our session, Erica and I will be covering the following topics...

Office 365 Labels and Label Policies
At Ignite 2018 Microsoft announced Unified Labeling Management. With unified labels, you have a single place to manage sensitivity labels that help classify and protect your sensitive data, as well as manage retention labels that help govern the lifecycle of your data. This unification brings together Azure Information Protection (AIP) labels and Office 365 labels into one management interface and a set of functionalities that can be used to govern data.
Advanced Data Governance for Automatic Labeling
The Office 365 Advanced Data Governance (ADG) feature set works together with unified labels so you can automatically apply labels to content that meets certain criteria. This way the end user does not have to manually label content, ensuring it is appropriately managed for compliance and protected from misuse.

Protecting Sensitive Information
In the Unified Labeling Management experience, Microsoft has integrated the management of Microsoft Information Protection (MIP) labels as well. So you can now configure and manage both labels for retention and labels for sensitivity. Sensitivity labels allow us to protect sensitive content based on a classification. Data Loss Prevention (DLP) identifies sensitive information, such as social security numbers and bank account information, and adds additional policies and protections. DLP can govern sensitive information where it lives, when we route it, and when we share information.

Retaining Information and Disposition
Most important to Records Management is the ability to retain and delete information. We will look at how you build a file plan for Office 365 and use it to enforce retention to prevent content from being deleted. On the flip side, you can also use retention policies to delete content. For example, some organizations want to delete Microsoft Teams conversations after 30 days, similar to what was previously done with Skype conversations.

Label Analytics and the Activity Explorer
How can you tell if your label and records management strategy is working? That’s where the Label Activity Explorer can help. It provides analytics about the application of labels. You can look at what labels have been manually and automatically applied, where, and by whom, in addition to other data.

End User Experience in Microsoft 365
What does records management look like to the end user? Is it intuitive? We will look at how an end user can manually apply a label to content in SharePoint and Exchange. We'll look at how you can avoid impacting end users but still enforce records management policies for your content. We will also demonstrate what the end user sees when a label is applied automatically.

Real-Life Stories Implementing Microsoft 365 Records Management
Finally, we'll talk about some real-life case studies of how we use Microsoft 365 Records Management in the real world. This overview includes the deployment approach, tips and tricks, and best practices. We hope to add a few more topics if Microsoft releases additional functionality before the conference in May.
We really hope you'll be able to join us for this session! See you in Las Vegas this May...

Wednesday, February 7, 2018

Step by Step: How to Fine Tune Sensitive Data Types in Office 365

Office 365 provides the Data Loss Prevention (DLP) feature, which allows you to automatically identify sensitive data across workloads in Office 365. This means that you can have one set of policies that applies to SharePoint Online sites, OneDrive for Business sites and Exchange email, or you can have different policies that apply to each workload.

The Office 365 DLP policies currently support many sensitive data types, which represent the types of information that an individual or business may want to protect: information like US social security numbers, various European passport numbers or identity card numbers, credit card numbers and so on. At the time this post was written, 82 sensitive data types were supported. These sensitive data types in many cases include a regular expression that is matched, an extensive list of keywords that are searched for within a proximity to the sensitive data, and in many cases a checksum that is calculated (for example, running the Luhn algorithm on a suspected credit card number). An inventory of the sensitive data types supported, along with exactly what each looks for, can be found in the Microsoft documentation.
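To make the checksum step concrete, here is a minimal sketch of the Luhn check in Python. This is illustrative only: Office 365 runs this validation internally as part of the sensitive data type, so you never implement it yourself.

```python
def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum.

    DLP uses this kind of check to discard 16-digit strings that merely
    look like credit card numbers but cannot be valid ones.
    """
    digits = [int(d) for d in number if d.isdigit()]
    total = 0
    # Double every second digit from the right; subtract 9 if the result > 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("4111 1111 1111 1111"))  # well-known test card number: True
print(luhn_valid("4111 1111 1111 1112"))  # checksum fails: False
```

This is why a regex match alone is not enough for a "Credit Card Number" hit: the checksum filters out most random 16-digit strings before keywords and proximity are even considered.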

You are also able to modify existing sensitive data types or create custom sensitive data types which not only get used by Office 365 DLP, but also by features like Office 365 Labels, and Office 365 Advanced Data Governance. You may want to create a custom sensitive data type if you have a custom piece of data within your organization that follows a particular well-defined pattern and that you need to look for within documents or emails. An example is if you have a custom format for an employee number. Another case is if you live in a country that has an identity card number format or driver's license format, for example, which is not represented by the built-in Office 365 sensitive data types.
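To illustrate, the heart of such a custom type is just a pattern plus corroborating evidence. Suppose, purely hypothetically, that your employee numbers look like "EMP-" followed by six digits; the regular expression you would later embed in the rule package XML could be prototyped in Python like this (the EMP- format is invented for this example):

```python
import re

# Hypothetical employee number format: "EMP-" followed by exactly six digits.
# In the finished custom sensitive data type, this pattern lives inside the
# <Regex> element of the rule package XML rather than in Python code.
EMPLOYEE_NUMBER = re.compile(r"\bEMP-\d{6}\b")

text = "Please update the record for EMP-004217; EMP-12 is not a valid number."
print(EMPLOYEE_NUMBER.findall(text))  # ['EMP-004217']
```

Prototyping the regex this way before pasting it into the XML lets you confirm it matches real samples and rejects near misses.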

There are a few sites/articles out there that shares steps on how to do this, but I have found many of them to be incomplete. Recently we had to create some custom sensitive data types for a GDPR project and I wanted to share my experience at creating custom sensitive data types.

In our walk-through, the example of a custom sensitive data type I'm going to use is EU Debit Card Number. We're going to envision a scenario where we're looking for this type of sensitive data as part of a GDPR project and we are getting enough false positives that we want to try to make looking for this type more accurate, by introducing new keywords, adjusting the proximity parameter and modifying the confidence level.
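Conceptually, that tuning works like this: the regular expression finds a candidate match, DLP then looks for corroborating keywords within patternsProximity characters of it, and the amount of corroboration found determines whether the match reaches the rule's recommended confidence. A greatly simplified Python sketch of the idea follows (the real matching engine is internal to Office 365; this is only to make the parameters concrete):

```python
import re

def keyword_nearby(text, match, keywords, proximity=300):
    """Simplified model of corroborative evidence: return True if any
    keyword appears within `proximity` characters of the pattern match."""
    start = max(0, match.start() - proximity)
    window = text[start:match.end() + proximity].lower()
    return any(kw in window for kw in keywords)

# A candidate pattern plus the keywords that corroborate it.
CARD = re.compile(r"\b\d{4} \d{4} \d{4} \d{4}\b")
KEYWORDS = ["debit card", "card number", "expiry"]

text = "The debit card number on file is 4111 1111 1111 1111."
match = CARD.search(text)
print(keyword_nearby(text, match, KEYWORDS))  # True: corroborated match
```

Widening the proximity or adding keywords increases recall at the cost of more false positives; tightening them does the reverse, which is exactly the trade-off we tune in the steps below.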

Most Practical Approach

The most practical approach when creating or customizing a sensitive data type is to create a new sensitive data type based on an existing one, giving it a unique name and identifiers. For example, if you wish to adjust the parameters of the “EU Debit Card Number” sensitive data type, you could name your copy of that rule “EU Debit Card Enhanced” to distinguish it from the original. In your new sensitive data type, simply modify the values you wish to change to improve its accuracy. Once complete, you will upload your new sensitive data type and create a new DLP rule (or modify an existing one) to use the new sensitive data type you just added. Modifying the accuracy of sensitive data types could require some trial and error, so maintaining a copy of the original type allows you to fall back to it if required in the future.

Customize the Sensitive Data Type

The following is the detailed step by step process that is necessary to create a custom sensitive data type or modify an existing one.

1. Export the existing Rule Package of built-in sensitive data types that are available in Office 365

a. At the PowerShell command prompt, create a connection to Exchange Online:
$UserCredential = Get-Credential
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session

Note: At the start of this example we have to use the Exchange Online PowerShell module and at the end of the example we're using the newer Security and Compliance Center PowerShell module. This is intentional.

b. Export the current rule collection to an XML file:
$ruleCollection = Get-ClassificationRuleCollection
Set-Content -path "C:\exportedSensitiveTypes.xml" -Encoding Byte -Value $ruleCollection.SerializedClassificationRuleCollection

c. Copy the XML file you just exported and give it a different file name – for example: MyNewDLPRule.xml. Keep the original exported rule xml file for reference while you’re constructing your new file.
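Before you start cutting the copy down, it can help to get a quick inventory of which sensitive data types the exported file contains. The sketch below assumes only the general structure described in this post (an Entity element with an id, and a Resource element carrying the display name with the same id); the sample string is an invented stand-in, not Microsoft's actual file, and element names are matched by local name to sidestep the rule pack's XML namespace:

```python
import xml.etree.ElementTree as ET

# A tiny stand-in for an exported rule pack; the real file is much larger,
# but the Entity/Resource relationship described in this post is the same.
sample = """<RulePackage xmlns="urn:example">
  <Rules>
    <Entity id="48da7072-821e-4804-9fab-72ffb48f6f78"
            patternsProximity="300" recommendedConfidence="85" />
  </Rules>
  <LocalizedStrings>
    <Resource id="48da7072-821e-4804-9fab-72ffb48f6f78">
      <Name default="true" langcode="en-us">EU Debit Card Number</Name>
    </Resource>
  </LocalizedStrings>
</RulePackage>"""

root = ET.fromstring(sample)

def local_name(tag):
    # ElementTree prefixes tags with "{namespace}"; strip that off.
    return tag.rsplit('}', 1)[-1]

# Map each Resource id to its display name, then list every Entity.
names = {res.get('id'): next(iter(res)).text
         for res in root.iter() if local_name(res.tag) == 'Resource'}
for entity in root.iter():
    if local_name(entity.tag) == 'Entity':
        print(entity.get('id'), '->', names.get(entity.get('id')))
```

Running something like this against your exported file tells you which Entity GUIDs map to which display names, so you know exactly which blocks to keep when isolating one type in the next step.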

2. Open the XML file you just copied/renamed in your favorite XML editor

3. You will need to generate 3 GUIDs and replace those in the sensitive data type that you are modifying. At the PowerShell command prompt, type the following and record your new GUID. Do this two more times and record those new GUIDs as well.

[guid]::NewGuid()
4. The XML file is large, so for simplicity we’re going to start with an existing sensitive data type and isolate it in our new rule xml file by removing all other sensitive data types. In this case, we’re going to modify the “EU Debit Card Number” sensitive data type.

Note: the following details are important when modifying the rule’s XML structure.

a. The <RulePack> element of the file contains information about the publisher, in this case Microsoft. It also contains localized versions of all the Publisher strings. The <RulePack> definition contains an id property which is the first place where you’ll need to replace the existing GUID with a new one you just generated.

b. The <Rules> element contains the definition of your sensitive data type.

c. The <Rules> element is made up of the <Entity> element which defines the pattern to use, the <Keyword> element which defines the list of keywords to match, a <Regex> element which defines the regular expression that’s used in the pattern, and a <LocalizedStrings> element which defines the default and localized strings displayed in the DLP rule UI.

d. The <Entity> element contains an id property which is the second place where you’ll need to replace the existing GUID with a new one you just generated. This GUID is referenced further down within the <LocalizedStrings> element, as the id for a <Resource> element. This <Resource> element identifies both default and localized strings to use when displaying your sensitive data type in the DLP UI. If you are deleting <Entity> and <Resource> elements in order to simplify the XML file you are working with, ensure that you do not delete the <Resource> element that is referenced using the GUID by your sensitive data type’s <Entity> element.

e. The <Entity> element also contains several <Match> elements, which identify the names of the keyword lists that are used by that sensitive data type. These appear as follows:

<Any minMatches="1">
  <Match idRef="Keyword_eu_debit_card" />
  <Match idRef="Keyword_card_terms_dict" />
  <Match idRef="Keyword_card_security_terms_dict" />
  <Match idRef="Keyword_card_expiration_terms_dict" />
  <Match idRef="Func_expiration_date" />
</Any>

f. The keyword list named "Keyword_card_terms_dict" is defined further down in the XML file within a <Keyword> element. In this case, the <Keyword> element contains a <Group> element and multiple <Term> values, each of which defines a keyword that is used by Office 365 DLP to identify sensitive information. Keyword lists are often referred to as “corroborative evidence” when defining DLP rules. Keyword lists defined within the XML structure can be used by multiple sensitive data types – for example, the Credit Card Number and EU Debit Card Number sensitive data types share some of the same keyword lists. The same is true for <Regex> definitions. Keyword lists defined in this file can be modified to add additional, more specific keywords or to remove keywords. Once again, if you are deleting <Keyword> lists throughout the original XML file, ensure that you do not delete a <Keyword> element that is referenced within your sensitive data type’s <Entity> element.
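For reference, the shape of such a keyword list in the rule pack XML is roughly the following. This is an invented fragment that follows the structure described above; the terms shown are not taken from Microsoft's actual rule package:

```xml
<Keyword id="Keyword_eu_debit_card">
  <Group matchStyle="word">
    <Term>account number</Term>
    <Term>card number</Term>
    <Term>card no.</Term>
    <Term>security number</Term>
  </Group>
</Keyword>
```

Adding your own, more specific terms here (or removing generic ones) is the most direct way to cut down on false positives.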

g. The <Entity> element contains three kinds of elements used to define a pattern that the sensitive data type must match: <IdMatch>, <Match> and <Any>. In order to promote re-usability of definitions across multiple patterns, the <IdMatch> and <Match> elements do not define the details of what content needs to be matched; instead, they reference it through the idRef attribute.
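Although the real matching happens inside the Office 365 DLP engine, the IdMatch/Match/Any mechanics are easy to illustrate. The following is a small, hypothetical Python sketch (my own illustration, not part of any Microsoft tooling) of the idea: a primary-pattern hit only counts when at least minMatches of the corroborating keyword lists appear within the proximity window around it:

```python
import re

def matches_sensitive_type(text, id_pattern, keyword_lists, proximity=300, min_matches=1):
    """A primary-pattern (IdMatch) hit counts only when at least min_matches
    of the keyword lists (the Match elements under Any) have a keyword
    inside the proximity window around the hit."""
    for m in re.finditer(id_pattern, text):
        # the window extends `proximity` characters on either side of the hit
        window = text[max(0, m.start() - proximity): m.end() + proximity].lower()
        satisfied = sum(1 for kws in keyword_lists
                        if any(kw.lower() in window for kw in kws))
        if satisfied >= min_matches:
            return True
    return False

# a 16-digit number with a card keyword nearby satisfies the pattern
text = "Please charge my debit card, number 4000123412341234, next week."
print(matches_sensitive_type(text, r"\b\d{16}\b", [["debit card", "credit card"]]))  # True
```

The real engine layers checksum validation and confidence scoring on top of this, but the corroborative-evidence idea is the same.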

5. Now that we have our sensitive data type XML file ready to edit, we’ll make some basic modifications in order to identify our sensitive data type as a new unique type when configuring a DLP rule.

a. Modify the id property of the <RulePack> element. This id should be replaced with one of the new GUIDs we created.
<RulePack id="bd2568b4-b331-4387-b399-7e46065f6994">

b. Within the <RulePack> element, find the <Publisher> element and replace its id property with the second of the new GUIDs we created.
<Publisher id="ac9a7b29-870f-4810-a96f-6b4080c67e5d" />

c. Modify the id property of the <Entity> element which represents our sensitive data type. This id should be replaced with the third of the new GUIDs we created.
<Entity id="48da7072-821e-4804-9fab-72ffb48f6f78" patternsProximity="300" recommendedConfidence="85">

d. Within the <RulePack><Details><LocalizedDetails> element, find the <PublisherName>, <Name> and <Description> elements. Modify the value of these elements to unique values.
<Details defaultLangCode="en">
  <LocalizedDetails langcode="en">
    <Name>Contoso Rule Package</Name>
    <Description>Defines the set of classification rules for Contoso</Description>

e. Within the <LocalizedStrings> element, find the <Resource> element whose idRef matches the original id of our <Entity> element, and modify that idRef to the new GUID that we assigned to the <Entity> element's id.

f. Within the <LocalizedStrings><Resource> element we just modified, find the <Name> and <Description> elements and modify their values so that our sensitive data type has a unique name and description. This will help us better select the correct sensitive data type when we configure a DLP rule.
<Resource idRef="48da7072-821e-4804-9fab-72ffb48f6f78">
   <Name default="true" langcode="en-us">EU Debit Card Number Enhanced</Name>
   <Description default="true" langcode="en-us">Detects European Union debit card number with enhanced accuracy.</Description>

g. If you are going to start with an existing sensitive data type and that type contains existing keyword lists in its definition, it is important to note that some keyword lists are in use by Microsoft's built-in sensitive data types, and the names of those lists may not be reused in other sensitive data type definitions. In the case of the “EU Debit Card Number” example, it makes use of the following Microsoft keyword lists:
<Keyword id="Keyword_card_expiration_terms_dict">
<Keyword id="Keyword_card_security_terms_dict">
<Keyword id="Keyword_card_terms_dict">

Those lists are currently in use by multiple built-in sensitive data types, and we are not permitted to reuse those names. Therefore, we must modify those names in the <Keywords> element so that they are unique, as follows:
<Keyword id="Keyword_card_expiration_terms_dict_enhanced">
<Keyword id="Keyword_card_security_terms_dict_enhanced">
<Keyword id="Keyword_card_terms_dict_enhanced">

Those names must also be modified within the <Entity> element where they are referenced:
<Entity id="48da7072-821e-4804-9fab-72ffb48f6f78" patternsProximity="150" recommendedConfidence="85">
      <Pattern confidenceLevel="85">
        <Any minMatches="1">
          <Match idRef="Keyword_card_terms_dict_enhanced" />
          <Match idRef="Keyword_card_security_terms_dict_enhanced" />
          <Match idRef="Keyword_card_expiration_terms_dict_enhanced" />
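Every GUID swap in steps a through f follows the same rule: wherever the old GUID appeared, the new one must appear, so that cross-references such as the <Entity> id and its <Resource> idRef stay aligned. If you are doing this often, the swap can be scripted. Here's a hypothetical Python sketch (the function name and approach are my own, not part of any official tooling):

```python
import uuid

def reissue_guids(xml_text, old_guids):
    """Replace each old GUID with a freshly generated one, everywhere it
    appears, so that cross-references (e.g. Entity id <-> Resource idRef)
    stay aligned. Returns the updated text and the old->new mapping."""
    mapping = {old: str(uuid.uuid4()) for old in old_guids}
    for old, new in mapping.items():
        xml_text = xml_text.replace(old, new)
    return xml_text, mapping

sample = ('<Entity id="48da7072-821e-4804-9fab-72ffb48f6f78"/>'
          '<Resource idRef="48da7072-821e-4804-9fab-72ffb48f6f78"/>')
updated, mapping = reissue_guids(sample, ["48da7072-821e-4804-9fab-72ffb48f6f78"])
# both the Entity id and the Resource idRef now carry the same new GUID
print("48da7072" not in updated)
```

A plain string replace is deliberate here: it touches every reference at once, which is exactly the consistency the rule package requires.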

Fine Tune a Sensitive Data Type to Avoid False Positives or Look for Organization-Specific Information

Now we’ll make some modifications to our custom sensitive data type in an attempt to improve its accuracy. Improving the accuracy of DLP rules in any system requires testing against a sample data set, and may require fine tuning through repeated modifications and tests. When searching for an EU Debit Card Number in our example, the number itself is strictly defined: 16 digits matching a complex pattern, subject to the validation of a checksum. We cannot alter this pattern due to the strict definition of this sensitive data type. However, we can make the following adjustments in order to improve the accuracy with which Office 365 DLP finds this sensitive data type in our content within Office 365:
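As an aside, the checksum used to validate payment card numbers is typically the Luhn algorithm. Here is a quick Python sketch for anyone curious how that validation works (illustrative only; the built-in type's actual validator is internal to the service):

```python
def luhn_valid(number: str) -> bool:
    """Luhn check: double every second digit from the right, subtract 9
    from any doubled digit over 9, and require the total to be a
    multiple of 10."""
    digits = [int(d) for d in number if d.isdigit()]
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("4539578763621486"))  # this number passes the Luhn check
```

This is why a random 16-digit string usually won't trigger the rule: only about one in ten passes the checksum, before any keyword evidence is even considered.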

1. Proximity Modifications
We can modify the character pattern proximity to expand or shrink the window in which keywords must be found around the sensitive data type. In our case we’ll shrink the window by modifying the patternsProximity value in our <Entity> element from 300 to 150 characters. This means that our corroborative evidence (our keywords) must be closer to our sensitive data type in order to signal a match on this rule.
<Entity id="48da7072-821e-4804-9fab-72ffb48f6f78" patternsProximity="150" recommendedConfidence="85">

2. Keyword Modifications
We can add keywords to one of our <Keyword> elements in order to give our sensitive data type more specific corroborative evidence to search for. These could be organization-specific or language-specific keywords. Alternatively, we might find that some keywords are causing false positives, in which case we may want to remove them. Keywords are added by navigating to the <Keyword> element for one of the three keyword lists provided with the “EU Debit Card Number” definition and adding additional <Term> elements, with the keywords as values (one <Term> per additional keyword).
<Keyword id="Keyword_card_terms_dict">
        <Term>corporate card</Term>
        <Term>organization card</Term>
        <Term>acct nbr</Term>
        <Term>acct num</Term>
        <Term>acct no</Term>

3. Confidence Modifications
We can modify the confidence with which the sensitive data type must match the criteria specified in its definition before a match is signaled and reported. This is done by modifying the confidenceLevel property on the <Entity><Pattern> element. The more evidence that a pattern requires, the more confidence you have that an actual entity (such as employee ID) has been identified when the pattern is matched. If you remove keywords from the definition, you would typically want to adjust how confident you are that this sensitive data type was found by lowering it from its default level of 85 in the case of the EU Debit Card Number type.
<Entity id="48da7072-821e-4804-9fab-72ffb48f6f78" patternsProximity="150" recommendedConfidence="85">
      <Pattern confidenceLevel="85">
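To see why shrinking the proximity window cuts false positives, consider this small, hypothetical Python sketch (my own illustration, not Microsoft code): the same keyword corroborates a hit at a proximity of 300 characters but not at 150:

```python
import re

def keyword_in_window(text, match_span, keyword, proximity):
    """True if the keyword occurs within `proximity` characters of the
    primary pattern match (mirroring patternsProximity)."""
    start, end = match_span
    window = text[max(0, start - proximity): end + proximity].lower()
    return keyword.lower() in window

text = "card terms appear here. " + "x" * 200 + " 4000123412341234"
m = re.search(r"\d{16}", text)
# at 300 characters the distant keyword still corroborates the hit;
# at 150 characters it no longer does, so the match is not signaled
print(keyword_in_window(text, m.span(), "card", 300))  # True
print(keyword_in_window(text, m.span(), "card", 150))  # False
```

In other words, tightening patternsProximity trades recall for precision: genuinely related keywords tend to sit close to the number, while incidental mentions elsewhere in a document fall outside the window.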

The following is a screenshot of our final sensitive data type definition file (with some elements collapsed):

Upload a New Sensitive Data Type

Now that we've defined our sensitive data type in the XML file structure, we are going to upload it to our Office 365 tenant.

1. At the PowerShell command prompt create a connection to the Office 365 Security & Compliance Center:
$UserCredential = Get-Credential
# the ConnectionUri below is the Security & Compliance Center PowerShell endpoint
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://ps.compliance.protection.outlook.com/powershell-liveid/ -Credential $UserCredential -Authentication Basic -AllowRedirection
Import-PSSession $Session

Note: At the start of this example we have to use the Exchange Online PowerShell module and at the end of the example we're using the newer Security and Compliance Center PowerShell module. This is intentional.

2. Create a new Classification Rule in Office 365 and upload your sensitive data type XML definition file:
New-DlpSensitiveInformationTypeRulePackage -FileData (Get-Content -Path "C:\EUDebitCardNumberEnhanced.xml" -Encoding Byte)

When the upload has completed successfully, the following output will appear in the PowerShell console:

3. Trigger a re-crawl of the content within the site collections that potentially contain the new sensitive data type.

4. Log in to Office 365 as an administrator, navigate to the Security & Compliance Center, create a new Data Loss Prevention policy and select the new sensitive data type you just created.
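Before running the upload, it can save a failed round trip to sanity-check the edited XML. This hypothetical Python sketch (my own helper, not part of the Office 365 tooling) confirms the file is well-formed and that every <Entity> id still has a matching <Resource> idRef, per the cross-reference steps earlier in this post:

```python
import xml.etree.ElementTree as ET

def rule_package_sanity_check(xml_text):
    """Parse the rule package (raises ParseError if not well-formed) and
    confirm every Entity id has a matching Resource idRef. Tag comparison
    is namespace-agnostic so it works with or without the schema namespace."""
    root = ET.fromstring(xml_text)
    local = lambda tag: tag.rsplit('}', 1)[-1]
    entity_ids = {el.get("id") for el in root.iter() if local(el.tag) == "Entity"}
    resource_refs = {el.get("idRef") for el in root.iter() if local(el.tag) == "Resource"}
    return entity_ids <= resource_refs

sample = """<RulePackage>
  <Rules>
    <Entity id="48da7072-821e-4804-9fab-72ffb48f6f78" patternsProximity="150" recommendedConfidence="85" />
    <LocalizedStrings>
      <Resource idRef="48da7072-821e-4804-9fab-72ffb48f6f78" />
    </LocalizedStrings>
  </Rules>
</RulePackage>"""
print(rule_package_sanity_check(sample))  # True when every Entity id is referenced
```

A mismatched GUID here is one of the most common reasons the upload cmdlet rejects an edited rule package.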

Full Disclosure

You may find that some of this information is reprinted from a Microsoft article titled Office 365 Information Protection for GDPR. This is because I worked with Microsoft to help write and produce that content, and I wanted to re-use some of that material to highlight another scenario. Of course, please refer to both this article and the Microsoft content to get your custom sensitive data type working correctly.

More Information

For more information, refer to the following helpful articles:


Tuesday, November 14, 2017

A Beginner's Guide to Administering Office 365 with PowerShell

With Office 365 PowerShell, you can manage Office 365 for your organization using commands and scripts that streamline your day-to-day work. Microsoft provides several easy-to-use admin centers to help manage Office 365; however, whether you're an Office 365 administrator yourself or a service owner for Office 365 in your organization (working with other administrators), you'll quickly find that you need to go beyond the capabilities those admin centers provide. PowerShell can help you automate tasks so that they are easily repeatable, script management tasks so that they run automatically on a schedule, and quickly output large amounts of data about your Office 365 environment. As well, some Office 365 settings are only manageable using PowerShell, with no UX provided. In this session, you'll learn how to get started with Office 365 PowerShell and how to quickly become productive with it, leaving you more empowered as you manage your Office 365 environment.

Thank you to those that attended my session today on this topic at SPTechCon DC! You were a great crowd with lots of great questions.

My updated slides can be found here:


Monday, November 6, 2017

Set a SharePoint Online Managed Metadata (Taxonomy) Field Value on a Document or Item

Just last week I had an interesting little project I was helping a client with, where we needed to use PowerShell to set two metadata field values for a document in a SharePoint Online document library. I've done this many times before. However, the challenge here was that the metadata fields were managed metadata fields, configured as part of a corporate taxonomy using the Managed Metadata Service.

My preferred method to accomplish this would have been the SharePoint PnP PowerShell module, using Set-PnPTaxonomyFieldValue. Unfortunately, for various reasons that wasn't possible in my client's environment, so we had to resort to writing the PowerShell script ourselves.

As many of you know, you cannot set managed metadata field values the same way you would a regular metadata field. In searching the web for some guidance on which methods to use, I found that out of the blogs and documentation available, much of it was either incomplete or incorrect. So, with the help of a few kind folks on Twitter, namely Erwin Van Hunen (@erwinvanhunen) and Chris Kent (@thechriskent), I was able to work out a solution. I'd like to share here how that was accomplished so that others might benefit from Erwin's and Chris' assistance and from my experience. I'll try to be as complete as possible here, so that you have a full solution.

An important step is to download and install the latest SharePoint Online Client Components SDK. At the time of publishing, Microsoft had recently released a new version for September 2017. The SDK must be installed on the computer that will run this script. Now on to our PowerShell code:

First, we'll set some basic variables:
$UserCredentials = Get-Credential
$webUrl = ""                             # set to the URL of the web (site) that contains your list
$listName = "MyList"
$itemToEdit = "/sites/mySiteCol/mySubsite/MyList/MyDocument.docx"
$termGroupName = "My Term Group"
$targetField1Name = "Field 1"
$targetField2Name = "Field 2"
$targetField1Value = "Value 1"
$targetField2Value = "Value 2"

Now, we set up our paths to the SharePoint Online Client Components SDK. Specifically, notice that we are using Microsoft.SharePoint.Client.Taxonomy.dll. These are the default paths where these DLLs should be installed on your computer.
$sCSOMPath = "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI"
$sCSOMRuntimePath=$sCSOMPath +  "\Microsoft.SharePoint.Client.Runtime.dll"
$sCSOMTaxonomyPath=$sCSOMPath +  "\Microsoft.SharePoint.Client.Taxonomy.dll"
$sCSOMPath=$sCSOMPath +  "\Microsoft.SharePoint.Client.dll"
Add-Type -Path $sCSOMPath
Add-Type -Path $sCSOMRuntimePath
Add-Type -Path $sCSOMTaxonomyPath

Next, we create our context and authenticate:
$context = New-Object Microsoft.SharePoint.Client.ClientContext($WebUrl)
$context.AuthenticationMode = [Microsoft.SharePoint.Client.ClientAuthenticationMode]::Default
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($UserCredentials.UserName, $UserCredentials.Password)
$context.Credentials = $credentials

Now, we retrieve our managed metadata fields from our Document Library and create our Taxonomy fields:
#get the collection of lists in our site and find our list by name
$lists = $context.Web.Lists
$list = $lists.GetByTitle($listName)
$context.Load($list)
$context.ExecuteQuery()

#get our target field objects
$field1 = $list.Fields.GetByInternalNameOrTitle($targetField1Name)
$field2 = $list.Fields.GetByInternalNameOrTitle($targetField2Name)
$context.Load($field1)
$context.Load($field2)
$context.ExecuteQuery()

#cast the fields to TaxonomyField objects
$txField1 = [Microsoft.SharePoint.Client.ClientContext].GetMethod("CastTo").MakeGenericMethod([Microsoft.SharePoint.Client.Taxonomy.TaxonomyField]).Invoke($context, $field1)
$txField2 = [Microsoft.SharePoint.Client.ClientContext].GetMethod("CastTo").MakeGenericMethod([Microsoft.SharePoint.Client.Taxonomy.TaxonomyField]).Invoke($context, $field2)

Next, we create a session with the Managed Metadata Service and retrieve the terms we wish to set. This allows us to validate that the term values we're trying to set actually do exist in the term store.
$session = [Microsoft.SharePoint.Client.Taxonomy.TaxonomySession]::GetTaxonomySession($context)
$termStores = $session.TermStores
$context.Load($termStores)
$context.ExecuteQuery()
$termStore = $termStores[0]

$groups = $termStore.Groups
$group = $groups.GetByName($termGroupName)

$termSetField1 = $group.TermSets.GetByName($targetField1Name)
$termSetField2 = $group.TermSets.GetByName($targetField2Name)

$termsField1 = $termSetField1.GetAllTerms()
$termsField2 = $termSetField2.GetAllTerms()
$context.Load($termsField1)
$context.Load($termsField2)
$context.ExecuteQuery()

#find our target term in each term set (break leaves $term1/$term2 holding the matched term)
foreach($term1 in $termsField1) {
    if($term1.Name -eq $targetField1Value) {
        Write-Host "Found our term in the termset: $($term1.Name) with id: $($term1.Id)"
        break
    }
}

foreach($term2 in $termsField2) {
    if($term2.Name -eq $targetField2Value) {
        Write-Host "Found our term in the termset: $($term2.Name) with id: $($term2.Id)"
        break
    }
}

if(($term1.Name -ne $targetField1Value) -or ($term2.Name -ne $targetField2Value)) {
    Write-Host "Missing term set values. Double-check that the values you are trying to set exist in the term store. Exiting."
    return
}

Finally, find the item we want to edit, check it out (if file checkout is required), update the taxonomy metadata fields and check the document back in:

#get all items in our list
$listItems = $list.GetItems([Microsoft.SharePoint.Client.CamlQuery]::CreateAllItemsQuery())
$context.Load($listItems)
$context.ExecuteQuery()
$itemFound = $false

foreach($item in $listItems) {
    if($item["FileRef"] -eq $itemToEdit) {
        Write-Host "Found our item..."
        $itemFound = $true

        #load the file so we can inspect its checkout state
        $context.Load($item.File)
        $context.ExecuteQuery()

        #check the item out, if it is not already checked out
        if($item.File.CheckOutType -eq "None") {
            $item.File.CheckOut()
            $context.ExecuteQuery()
            Write-Host "Item has been checked out"
        }

        #create a brand new field value and set our first managed metadata field
        #you cannot simply reuse the term object you found above in the term store
        $txField1value = New-Object Microsoft.SharePoint.Client.Taxonomy.TaxonomyFieldValue
        $txField1value.Label = $term1.Name           # the label of your term
        $txField1value.TermGuid = $term1.Id          # the guid of your term
        $txField1value.WssId = -1                    # the default value
        $txField1.SetFieldValueByValue($item, $txField1value)

        #create a brand new field value and set our second managed metadata field
        #you cannot simply reuse the term object you found above in the term store
        $txField2value = New-Object Microsoft.SharePoint.Client.Taxonomy.TaxonomyFieldValue
        $txField2value.Label = $term2.Name           # the label of your term
        $txField2value.TermGuid = $term2.Id          # the guid of your term
        $txField2value.WssId = -1                    # the default value
        $txField2.SetFieldValueByValue($item, $txField2value)

        #update the item with all changes and check the document back in
        $item.Update()
        $item.File.CheckIn("Item metadata values have been updated", [Microsoft.SharePoint.Client.CheckinType]::MajorCheckIn)
        $context.ExecuteQuery()

        Write-Host "Item has been checked in with updated metadata values"
        break
    }
}

if($itemFound) {
    Write-Host "Updating the item's metadata is complete."
}
else {
    Write-Host "Item not found: $itemToEdit"
}

I'm sure there are ways to make some of this more efficient, and of course the most efficient method of all would be to use the SharePoint PnP PowerShell module's Set-PnPTaxonomyFieldValue, if your environment allows it. However, I wanted to paint a complete picture of what's involved in building such a script if you have to create it from scratch in PowerShell yourself.


Monday, August 14, 2017

Raise Your Office 365 Secure Score

Thanks to everyone that attended our webinar last week on Office 365 Secure Score.

For many organizations moving to Office 365 or other Cloud services, the concepts of security, compliance and risk are complex. They require learning how these security concepts have changed and how they're now implemented in a Cloud-first, Mobile-friendly world. They often require working with security experts to evaluate the current state of security for the Cloud application you're concerned about… and to determine which security capabilities and features you are, and are not yet, making use of.

When we worked in on-premises server environments, things seemed almost easier in some ways, because the server farms which hosted Exchange, SharePoint, Skype for Business and so on were all within our corporate networks. They were more under our control, and we felt some level of comfort from being able to stop internet traffic at the network boundary, usually through our firewalls or gateways. Regardless of how truly secure (or not) our networks and applications in fact were, we often gained some comfort from this boundary.

With the advent of Cloud computing, with the desire to do work on a whole range of Mobile devices, even our own personal devices, and with the desire to access our services for work from anywhere in the world, moving to services which are hosted on servers and in data centers that are not under our control often feels like we’ve lost that comfort… that assurance that we’re controlling the security of our critical IT services, or it feels that we’ve given the management of our security over to someone else (that we can’t see, that we can’t talk to and that we don’t know).

When, in actuality, often services like Office 365 are more secure than we could have ever hoped to deploy in our own environments… often we have more control over how our services are secured than we’ve ever had. We often just aren’t aware yet of the security benefits that come out of box with Office 365, and we’re not aware of the security capabilities that are available for us to use.

Office 365 Secure Score is a security analytics tool from Microsoft that comes with your Office 365 subscription. It's built to help us understand and navigate all of the security options that are available to us in Office 365.

It’s a relatively new feature from Microsoft, released early this year. Its purpose is really to:
  • Help us understand our current security posture
  • Help us understand which security features we are using and not yet using
  • Help us understand the impact of rolling out new security features to our end users and administrators, and what the security benefits are to us
  • Help us understand how we can improve our security posture, and it even tracks our progress over time

My presentation slides are available here:

Please reach out and let me know if you have any questions.


Monday, July 31, 2017

SPSNYC: Office 365 Security - MacGyver, Ninja or SWAT Team

Thanks to everyone that attended my session at SharePoint Saturday NYC this past weekend. We had a great group in the room and some really good questions.

This presentation was designed to address 3 different roles that may be charged with the responsibility of managing and securing their organization's Office 365 environment:
  • MacGyver - or the IT Team Member that's self-trained, has been handed Office 365 and told to manage and secure it for the organization
  • Ninja - or the Security Expert who is formally trained, knows their stuff when it comes to information security and was given responsibility for securing their organization's Office 365 environment
  • SWAT Team - or the Information Security Team comprised of multiple security experts, with distributed roles and responsibilities

You can find the slides from my presentation here:

Please feel free to reach out to me if you have any questions at all.


Tuesday, May 16, 2017

SharePoint Virtual Summit 2017 - Share with Confidence! #SPSummit

Today Microsoft hosted one of the most highly anticipated SharePoint events:
SharePoint Virtual Summit!

Many of us have been looking forward to this event for weeks and today's event did not disappoint. I tend to focus on the security and governance capabilities when it comes to SharePoint and Office 365, and one of the lines in today's #SPSummit that struck me most was the phrase 'Share with Confidence'! Those of us that work with information every day, even those whose job it is to secure information or oversee the security of information systems, we want to share information with others. Information sharing is a key principle of any collaboration solutions like SharePoint Online. However, we want to be confident that we're sharing with the right people, under the right conditions, and that the information we share is still being protected. Some of today's SharePoint Online announcements really do help improve the Sharing experience in Office 365 so that we can Share with Confidence!

Here are some of my favorite announcements from today that I believe help us better secure our content and share it confidently with others...

Monday, April 10, 2017

Office 365 Audit Log Data - How long are my logs retained for?

I'm a big fan of the Unified Audit Log in Office 365. It's a fantastic tool for monitoring user activity for suspicious behavior, getting automated alerts when particular activities occur and investigating data breaches. I'm talking about the central logging facility within Office 365 that collects log data from many Office 365 workloads and can be searched in the Office 365 Security and Compliance Center: navigate to the Security & Compliance Center, click Search & Investigate, then click Audit Log Search.

I often get asked the question, how long are Office 365 log entries stored or retained for? There are several answers...