Review and Remove inactive guest users from Microsoft Teams through Identity Governance – Access reviews

When an Office 365 group is created, we have the option to collaborate with external partner accounts. As a result, people outside the organization can see and access the contents of public Office 365 groups once they are invited as guests.

When end users are allowed to create Office 365 groups and invite external partners to collaborate, over time the groups are left unattended without access reviews. There is a high possibility of a user still having access to sensitive documents which they no longer need.

In order to alleviate these security issues, we can leverage Microsoft Azure Identity Governance – Access reviews.

With access reviews created for Office 365 groups, we can let the group owners review the guests present in their public Office 365 groups and take the necessary action based on the requirement.

In order to create an access review, navigate to the Azure portal – Identity Governance – Access reviews – click on Access reviews – select New access review.

Now we can create the review with a name, description, start date and the frequency of how often the access review needs to take place for the Office 365 groups.

We can set the number of occurrences, the end date and the scope to guest users only, and target the groups which have guest users added. This part probably needs to be reviewed periodically so that new groups are added to the list.

Furthermore, we have the option to customize who will be the reviewers of this access review task.

Upon completion we can choose the action to apply – Remove, Approve or Take recommendations.

Finally, there are a few options present in the advanced settings. Once the customization is done as per the requirement, we can start the review.

Once the schedule is triggered as per the configuration, the reviewers get an email with the timeline.

Once the reviewer clicks on the review, they get the guest user details and the options to take action based on the business requirement.

The reviewer gets an option to type a comment and take the necessary action.

In the review results section we have an option to download the access review tasks and save them for compliance, which will help during ISO audit evaluation cycles.

It is common in most organizations for guest accounts to be given access to business-sensitive content. Ultimately it is the group owner's responsibility to review them periodically and take the necessary actions.

There is a lot more to benefit from with Identity Governance access reviews. The above method will help us evaluate Office 365 groups so that only the required individuals retain access.

Regards

Sathish Veerapandian

Microsoft Teams – Enable data loss prevention, ATP safe attachments, and retention of files and conversations

Security is considered one of the success factors for any implementation. With Office 365 Security and Compliance there are a lot of options to enforce security across the Office 365 suite of products. We can enforce DLP on Microsoft Teams based on our requirements, ATP can be turned on for all file upload activities in Microsoft Teams, and the best part is that we now have the option to set retention as low as 1 day for Microsoft Teams channel messages and chats.

Microsoft Data Loss Prevention has been protecting sensitive information across all Office 365 platforms. The easiest part is that there are many built-in templates, which makes it simple to create a policy, test it, evaluate the results and finally create one for production.

DLP Policy in Teams:

To create a dedicated DLP policy for Teams, navigate to the Security and Compliance Center – Create a new policy.

In our example we are creating a new policy which will block the sharing of PAN card numbers via Teams channels and chats.

In the locations tab, ensure that Teams chat and channel messages are selected if the location is going to be Teams only. If we need it on all locations, then we can keep them all enabled.

Under policy settings we have a lot of prebuilt templates, which makes it super simple to just select and apply one. In our case we are selecting the Indian PAN card number template so that these numbers cannot be shared via Teams channels and chat messages.
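The same policy can also be scripted from Security and Compliance PowerShell. The snippet below is a minimal sketch, not the exact policy used here: the policy and rule names are made up, and it assumes the built-in sensitive information type is named "India Permanent Account Number (PAN)" in your tenant.

# Minimal sketch: Teams-only DLP policy blocking Indian PAN numbers.
# Assumes the ExchangeOnlineManagement module is installed and the sensitive
# information type name below matches the built-in template in your tenant.
Connect-IPPSSession -UserPrincipalName admin@yourtenant.onmicrosoft.com

New-DlpCompliancePolicy -Name "Teams PAN DLP" `
    -Comment "Block sharing of PAN card numbers in Teams" `
    -TeamsLocation All -Mode Enable

New-DlpComplianceRule -Name "Block PAN in Teams chats and channels" `
    -Policy "Teams PAN DLP" `
    -ContentContainsSensitiveInformation @{Name = "India Permanent Account Number (PAN)"} `
    -BlockAccess $true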

Now we've created the Teams data loss prevention policy and it's time for us to test it.

I have logged into my test account and attempted to send a PAN card number to my account. The moment the PAN card number is shared, it is immediately blocked by the DLP policy.

From the recipient end, the following message is received and the original message is not delivered since it matches our DLP policy.

With the DLP policy we will be able to secure our sensitive information in Teams Channels and chat conversations.

Enable safe attachments on Teams channels and chats:

Enabling ATP on Teams is pretty straight forward.

We need to navigate to protection security center – threat management – policy – select safe attachments.

All we need to do is select 'Turn on ATP for SharePoint, OneDrive and Microsoft Teams'.
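The same switch can be flipped from Exchange Online PowerShell; a minimal sketch, assuming an existing session opened with Connect-ExchangeOnline:

# Minimal sketch: turn on ATP (Safe Attachments) for SharePoint, OneDrive and Teams.
# Assumes an existing Exchange Online PowerShell session.
Set-AtpPolicyForO365 -EnableATPForSPOTeamsODB $true

# Verify the current setting
Get-AtpPolicyForO365 | Format-List EnableATPForSPOTeamsODB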

Once the policy is enabled, when somebody attempts to share an infected file the file is blocked but remains in the library; however, no one will be able to open it.

Files are scanned asynchronously, through a process that uses sharing and guest activity events along with smart heuristics and threat signals to identify malicious files.

To review the quarantined files we can go to threat management – review – select view quarantined files
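The quarantined files can also be listed from Exchange Online PowerShell; a hedged sketch, assuming the SPOMalware quarantine type covers files caught by ATP for SharePoint, OneDrive and Teams:

# Minimal sketch: list files quarantined by ATP for SharePoint, OneDrive and Teams.
# Assumes an existing Exchange Online PowerShell session.
Get-QuarantineMessage -QuarantineTypes SPOMalware |
    Select-Object ReceivedTime, Type, Subject |
    Format-Table -AutoSize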

Enable Retention in Microsoft Teams channels and chat conversations:

By default, Teams conversations and files are retained forever. With the new retention policy introduced for Microsoft Teams channels and chats, admins now have the option to customize retention and delete the data permanently if it is considered a liability according to the company retention policy.

In order to create retention policies, navigate to the security center – select information governance – select retention – click create.

Here we have created a dedicated policy for Teams retention.

Now we choose the retention settings as per our requirement. The good part is that we now have the option to retain content for as little as 1 day.

Note that we need to create a new retention policy for Microsoft Teams. If we try to edit an old retention policy, there won't be an option to include Teams channel messages and chats, since these locations were only recently on-boarded into the retention policy scopes.
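An equivalent policy can be created from Security and Compliance PowerShell; a minimal sketch, with the policy and rule names invented for illustration and a 1-day delete action assumed:

# Minimal sketch: a new retention policy scoped only to Teams channel messages and chats.
# Assumes a Security and Compliance PowerShell session (Connect-IPPSSession).
New-RetentionCompliancePolicy -Name "Teams 1-Day Retention" `
    -TeamsChannelLocation All -TeamsChatLocation All -Enabled $true

# Delete Teams content after 1 day (RetentionDuration is expressed in days)
New-RetentionComplianceRule -Name "Teams 1-Day Rule" `
    -Policy "Teams 1-Day Retention" `
    -RetentionDuration 1 -RetentionComplianceAction Delete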

Once this is selected, all the Teams channel messages and chats are retained based on the retention period.

If end users delete their Teams messages, these messages are still preserved and available for search through eDiscovery for the particular period set in the policy.
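As a quick illustration, retained chats can be located with a content search; the sketch below is assumption-laden: the mailbox address is a placeholder and it assumes Teams chats are stored in the mailbox with the item class IPM.SkypeTeams.Message.

# Minimal sketch: search a mailbox for retained Teams chat messages.
# Assumes a Security and Compliance PowerShell session and that Teams chats carry
# the item class IPM.SkypeTeams.Message in the user's mailbox.
New-ComplianceSearch -Name "DeletedTeamsChats" `
    -ExchangeLocation "user@yourdomain.com" `
    -ContentMatchQuery 'itemclass:IPM.SkypeTeams.Message'

Start-ComplianceSearch -Identity "DeletedTeamsChats"
Get-ComplianceSearch -Identity "DeletedTeamsChats" | Select-Object Name, Status, Items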

In order to recover a deleted file from a channel – navigate to the channel – files tab – select open in SharePoint.

After clicking on open in SharePoint, navigate to the recycle bin where we can see the deleted file.

We have the same restore option as in SharePoint sites.

With all the new security enhancements and retention options enabled, Microsoft Teams becomes an even more convenient and better-protected communication platform for all users in an enterprise environment.

Microsoft Teams – Enforce Multifactor Authentication on guest accounts

Following the Ignite sessions last month on Microsoft Teams, we have security enhancements that can be enabled to add extra protection in any organization.

Inviting external guest users to Teams channels has been a welcome option for all of us, as it increases communication and boosts productivity. However, there are a few security guidelines that need to be followed to ensure that our data remains secure even when it is shared outside the boundary. For instance, a compromised guest account that is a member of a finance team would become a major security incident in any organization.

This article outlines the steps that can be carried out to enhance the security of Microsoft Teams guest accounts by enforcing multi-factor authentication.

Below are the steps to enforce the MFA on guest accounts:

First, create a dynamic security group and target the guest accounts.

Log in to the Azure AD tenant with admin privileges – go to Groups – create a new group – set the group type to Security – set the membership type to Dynamic User.

Now we need to add a dynamic query where the property is userType and the value is Guest.

Once done, populate the rule syntax and save it; a scripted equivalent is shown below.
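For reference, the same group can be created with the AzureAD PowerShell module; a minimal sketch, with the display name and mail nickname made up for illustration:

# Minimal sketch: dynamic security group containing all guest users.
# Assumes the AzureAD (or AzureADPreview) module and that Connect-AzureAD has already run.
New-AzureADMSGroup -DisplayName "All Guest Users" `
    -Description "Dynamic group of all B2B guest accounts" `
    -MailEnabled $false -MailNickname "allguestusers" -SecurityEnabled $true `
    -GroupTypes "DynamicMembership" `
    -MembershipRule '(user.userType -eq "Guest")' `
    -MembershipRuleProcessingState "On"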

After some time, we can see that the guest users in our Azure AD tenant have become members of this group. Since it is a dynamic query, all new guest accounts will be added automatically.

Create conditional access policy for guest accounts:

Now we need to create a conditional access policy for the Microsoft Teams guest accounts.

Navigate to enterprise applications – click on conditional access.

Now we need to target the dynamic group on this conditional access policy.

Under cloud apps select Microsoft Teams; it is also better to select SharePoint Online, which will enforce MFA for these guest users in SharePoint as well.

Under conditions we are selecting only the locations. This can be further adjusted based on the business prerequisites.

In the access controls we are selecting only 'Require multi-factor authentication' and the IT policy (terms of use); a scripted equivalent is shown below.
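A conditional access policy along these lines can also be created with the AzureADPreview module. The sketch below is only an approximation of the portal steps: the group object ID is a placeholder, the app ID is assumed to be the first-party Microsoft Teams application, and the terms-of-use control is omitted.

# Minimal sketch: require MFA for the guest group when accessing Microsoft Teams.
# Assumes the AzureADPreview module and Connect-AzureAD; replace the group ID placeholder.
$conditions = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessConditionSet
$conditions.Applications = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessApplicationCondition
$conditions.Applications.IncludeApplications = "cc15fd57-2c6c-4117-a88c-83b1d56b4bbe"   # assumed Microsoft Teams app ID
$conditions.Users = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessUserCondition
$conditions.Users.IncludeGroups = "<object-id-of-the-dynamic-guest-group>"

$grantControls = New-Object -TypeName Microsoft.Open.MSGraph.Model.ConditionalAccessGrantControls
$grantControls._Operator = "OR"
$grantControls.BuiltInControls = "mfa"

New-AzureADMSConditionalAccessPolicy -DisplayName "Require MFA for Teams guests" `
    -State "Enabled" -Conditions $conditions -GrantControls $grantControls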

Now we have MFA enforced on the guest accounts, and we will see the effect of this configuration from the invited user's side.

Experience of the guest users enforced with MFA:

In order to simulate this behavior, we are adding one guest user to a Teams channel.

After that, the invited user receives a welcome email, which is the usual behavior for any invited Azure AD guest user account.

When clicking the login link, the user will be prompted to register and enroll in MFA.

The user will be prompted to enter their mobile number in the invited tenant for MFA and needs to complete the initial authentication process.

If we have enabled the IT policy, the user will be prompted to read and accept it.

Finally, the user is logged in with the guest account and able to participate in the invited team through a secure authentication flow.

With these very minimal steps through conditional access, we achieve overall better security for Microsoft Teams.

Use Azure Automation accounts, runbooks and schedules to automatically start and stop VMs running in Azure

Using this article, we can start/stop VMs during off-business hours. This greatly benefits customers, especially in cost optimization, and removes the overhead of performing this action manually. But we need to make sure that the VMs we select are present in the same subscription where the automation account and this schedule are created, selecting only the required VMs and excluding the others.

Login to Azure portal

Go to All Services, type Automation Account and create an Automation Account.

Under Process Automation, click Runbooks to open the list of runbooks.

Click on the + Create a runbook button to create a new runbook.

Provide a Name and set the Runbook Type to PowerShell for the new runbook, then click the Create button.

The runbook is empty; we must deploy the code, then Save and Publish.

Note: You can find many PowerShell scripts for this task in the GitHub and TechNet galleries. You can use the one below as well, which is taken from GitHub.

#PowerShell Script to Start and Stop VM's in Azure on Scheduled Intervals using Azure Automation Account#
Param(
[Parameter(Mandatory=$true,HelpMessage="Enter the value for Action. Values can be either stop or start")][String]$Action,
[Parameter(Mandatory=$false,HelpMessage="Enter the value for WhatIf. Values can be either true or false")][bool]$WhatIf = $false,
[Parameter(Mandatory=$false,HelpMessage="Enter the VMs separated by comma(,)")][string]$VMList
)

function ValidateVMList ($FilterVMList)
{
[boolean] $ISexists = $false
[string[]] $invalidvm=@()
$ExAzureVMList=@()

foreach($filtervm in $FilterVMList) 
{
    $currentVM = Get-AzureRmVM | where Name -Like $filtervm.Trim()  -ErrorAction SilentlyContinue

    if ($currentVM.Count -ge 1)
    {
        $ExAzureVMList+= @{Name = $currentVM.Name; Location = $currentVM.Location; ResourceGroupName = $currentVM.ResourceGroupName; Type = "ResourceManager"}
        $ISexists = $true
    }
    elseif($ISexists -eq $false)
    {
        $invalidvm = $invalidvm+$filtervm                
    }
}

if($invalidvm -ne $null)
{
    Write-Output "Runbook Execution Stopped! Invalid VM Name(s) in the VM list: $($invalidvm) "
    Write-Warning "Runbook Execution Stopped! Invalid VM Name(s) in the VM list: $($invalidvm) "
    exit
}
else
{
    return $ExAzureVMList
}
}

function CheckExcludeVM ($FilterVMList)
{
[boolean] $ISexists = $false
[string[]] $invalidvm=@()
$ExAzureVMList=@()
foreach($filtervm in $FilterVMList) 
{
    $currentVM = Get-AzureRmVM | where Name -Like $filtervm.Trim()  -ErrorAction SilentlyContinue
    if ($currentVM.Count -ge 1)
    {
        $ExAzureVMList+=$currentVM.Name
        $ISexists = $true
    }
    elseif($ISexists -eq $false)
    {
        $invalidvm = $invalidvm+$filtervm
    }

}
if($invalidvm -ne $null)
{
    Write-Output "Runbook Execution Stopped! Invalid VM Name(s) in the exclude list: $($invalidvm) "
    Write-Warning "Runbook Execution Stopped! Invalid VM Name(s) in the exclude list: $($invalidvm) "
    exit
}
else
{
    Write-Output "Exclude VM's validation completed..."
}
}
#----- L O G I N - A U T H E N T I C A T I O N -----
$connectionName = "AzureRunAsConnection"
try
{
# Get the connection "AzureRunAsConnection "
$servicePrincipalConnection=Get-AutomationConnection -Name $connectionName
<pre><code>"Logging in to Azure..."
Add-AzureRmAccount `
    -ServicePrincipal `
    -TenantId $servicePrincipalConnection.TenantId `
    -ApplicationId $servicePrincipalConnection.ApplicationId `
    -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
}
catch
{
if (!$servicePrincipalConnection)
{
$ErrorMessage = "Connection $connectionName not found."
throw $ErrorMessage
} else{
Write-Error -Message $_.Exception
throw $_.Exception
}
}
#--------- Read all the input variables ---------------
$StartResourceGroupNames = Get-AutomationVariable -Name 'RG-Of-VMs-To-Start'
$StopResourceGroupNames = Get-AutomationVariable -Name 'RG-Of-VMs-To-Stop'
$ExcludeVMNames = Get-AutomationVariable -Name 'Excluded-List-Of-VMs'

try
{
$Action = $Action.Trim().ToLower()
    if(!($Action -eq "start" -or $Action -eq "stop"))
    {
        Write-Output "`$Action parameter value is : $($Action). Value should be either start or stop."
        Write-Output "Completed the runbook execution..."
        exit
    }            
    Write-Output "Runbook Execution Started..."
    [string[]] $VMfilterList = $ExcludeVMNames -split ","
    #If user gives the VM list with comma seperated....
    $AzurVMlist = Get-AutomationVariable -Name $VMList
    if(($AzurVMlist))
    {
        Write-Output "list of  all VMs = $($AzurVMlist)"
        [string[]] $AzVMList = $AzurVMlist -split ","        
    }
    if($Action -eq "stop")
    {
        Write-Output "List of  all ResourceGroups = $($StopResourceGroupNames)"
        [string[]] $VMRGList = $StopResourceGroupNames -split ","
    }

    if($Action -eq "start")
    {
        Write-Output "List of  all ResourceGroups = $($StopResourceGroupNames)"
        [string[]] $VMRGList = $StartResourceGroupNames -split ","
    }

    #Validate the Exclude List VM's and stop the execution if the list contains any invalid VM
    if (([string]::IsNullOrEmpty($ExcludeVMNames) -ne $true) -and ($ExcludeVMNames -ne "none"))
    {
        Write-Output "Values exist on the VM's Exclude list. Checking resources against this list..."
        CheckExcludeVM -FilterVMList $VMfilterList 
    } 
    $AzureVMListTemp = $null
    $AzureVMList=@()

    if ($AzVMList -ne $null)
    {
        ##Action to be taken based on VM List and not on Resource group.
        ##Validating the VM List.
        Write-Output "VM List is given to take action (Exclude list will be ignored)..."
        $AzureVMList = ValidateVMList -FilterVMList $AzVMList 
    } 
    else
    {

        ##Getting VM Details based on RG List or Subscription
        if (($VMRGList -ne $null) -and ($VMRGList -ne "*"))
        {
            foreach($Resource in $VMRGList)
            {
                Write-Output "Validating the resource group name ($($Resource))"  
                $checkRGname = Get-AzureRmResourceGroup -Name $Resource.Trim() -ev notPresent -ea 0  

                if ($checkRGname -eq $null)
                {
                    Write-Warning "$($Resource) is not a valid Resource Group Name. Please verify your input"
                }
                else
                {    
                    # Get resource manager VM resources in group and record target state for each in table
                    $taggedRMVMs =  Get-AzureRmVM | ? { $_.ResourceGroupName -eq $Resource} 
                     foreach($vmResource in $taggedRMVMs)
                    {
                        if ($vmResource.ResourceGroupName -Like $Resource)
                        {
                            $AzureVMList += @{Name = $vmResource.Name; ResourceGroupName = $vmResource.ResourceGroupName; Type = "ResourceManager"}
                        }
                    }
                }
            }
        } 
        else
        {
            Write-Output "Getting all the VM's from the subscription..."  
            $ResourceGroups = Get-AzureRmResourceGroup 

            foreach ($ResourceGroup in $ResourceGroups)
            {    

                # Get resource manager VM resources in group and record target state for each in table
                $taggedRMVMs = Get-AzureRmVM | ? { $_.ResourceGroupName -eq $ResourceGroup.ResourceGroupName}

                foreach($vmResource in $taggedRMVMs)
                {
                    Write-Output "RG : $($vmResource.ResourceGroupName) , ARM VM $($vmResource.Name)"
                    $AzureVMList += @{Name = $vmResource.Name; ResourceGroupName = $vmResource.ResourceGroupName; Type = "ResourceManager"}
                }
            }

        }
    }

    $ActualAzureVMList=@()

    if($AzVMList -ne $null)
    {
        $ActualAzureVMList = $AzureVMList
    }
    #Check if exclude vm list has wildcard       
    elseif(($VMfilterList -ne $null) -and ($VMfilterList -ne "none"))
    {
        $ExAzureVMList = ValidateVMList -FilterVMList $VMfilterList 

        foreach($VM in $AzureVMList)
        {  
            ##Checking Vm in excluded list                         
            if($ExAzureVMList.Name -notcontains ($($VM.Name)))
            {
                $ActualAzureVMList+=$VM
            }
        }
    }
    else
    {
        $ActualAzureVMList = $AzureVMList
    }

    Write-Output "The current action is $($Action)"

    $ActualVMListOutput=@()

    if($WhatIf -eq $false)
    {   
        $AzureVMListARM=@()


        # Store the ARM VM's
        $AzureVMListARM = $ActualAzureVMList | Where-Object {$_.Type -eq "ResourceManager"}


        # process the ARM VM's
        if($AzureVMListARM -ne $null)
        {
            foreach($VM in $AzureVMListARM)
            {  
                $ActualVMListOutput = $ActualVMListOutput + $VM.Name + " "
                #ScheduleSnoozeAction -VMObject $VM -Action $Action

                #------------------------
                try
                {          
                    Write-Output "VM action is : $($Action)"
                    Write-OutPut $VM.ResourceGroupName

                    $VMState = Get-AzureRmVM -ResourceGroupName $VM.ResourceGroupName -Name $VM.Name -status | Where-Object { $_.Name -eq  $VM.Name }
                    if ($Action.Trim().ToLower() -eq "stop")
                    {
                        Write-Output "Stopping the VM : $($VM.Name)"
                        Write-Output "Resource Group is : $($VM.ResourceGroupName)"


                        if($VMState.Statuses[1].DisplayStatus -ne "VM running")
                        {
                            Write-Output "VM is not in running state, No actions performed"
                        }
                        else
                        {
                            $Status = Stop-AzureRmVM -Name $VM.Name -ResourceGroupName $VM.ResourceGroupName -Force

                            if($Status -eq $null)
                            {
                                Write-Output "Error occured while stopping the Virtual Machine."
                            }
                            else
                            {
                                Write-Output "Successfully stopped the VM $($VM.Name)"
                            }            
                        }

                    }
                    elseif($Action.Trim().ToLower() -eq "start")
                    {
                        Write-Output "Starting the VM : $($VM.Name)"
                        Write-Output "Resource Group is : $($VM.ResourceGroupName)"

                        if($VMState.Statuses[1].DisplayStatus -eq "VM running")
                        {
                            Write-Output "VM already Running, No actions performed"
                        }
                        else
                        {
                            $Status = Start-AzureRmVM -Name $VM.Name -ResourceGroupName $VM.ResourceGroupName

                            if($Status -eq $null)
                            {
                                Write-Output "Error occured while starting the Virtual Machine $($VM.Name)"
                            }
                            else
                            {
                                Write-Output "Successfully started the VM $($VM.Name)"
                            }
                        }
                    }      

                }
                catch
                {
                    Write-Output "Error Occurred..."
                    Write-Output $_.Exception
                }
                #--------------------------
            }
            Write-Output "Completed the $($Action) action on the following ARM VMs: $($ActualVMListOutput)"
        }


    }
    elseif($WhatIf -eq $true)
    {
        Write-Output "WhatIf parameter is set to True..."
        Write-Output "When 'WhatIf' is set to TRUE, runbook provides a list of Azure Resources (e.g. VMs), that will be impacted if you choose to deploy this runbook."
        Write-Output "No action will be taken at this time..."
        Write-Output $($ActualAzureVMList) 
    }
    Write-Output "Runbook Execution Completed..."
}
catch
{
    $ex = $_.Exception
    Write-Output $_.Exception
}

Now in the Automation Account, under Shared Resources click Variables and add these variables. [Note: while creating the variables you have to provide some value; later you can delete the value if required.]


Explanation of these variables-

RG-Of-VMs-To-Start: ResourceGroup that contains VMs to be started. Must be in the same subscription that the Azure Automation Run-As account has permission to manage.

RG-Of-VMs-To-Stop: ResourceGroup that contains VMs to be stopped. Must be in the same subscription that the Azure Automation Run-As account has permission to manage.

Excluded-List-Of-VMs: VM names to be excluded from being started/Stopped

Note: These Variables are defined in the PowerShell Code and should not be changed if used by default.

VMstoStartStop: Provide the list of the VMs to be started or stopped.
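The same variables can also be created from PowerShell instead of the portal; a minimal sketch using the AzureRM.Automation module (matching the AzureRM cmdlets used in the runbook above), with the resource group, account name and values as placeholders:

# Minimal sketch: create the automation variables consumed by the runbook.
# Resource group, automation account and variable values below are placeholders.
$rg   = "MyAutomationRG"
$acct = "MyAutomationAccount"

New-AzureRmAutomationVariable -ResourceGroupName $rg -AutomationAccountName $acct `
    -Name "RG-Of-VMs-To-Start" -Value "MyVmResourceGroup" -Encrypted $false
New-AzureRmAutomationVariable -ResourceGroupName $rg -AutomationAccountName $acct `
    -Name "RG-Of-VMs-To-Stop" -Value "MyVmResourceGroup" -Encrypted $false
New-AzureRmAutomationVariable -ResourceGroupName $rg -AutomationAccountName $acct `
    -Name "Excluded-List-Of-VMs" -Value "none" -Encrypted $false
New-AzureRmAutomationVariable -ResourceGroupName $rg -AutomationAccountName $acct `
    -Name "VMstoStartStop" -Value "VM01,VM02" -Encrypted $false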

Now we will create schedules for the start and stop of the VMs. Under Shared Resources, click on Schedules, then click + Add a schedule, provide a name, time and recurrence frequency for the schedule, and click Create.

Similarly, create a schedule for Stop_VM.

Now we will attach these schedules to the runbook. Go to Runbooks, open the runbook, click Schedules and then + Add a schedule.

Now link Start_VM and provide the parameter values. ACTION can have the value start or stop. In VMLIST provide the variable name "VMstoStartStop" which contains the VM names. Click Create.

Similarly, attach Stop_VM and provide the Action value stop and the VMList value VMstoStartStop.
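Linking the schedules and passing the runbook parameters can likewise be scripted; a hedged sketch with placeholder names, assuming the schedules Start_VM and Stop_VM already exist:

# Minimal sketch: link the Start_VM / Stop_VM schedules to the runbook with parameters.
# Resource group, automation account and runbook names are placeholders.
Register-AzureRmAutomationScheduledRunbook -ResourceGroupName "MyAutomationRG" `
    -AutomationAccountName "MyAutomationAccount" -RunbookName "StartStopVMRunbook" `
    -ScheduleName "Start_VM" -Parameters @{ Action = "start"; VMList = "VMstoStartStop" }

Register-AzureRmAutomationScheduledRunbook -ResourceGroupName "MyAutomationRG" `
    -AutomationAccountName "MyAutomationAccount" -RunbookName "StartStopVMRunbook" `
    -ScheduleName "Stop_VM" -Parameters @{ Action = "stop"; VMList = "VMstoStartStop" }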

So now we have two schedules, Start_VM and Stop_VM, which will run at the defined times.

Now the runbook will start and stop the VMs as per the attached schedules. The results of the runbook execution can be seen under Process Automation –> Jobs.

This greatly reduces the hassle of maintaining and performing this task manually on the admin side.

Extend local AD extension attributes to Azure AD in a non-hybrid, Exchange Online-only environment

There might be a scenario where the environment has Azure AD users synced from the local Active Directory, while the mailboxes were created directly in Exchange Online with no hybrid configuration from the beginning, as is typically the case for new businesses.

Usually, developers consume the local AD extension attributes to customize the login experience for different business units in their applications, and this is usually fine for fully on-premises environments.

If we have Exchange installed in the environment, the Active Directory schema will have been extended to include the user extension attributes in the Exchange mailbox properties.

There is another option of using the Exchange Server install media to extend only the local Active Directory schema. Usually this option is not recommended. Doing this would add Exchange attributes to the local Active Directory; these attributes could then be set, and Azure AD sync could be configured to sync them to Office 365. This option requires much testing, and there is always risk associated with AD schema changes.

Even in a hybrid setup these values get populated in Exchange Online via the Exchange hybrid configuration for all users.

In the third scenario we do not have an Exchange hybrid, and the developer is using Azure AD via the Graph API and expecting these values in Azure AD for the customization. In this case we have a better option of extending these attributes through Azure AD Connect, by running it again and selecting only the required AD extension attributes.

Launch Azure AD Connect, sign in with global admin credentials and select customize synchronization options.

Select directory extension attribute sync.

Here we will have the option to choose the local Active Directory attributes. In our case we are selecting the two attributes extensionAttribute7 and extensionAttribute8.

Once done go ahead and click on configure.

Usually this step alone should be sufficient, but in our case we also performed a directory schema refresh.

Selected the directory for refresh.

Then we went to the local Active Directory and populated extensionAttribute8 for one user.

Once the sync is completed, we can verify whether the value is populated in Azure AD via Graph Explorer.

Log in to Graph Explorer from the below URL.

https://developer.microsoft.com/en-us/graph/graph-explorer

We can log in with any valid credentials from the tenant.

We will be asked for admin consent, which needs to be granted based on the requirement.

Run the below query.

https://graph.microsoft.com/v1.0/me?$select=mail,jobTitle,companyName,onPremisesExtensionAttributes

This reads the on-premises attributes (mail, jobTitle, companyName and onPremisesExtensionAttributes) using the Graph API. You should see extensionAttribute8 under onPremisesExtensionAttributes, which is the attribute currently being used.
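The same query can be run outside Graph Explorer as well; a minimal sketch, assuming $accessToken already holds a valid Microsoft Graph bearer token for the signed-in user:

# Minimal sketch: read the on-premises extension attributes of the signed-in user.
# Assumes $accessToken already contains a valid Microsoft Graph access token.
$uri = 'https://graph.microsoft.com/v1.0/me?$select=mail,jobTitle,companyName,onPremisesExtensionAttributes'
$me  = Invoke-RestMethod -Uri $uri -Headers @{ Authorization = "Bearer $accessToken" }

# The synced value should surface here once Azure AD Connect has exported it
$me.onPremisesExtensionAttributes.extensionAttribute8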

In our case we can see the extensionAttribute8 value, which has been synced and is now available in Azure AD.

Using the directory extension option in Azure AD Connect achieves this task in a much simpler way.

Thanks & Regards

Sathish Veerapandian

Loads of exciting new features announced for Microsoft Teams at Ignite 2019

With the Microsoft Ignite sessions that happened last week, lots of new end-user functionalities, meeting room enhancements and better administration facilities were announced for Microsoft Teams. Below is a summary of the features.

Watch out for more in the Ignite session videos.

End user functionalities –
1) Ability to create private channels – Secure private channels can be created and shared only with a few audiences. This eliminates the need to create multiple teams for secure communication. We can further restrict private channel creation and visibility from the admin center.
2) Multi-window experience in chats – Ability to chat with multiple people at the same time and switch between windows, which was a much requested feature on UserVoice.
3) New Tasks experience in Teams – Helps track tasks better, with great options to view the stats in charts, schedule, boards and filters.
4) Yammer app in Teams – Allows jumping into Yammer communities. Beneficial especially in larger organizations, and useful for employees to join and collaborate in bigger communities and keep up to date on new content.
5) Outlook add-in for Teams – With the new add-in it is easier to share the content of an email with its full context, body and attachments. Sharing from channels has also become seamless.
6) Background blur to the next level – We can customize the background with our own images and change the background experience, for example to appear as if sitting on a beach or in a hotel.
7) Turn on live captions – Makes it easier to follow Teams meetings. This is a live voice-to-text translation and helps especially in broadcast meetings as well.

Lots of innovations in the meeting experience –

1) New compact devices – Newly launched Yealink and Polycom collaboration bars suitable for smaller huddle spaces. They come with a handy remote control with which we can join a meeting without the need for touch panels, simply by mounting them on a normal LED TV.

2) Cloud video interoperability with Cisco – Cisco Webex video devices and Cisco SIP conferencing video devices can connect to Microsoft Teams meeting services. The Cisco interop service will be classified as a Teams cloud video interop solution. This helps customers with a Cisco partnership utilize Cisco devices with Teams.

3) Cisco/Zoom web-based interoperability – Interop meeting room devices with direct guest join enable users to join Teams, Cisco or Zoom meetings from the web interfaces, and this experience will be seamless to end users on these devices.

4) Managed meeting rooms – A managed service from Microsoft that monitors meeting rooms and provides advanced insights.

New IT professional capabilities –

1) Easier deployment of the Teams workload –

Advisor for Teams helps with easier deployment and customized plans for choosing what to migrate first – chats, channel conversations or meetings. New coexistence modes were added to support better coexistence and transition with Skype for Business Enterprise Voice functionalities.

2) ATP is now available in Teams – With ATP enabled, Teams does a time-of-click verification for links sent in chat conversations, and if it finds anything phishy it blocks them, just as happens for email links.

3) eDiscovery available for Microsoft Teams – We can submit Teams content for eDiscovery.

4) Teams audit logs – With Teams audit logs we can see whether a message was deleted or edited.

5) Information barriers – Ability to block communication between critical users from the admin side. The same capability applies to file sharing between them in Teams.

6) Retention policies – In Microsoft Teams we can now set up retention policies as low as 1 day.

7) Administration for Microsoft Teams –

8) PowerShell – Bulk updates via security groups are possible with the new cmdlets, now as a one-liner.

9) Hub for teamwork – A certified app catalog is available in the Teams admin center, and further iteration can be made on this app catalog.

10) Manage Microsoft Teams Rooms – From the Teams admin center we have the capability to manage Microsoft Teams meeting room devices, including restarting the devices and troubleshooting them from the admin center.


11) First-line workers in Microsoft Teams –
Time Clock and Shifts in Teams help manage first-line workers efficiently and track them easily. With the Graph APIs we can integrate our workforce management systems.

12) Delegated user management – First-line worker managers have the capability to reset/block user accounts.

New identity and access capabilities for first-line workers –
SMS sign-in
Global sign-out
Off-shift access

Configure SendGrid in Microsoft Azure for email campaigns and SMTP relay

With Microsoft Azure and SendGrid, sending email campaigns for the organization becomes a lot simpler, and the SMTP relay configuration on applications will be hassle-free and more secure for developers. We can have up to two SendGrid subscriptions on every Azure account. SendGrid gives a lot of flexibility: the application sending messages can use either the web API or the normal SMTP relay configuration.

This article outlines the steps carried out to create a SendGrid account in Microsoft Azure.
Log in to the Azure portal – search for SendGrid and create a SendGrid account.

We must select the pricing tier. The good thing is that we get the F1 free tier with an Azure subscription, which offers 25,000 emails per month along with custom API integration and an advanced tracking mechanism.

Once created, we must run through a few initial configuration steps.

Now once the account is created, we need to authenticate our domain so that SendGrid can send emails on behalf of our registered domain.

We need to add the CNAME records in our DNS portal.

After entering the domain, we have options to use automated security (which will rotate the DKIM keys for our domain), a custom return path, and a custom DKIM selector.

Create the associated CNAME records for SMTP and DKIM on our public DNS.

Once the records are published, our domain validation will be successful.

Upon successful verification, navigate to the setup option and choose the first option to configure the Web API or SMTP relay. If we are going with the latter option, we just need to generate the API key and use it in the PHP file or the API, depending on the workload of the website which requires this service.

Now we have two options for setup: the Web API or SMTP relay.

Once the integration below is completed, we get the option to use the API key with either the Web API or the regular SMTP relay in the application.
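As an illustration of the SMTP relay option, the snippet below sends a test message through SendGrid from PowerShell; the API key, sender and recipient are placeholders, and the literal user name "apikey" with smtp.sendgrid.net on port 587 follows SendGrid's documented relay settings:

# Minimal sketch: send a test mail through the SendGrid SMTP relay.
# The API key, from and to addresses below are placeholders.
$apiKey   = ConvertTo-SecureString "SG.xxxxxxxxxxxxxxxxxxxx" -AsPlainText -Force
$smtpCred = New-Object System.Management.Automation.PSCredential ("apikey", $apiKey)

Send-MailMessage -SmtpServer "smtp.sendgrid.net" -Port 587 -UseSsl `
    -Credential $smtpCred `
    -From "noreply@yourdomain.com" -To "recipient@example.com" `
    -Subject "SendGrid SMTP relay test" `
    -Body "Test email sent through the SendGrid SMTP relay."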

One of the best things is that we have the option to create multiple API keys. This is ideal for developers, since each of them can use their own API key, which will be tracked and used only by them.

We also have multiple options to further refine the permission levels while creating the API keys. Once an API key is created it is displayed only once in the portal and can't be seen again. This is for security reasons, and it must be copied and shared with the application developer who will be using this API key to send emails.

Plenty of options are available for email tracking with SendGrid, such as who has opened, clicked or unsubscribed, which emails bounced, and all of the other actions shown below.

The below options are present in Suppressions.

There is a design library and template section which is very useful for creating email campaigns.

We have decent options to create a well-drafted marketing email. There is a sufficient number of modules that can be used for a perfect email campaign.

Before sending the email to all audiences, we have the option to send a test to a few recipients; below is the test email received from SendGrid.

There are a few templates available in the design libraries which can be utilized for creating new marketing templates.

We have full and partial HTML options and can choose the best HTML based on our experience.

The statistics overview gives detailed information on the email campaign delivery and customer interests.

In the activity view we can see detailed information on the recent mass campaign sent through SendGrid.

On an attempt to send an email from an unverified SendGrid account, we get an immediate bounce back.

Spam reports are triggered when a user who receives the email marks it as spam or places the email in their spam folder within their email client (Gmail, Yahoo or other service providers).

With Microsoft Azure and very few clicks, we can give organizations a fully capable email campaign and modern SMTP relay solution.
This avoids the major effort of creating a dedicated server on the on-premises network, creating allow lists, configuring permissions, performing timely updates, and securing and maintaining it. We need to ensure that the SendGrid SPF and DKIM records are populated in the DNS portal to align with the email authentication policy.

Thanks & Regards

Sathish Veerapandian

Analyze Office 365 adoption with Microsoft 365 usage analytics

The Office 365 adoption preview provides insights into the Office 365 utilization trends for the whole organization. This helps the organization identify the departments which need training and the places where Office 365 adoption is a real success.

With Microsoft 365 usage analytics integrated with Power BI, we get much more visibility into how Office 365 is being utilized. It is a pre-built content pack and we do not need to create any customization to get the reports.

This content pack is free of charge and works well with the Power BI free service, and we can customize the dashboards and reports. We do not need a Power BI Pro or Premium license to utilize this service. Once we connect this content pack it can be shared with anybody. However, if the user attempts to share or export the report, a Power BI Pro license is required; for viewing only, a Power BI free license is sufficient.

The moment we connect the data pack it provides the data for the last 12 months. Afterwards it refreshes weekly, and we have an option to customize the refresh schedule.

Below are the steps to enable the Microsoft 365 usage analytics:

Sign in to the Office 365 admin center with global admin privileges – navigate to reports – click on usage.

Enable the option make data available to Power BI.

Now log in to Power BI – navigate to service content packs – select Office 365 adoption preview.

Now we need to use the tenant ID and connect to the service.

Now we will have the Office 365 adoption preview connected to the associated workspace.

Once done, we get visibility into the utilization of all the services. For example, we have the adoption overview for Teams.

Exchange Online utilization

As of now we have reports that can be pulled and customized in Power BI for Exchange, Skype for Business, Teams, Yammer, OneDrive, SharePoint, and adoption by department, product and region, as well as Yammer usage.

Script to generate Office 365 groups created in the last 30 days

By default, users are able to create Office 365 groups. Some organizations do not want to restrict this group creation, because these groups heavily drive the utilization of Office 365 services such as SharePoint, Yammer, Microsoft Teams, Power BI, Outlook, Planner and Roadmap, and restricting them might decline the Office 365 user adoption rate.

The below script can be run from the task scheduler on a monthly basis to review the Office 365 groups which have been created in the last 30 days, and it will email us the report.

Below is a sample output of the script, which provides the details listed.


########################################################################################################################
# Description   :- Powershell Script To extract office365 groups created less than 30 days time and send them in email
# Author        :- Sathish Veerapandian
# Created       :- 15-Jul-2019
# Updated       :- 15-Oct-2019
# Version       :- 0.2
# Notes         :- 
#########################################################################################################################

$Header = @"

TABLE {border-width: 1px; border-style: solid; border-color: black; border-collapse: collapse;}
TH {border-width: 1px; padding: 3px; border-style: solid; border-color: black; background-color: #6495ED;}
TD {border-width: 1px; padding: 3px; border-style: solid; border-color: black;}

"@

# Load MFA Module
$MFAExchangeModule = ((Get-ChildItem -Path $($env:LOCALAPPDATA+"\Apps\2.0\") -Filter CreateExoPSSession.ps1 -Recurse).FullName | Select-Object -Last 1)
. "$MFAExchangeModule"

# Initiate Session
Connect-EXOPSSession -UserPrincipalname mentionadminid@domain.com

# Get Office365 Groups
Get-UnifiedGroup -ResultSize unlimited | select DisplayName,PrimarySMtpAddress,WhenCreated,ManagedBy,RecipientTypeDetails,AccessType | Export-Csv C:\Scripts\365groups.csv -NoTypeInformation

# Define the date variable for Cutoffdate less than 31 days
$CutoffDate = get-date -date $(get-date).adddays(-31) -format "M/dd/yyyy h:mm:ss tt"

# Get the office365 groups created lesser than 31 days
$Data = Import-CSV "C:\Scripts\365groups.csv" | Where-Object {$_.WhenCreated -as [datetime] -gt $CutoffDate}


# Convert the Office 365 groups created in the last 31 days to an HTML report
$data | ConvertTo-Html -Head $Header | Out-File -FilePath C:\Scripts\365.html


# Email the HTML report to the helpdesk team for evaluation
Send-MailMessage -From senderemailID -To recipientemailid -Attachments "C:\Scripts\365.html" -BodyAsHtml -SmtpServer mentionsmtpserver -Subject Office365GroupStatus 

Thanks

Sathish Veerapandian

Configure Exchange Online to reject emails that fail DMARC validation for domains with a policy of reject

By default, Office 365 DMARC validation for internet emails that fail a policy of p=reject will make the email land in the junk folder of the recipient mailbox. Microsoft 365 treats DMARC policies of quarantine and reject in the same way, which means that if the sender's DMARC policy is set to reject or quarantine, the emails that fail DMARC will be sent to the junk folder of the recipient mailbox.

Microsoft's reasoning is that any legitimate email which misses DMARC alignment shouldn't be lost, and that it is better to deliver it to the recipient's junk mail folder. There are, however, cases where organizations still need the DMARC policy to be stringent due to their security regulations.

For a domain whose DMARC TXT record has a policy of p=reject, Office 365 validates DMARC and overrides the failure with the header value oreject (override reject). Instead of deleting or rejecting the message, Office 365 marks the message as spam.

To test this further, we are publishing SPF, DKIM and DMARC records for the domain ezcloudinfo.com as below:

SPF record: adding only Exchange Online as an authorized sender.

DKIM record: having the signing key only for Office 365.

DMARC record: having the strict policy of p=reject.

For a successful email from a legitimate sender where it has passed spf, dkim & dmarc we see the below value for DMARC.

dmarc=pass action=none

Now we are sending an email from a registered Mailchimp account for ezcloudinfo.com, where we have not added Mailchimp to the SPF and DKIM records in our DNS.

The email from Mailchimp with the sender address sathish@ezcloudinfo.com lands in the junk email folder.

We can see in the header of the above email that the DMARC validation has failed.

Workaround:

We received a workaround from a Red Sift cyber security analysis which can be used to reject the emails that fail DMARC validation.

Create a transport rule:

In the message header includes condition, match the values oreject, action=oreject or dmarc=fail (these appear in the Authentication-Results header).

Reject the message with a custom status code; a scripted example is shown below.
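The same transport rule can be created from Exchange Online PowerShell; a hedged sketch, assuming the failure stamp appears in the Authentication-Results header and using a placeholder rejection text:

# Minimal sketch: reject messages whose DMARC check failed and was overridden (oreject).
# Assumes an Exchange Online PowerShell session; adjust the header words to match
# the exact stamp observed in your message headers.
New-TransportRule -Name "Reject DMARC failures for p=reject senders" `
    -HeaderContainsMessageHeader "Authentication-Results" `
    -HeaderContainsWords "dmarc=fail action=oreject" `
    -RejectMessageReasonText "Message rejected because it failed DMARC validation" `
    -RejectMessageEnhancedStatusCode "5.7.1"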

Now if we send a test email from an unauthorized sender after creating this transport rule, the email will be rejected and we can see the below NDR message.

So after this transport rule, any spoofed email coming from a domain that is DMARC protected will not be delivered to the spam folder. It will be rejected and never reach the recipient.

Thanks & Regards

Sathish Veerapandian
