Category Archives: Azure

Migrate an on-premises SQL DB to Azure SQL Database

The Azure data platform provides Azure SQL Database, a fully managed relational database-as-a-service (PaaS). This lets developers build their apps quickly and removes the overhead of database administration.

There are a few methods to migrate an on-premises SQL database to Azure SQL Database, and in this article we will look at two options.

1) Using BACPAC export and import.

2) Data Migration Assistant.

Using BACPAC export and import:

With BACPAC export and import, we first export the database from the on-premises SQL instance as a data-tier application.

To export – open SQL Server Management Studio – right-click the desired database – click Tasks – select Export Data-tier Application.

Now we need to save the export in .bacpac format.

Once the operation completes, the exported BACPAC file is ready for import.

This BACPAC file now needs to be imported into Azure SQL Database. Connect to the Azure SQL Database server from SQL Server Management Studio.

Once connected, right-click the Databases folder and select Import Data-tier Application.

Select the exported BACPAC file from the local disk and specify the new database name. Here we also need to choose the Microsoft Azure edition, the maximum size of the database and the service objective for this database.

Having selected the required options, click Import and the import operation will start.

After a successful import we can see the status show green with the result Successful.
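The same export and import can also be scripted with SqlPackage.exe, which ships with SSMS/DacFx. A minimal sketch, where the server names, database names, paths and credentials below are all placeholders:

```powershell
# Export the on-premises database to a BACPAC file.
# All server, database, path and credential values are placeholders.
& "C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe" /Action:Export `
    /SourceServerName:"OnPremSqlServer" /SourceDatabaseName:"SalesDb" `
    /TargetFile:"C:\Temp\SalesDb.bacpac"

# Import the BACPAC into Azure SQL Database.
& "C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe" /Action:Import `
    /TargetServerName:"myserver.database.windows.net" /TargetDatabaseName:"SalesDb" `
    /TargetUser:"sqladmin" /TargetPassword:"<password>" `
    /SourceFile:"C:\Temp\SalesDb.bacpac"
```

Scripting it this way is handy when several databases need to be moved in a batch rather than one at a time through the SSMS wizard.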

The migrated database now appears on the Azure SQL Database server. We then need to provide the username and connection strings to the application owner so they can access their data in Azure SQL Database.
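For reference, a typical ADO.NET connection string handed to the application owner looks like the below; the server name, database name, user and password are placeholders:

```
Server=tcp:myserver.database.windows.net,1433;Initial Catalog=SalesDb;User ID=sqladmin;Password=<password>;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;
```

Note that Azure SQL Database requires an encrypted connection, hence Encrypt=True.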

Data Migration Assistant:

We can use the Data Migration Assistant with source and target endpoints to migrate the data to Azure SQL PaaS easily.

Below are the prerequisites for migrating SQL data from on-premises to Azure:

  • Download and install the Data Migration Assistant.
  • Enable TCP/IP on the source instance.
  • Port 1433 must be accessible.
  • The SQL Browser service must be started.
  • Remote connections must be enabled on the SQL instance.
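A quick way to verify the port prerequisite from the machine running the Data Migration Assistant; the host name below is a placeholder for your source server:

```powershell
# Confirm the source SQL instance is reachable on TCP port 1433.
# 'OnPremSqlServer' is a placeholder host name.
Test-NetConnection -ComputerName OnPremSqlServer -Port 1433
```

If TcpTestSucceeded comes back False, check the firewall and the TCP/IP protocol settings on the source before starting the assessment.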

Once the Data Migration Assistant is installed, open it and click New.

Here we have two options: Assessment or Migration. Assessment helps us identify the readiness for this migration and flags any missing connections or prerequisites. Here we click on Assessment.

Now we can select the authentication type and click Next.

Select the desired databases that need to be migrated to Azure.

Now we have the option to click Start Assessment.

 

Check the assessment report once it is completed.

To migrate – rerun the tool, choose the Migration option and specify the source server details.

Once authenticated successfully, we can choose the database that needs to be migrated.

Now we need to specify the target Azure SQL Database details and credentials.

Once authenticated successfully, we can see the schema objects from the source database that we would like to migrate to Azure SQL Database.

The schema must be deployed first: the tool generates and deploys the schema script to the target before any data moves.

Once the schema is deployed successfully, we have the option to migrate the data. Click Migrate data.

Choose the list of tables that need to be migrated.

Once the table migration has been initiated, we can see the progress.

On a successful migration we get the below message.

The result of the online migration is that the data is successfully migrated to the cloud.

Thanks & Regards

Sathish Veerapandian

Microsoft Azure – Copy VHDs between storage accounts for managed and unmanaged disks

One of the most common tasks in Azure is copying blobs between storage accounts. This article outlines the steps involved in copying VHDs for both managed and unmanaged disks.

Copying VHDs from unmanaged disks to a new storage account is pretty simple, and we have two options: copying via AzCopy or using Storage Explorer.

Option 1: Using AzCopy

Step 1: Get the VHD URL –

Navigate to Storage accounts – choose the VM's associated storage account – click Blobs – select the container – choose Properties – copy the URL.

Step 2: Copy the access key of the source storage account.

Step 3: Download and install AzCopy.

Step 4: Follow steps 1 and 2 to get the URL and access key details of the target storage account.

Now open a command prompt and run the below command, replacing the required entries with the values taken from the respective storage accounts.

.\AzCopy.exe /Source:https://StorageAccountSource.blob.core.windows.net/vhds /Dest:https://StorageAccountDestination.blob.core.windows.net/vhds /SourceKey:"keyXXX" /DestKey:"DestinationKeyXX" /SyncCopy /S
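The command above uses the older AzCopy v8 syntax. If you are on AzCopy v10, the equivalent copy would look roughly like the below, authenticating with SAS tokens appended to the container URLs instead of account keys (both SAS tokens here are placeholders):

```powershell
# AzCopy v10: a SAS token is appended to each container URL.
# <source-SAS> and <dest-SAS> are placeholders you generate per storage account.
.\azcopy.exe copy "https://StorageAccountSource.blob.core.windows.net/vhds?<source-SAS>" "https://StorageAccountDestination.blob.core.windows.net/vhds?<dest-SAS>" --recursive
```

The --recursive flag copies every blob in the container, matching the /S behavior of the older tool.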

We will see the copy progress once the command has been initiated.

Once the task is completed we get the below message

We can then see the file successfully copied to the target storage container location.

Option 2: Using storage explorer (preview)

Using Storage Explorer is an even simpler task.

Open Storage Explorer – click on the subscription – expand the blob containers holding the source VHD – select the VHD – click Copy.




Now navigate to the destination blob container and paste the copied VHD file.

Copy from Managed disks:

Copying data from managed disks is much easier using AzCopy or a PowerShell script. There are plenty available on GitHub; the one used below was taken from there.

All we need to provide is the subscription ID, resource group name, disk name, target storage account name, storage container, storage account key and destination VHD file name.

#Provide the subscription Id of the subscription where the managed disk is created
$subscriptionId = "provide subscriptionID"

#Provide the name of your resource group where the managed disk is created
$resourceGroupName = "Provide RG name"

#Provide the managed disk name
$diskName = "provide disk name"

#Provide Shared Access Signature (SAS) expiry duration in seconds e.g. 3600.
#Know more about SAS here: https://docs.microsoft.com/en-us/Az.Storage/storage-dotnet-shared-access-signature-part-1
$sasExpiryDuration = 3600

#Provide the storage account name where you want to copy the underlying VHD of the managed disk.
$storageAccountName = "provide SG account name"

#Name of the storage container where the downloaded VHD will be stored
$storageContainerName = "provide storage container name"

#Provide the key of the storage account where you want to copy the VHD of the managed disk.
$storageAccountKey = 'provide storage account key'

#Provide the name of the destination VHD file to which the VHD of the managed disk will be copied.
$destinationVHDFileName = "provide destination VHD file"

#Set the value to 1 to use the AzCopy tool to download the data. This is the recommended option for a faster copy.
#Download AzCopy v10 from the link here: https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10
#Ensure that AzCopy is downloaded to the same folder as this script
#If you set the value to 0 then Start-AzStorageBlobCopy will be used. Azure Storage will copy the data asynchronously.
$useAzCopy = 1

#Set the context to the subscription Id where the managed disk is created
Set-AzContext -SubscriptionId $subscriptionId

#Generate the SAS for the managed disk
$sas = Grant-AzDiskAccess -ResourceGroupName $resourceGroupName -DiskName $diskName -DurationInSecond $sasExpiryDuration -Access Read

#Create the context of the storage account where the underlying VHD of the managed disk will be copied
$destinationContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey

#Copy the VHD of the managed disk to the storage account
if($useAzCopy -eq 1)
{
    $containerSASURI = New-AzStorageContainerSASToken -Context $destinationContext -ExpiryTime (Get-Date).AddSeconds($sasExpiryDuration) -FullUri -Name $storageContainerName -Permission rw
    .\azcopy copy $sas.AccessSAS $containerSASURI
}else{
    Start-AzStorageBlobCopy -AbsoluteUri $sas.AccessSAS -DestContainer $storageContainerName -DestContext $destinationContext -DestBlob $destinationVHDFileName
}

The disk name of a managed VM can be taken from the Disks section of the VM blade.

Once the script starts running, we can see the SAS URL generated by the Grant-AzDiskAccess cmdlet, and we also have the option to download the VHD directly from this URL.

With all the details in place we run the script, and AzCopy or PowerShell will copy the VHDs successfully to the destination storage account.

Thanks & Regards

Sathish Veerapandian

Overview of DNS services in Microsoft Azure

Like other DNS hosting providers, Microsoft Azure offers both private and public DNS hosting options. We have Azure-provided DNS, bring your own DNS, and Azure Private DNS, which is in preview as of now.

Azure Provided DNS: (Azure-provided name resolution)

With Azure-provided DNS the deployment is a lot simpler, and no complex setup is required on our side. It is a highly available model and can be used in conjunction with our own DNS. There are a few caveats in this model: the DNS suffix can't be changed, since it is auto-created and assigned by Azure. DNS query traffic is throttled per VM, which may need to be taken into consideration for DNS-intensive applications. WINS and NetBIOS are likewise not supported. Finally, manual registration of DNS records isn't supported.

To create an Azure DNS zone – log in to the Azure portal – search for DNS – select DNS zones – click Create DNS zone.

Key in the requested details and create.

Once created, we can see the name servers assigned from Azure. These Azure name servers are responsible for answering DNS queries for the hosted domain from users on the internet.

Now we have the option to add record sets, and once these record sets are created they will be publicly available.

To create a DNS zone from PowerShell we can use the below command:
New-AzDnsZone -Name ezcloudinfo.com -ResourceGroupName Network-NG

To create a DNS record set, we also need to specify the zone, resource group, TTL and record data (the IP address below is a placeholder):

New-AzDnsRecordSet -Name www -RecordType A -ZoneName ezcloudinfo.com -ResourceGroupName Network-NG -Ttl 3600 -DnsRecords (New-AzDnsRecordConfig -IPv4Address "1.2.3.4")

Bring Your Own DNS:

Bring your own DNS is regularly used in hybrid connectivity scenarios: connecting Azure resources to an on-premises DNS system, or connecting Azure to other DNS networks. This is generally required when our Azure VMs need reverse lookup of on-premises internal IPs, or when applications running on VMs in Azure require domain controller authentication.

The most crucial thing when implementing bring your own DNS on Azure is to turn off DNS scavenging, which helps prevent accidental deletion of DNS records. Also, we need to enable DNS recursion and ensure port 53 is accessible from all clients.

One crucial point to consider is that we must never specify our own DNS settings within the VM itself, because Azure is unaware of DNS settings changed inside the guest OS. Instead, there are configuration options within the virtual network settings at the VNET level that will be applied to all resources in the network.

We need to register each VM in the provided DNS service, or configure the DNS server to accept dynamic DNS updates.

We can configure the custom DNS as below from the Azure Portal.

Navigate to the Azure portal – select the virtual network that needs to use our own DNS – we will see the default Azure-provided DNS.

In order to use our DNS, select Custom and key in the DNS server details on the required VNET.
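The same change can be sketched with Az PowerShell; the VNET name, resource group and DNS server IPs below are placeholders:

```powershell
# Point a virtual network at custom DNS servers.
# All names and IP addresses below are placeholders.
$vnet = Get-AzVirtualNetwork -Name "MyVnet" -ResourceGroupName "Network-NG"
$vnet.DhcpOptions.DnsServers = @("10.0.0.4", "10.0.0.5")

# Persist the change; VMs pick it up after a restart or DHCP lease renewal.
$vnet | Set-AzVirtualNetwork
```
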

The same steps are applicable for individual VMs; in those cases we enter the DNS servers on the VM's network interface.

And change the DNS servers to our custom DNS.

Custom DNS can be configured at the VNET or network interface level, not at the subnet level. So for per-VM overrides we need to configure these settings on each VM's network interface.

Azure private DNS (Preview):

Azure DNS also supports private DNS zones, which are in preview as of now. This is a promising feature for providing DNS between private virtual networks.

With these private DNS zones we can use our own custom DNS names without the complexity of managing and maintaining our own DNS servers.

As of now, name resolution is supported for up to ten virtual networks. If we need to resolve VM names from multiple virtual networks, the VMs in the other networks must be registered with the service manually. As the name indicates, these zones are not exposed to the internet and communicate only within the linked virtual networks.

The procedure is similar to public Azure DNS – navigate to the Azure portal – select Private DNS zones.

Once created, we can see the zone listed as a private DNS zone.

We have the option to create record sets, which will resolve between these interlinked VNETs.
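A minimal sketch of creating a private zone and linking it to a VNET with the Az.PrivateDns module; the zone, VNET, link and resource group names are all placeholders:

```powershell
# Create the private DNS zone.
New-AzPrivateDnsZone -Name "internal.ezcloudinfo.com" -ResourceGroupName "Network-NG"

# Link the zone to an existing virtual network, with auto-registration of VM records enabled.
$vnet = Get-AzVirtualNetwork -Name "MyVnet" -ResourceGroupName "Network-NG"
New-AzPrivateDnsVirtualNetworkLink -ZoneName "internal.ezcloudinfo.com" `
    -ResourceGroupName "Network-NG" -Name "MyVnetLink" `
    -VirtualNetworkId $vnet.Id -EnableRegistration
```

With -EnableRegistration set, VMs in the linked VNET register their DNS records in the zone automatically.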

Since Azure Private DNS is in preview without a service level agreement, it is recommended not to roll it out in production environments. It's better to play around and investigate the use cases, which will help when it goes live in production environments.

Additional Info:

IP address 168.63.129.16 is a virtual public IP address used to facilitate a communication channel to the Azure platform. The public IP address 168.63.129.16 is used in all regions and all national clouds. This special public IP address is owned by Microsoft, will not change, and offers the below features.
1) Enables the VM Agent to communicate with the Azure platform.
2) Enables communication with the DNS virtual server to provide filtered name resolution.
3) Enables health probes from Azure load balancer.
4) Enables the VM to obtain a dynamic IP address from the DHCP service in Azure.

5) Azure DNS also supports importing and exporting zone files by using the az network dns zone import and export commands. Importing a zone file creates a new zone in Azure DNS if one does not exist, or merges the record sets with the existing zone if a zone with that name is already present.
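For example, using the Azure CLI (the resource group, zone and file names are placeholders):

```powershell
# Import a BIND-style zone file (creates the zone or merges record sets).
az network dns zone import -g Network-NG -n ezcloudinfo.com -f .\zone.txt

# Export the zone back out to a file.
az network dns zone export -g Network-NG -n ezcloudinfo.com -f .\zone-export.txt
```
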

Thanks & Regards

Sathish Veerapandian

Plan and configure Azure Information Protection

Corporate data leakage and the loss of critical confidential information are often attributed to employee negligence. These days corporate services are available to end users from anywhere, which makes employees more productive. On the flip side, if no security is enforced, a sales officer might, for instance, leave a confidential customer list on a shared computer in a public place. It's very important for employers to classify, label and protect their electronic data based on their business models.

Microsoft Azure Information Protection augments and shields Office 365 and Azure workloads. We have the option to enforce classification or to let users classify on their own. This article focuses on enabling Azure Information Protection for Office 365 workloads.

Classify the data based on the Business:

Applying protection to documents depends entirely on the business model. It varies for every business deliverable and needs to be identified and defined in the first place. This is the first step: classifying the documents. It's better to involve every team in this initial phase and catalogue the sensitive data that is transmitted electronically. The security team plays a key role at this point, since they will already have a data classification based on present business operations.

Identify the target Users:

Based on the document catalogue, we now need to create labels that will identify sensitive documents in transit. Protection can be enforced if the user has an Office 365 E5 license, or classification can be recommended if the user has an Office 365 E3 license.

We can categorize users based on their daily tasks, and this is very important because licensing plays a key role in this decision. For instance, there is not much need to enforce Azure Information Protection policies on a receptionist's account; instead, classification based on keywords can be recommended. In a real scenario, operators of critical documents such as finance, procurement, HR and other key persons can go with E5 licenses, and the rest with E3 licenses.

Decide your tenant key:

By default, Microsoft manages the tenant key, and this is the root key for the entire organization.

This key is used to provide cryptographic security to all objects associated with the domain: users, computers and protected documents. If the organization has no issue with Microsoft holding the tenant key, we can go with this approach. The tenant key is automatically rolled over by Microsoft to ensure security.

If there are regulatory requirements, there is an additional option called BYOK (bring your own key). Here we use Azure Key Vault and have two options: either create the key directly in Azure Key Vault, or create the key on-premises, export it and then import it into the Key Vault.

Deploy the Azure Information Protection Client for Targeted Users:

To use the Azure Information Protection service, end users must have the client installed on their PCs, including the Outlook add-in, and must be logged in with their Azure AD synced account. Ensure the client is installed on the targeted users' PCs, for example through Group Policy. The Azure Information Protection client itself is free and doesn't involve any license cost.

Enable Protection activation:

Ensure protection activation is enabled.

Navigate to Azure Information Protection – Protection Activation – Ensure its activated
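Activation can also be checked and enabled from PowerShell with the AIPService module, assuming the module is installed on the admin workstation:

```powershell
# Connect to the Azure Information Protection service (prompts for tenant admin credentials).
Connect-AipService

# Check the current activation status (returns Enabled or Disabled).
Get-AipService

# Activate protection for the tenant if it is not already enabled.
Enable-AipService
```
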



Create the labels:

After gathering the important document types from the different business units, it's better to create the labels based on keywords. In the below examples we've created three document categories: A, B and C.

To create Label – Login to Azure Portal – Click on Azure Information Protection – Navigate to Labels and create label

Now we have the following protection options:

Not configured – go with this option only if we need to preserve documents with the previously created labels.

Protect – We are enforcing the AIP and going with this newly created label.

Remove protection – select this option to remove protection if a document or email is protected.

We have other options to enforce on the document, such as document visual markings, footer text and footer font name.

When we select Protect, we need to select our key, and we have two options:

Azure Cloud Key – Managed by Microsoft.

HYOK (Hold Your Own Key) – key generated from the on-premises certificate authority.

The permissions need to be selected based on our requirement.

Co-owner – Full Access.

Co-Author- Editorial Access.

Reviewer – editing without the right to change permissions.

Viewer – Only view access.

Custom – We can create permissions on our own.

Set the file content expiration, which will expire the file after the specified period. The file then travels with the permissions enforced from Azure.

User Defined Permissions:

This option lets users specify who should be granted which permissions. It can be given to end users to apply in Outlook, Word, Excel, PowerPoint and File Explorer.

Now we have an option to target users by group and apply this label. However, the best option is to create classification policies and add the labels to them.

Create Classification policies:

There are default classification policies and templates that can be used for protecting documents, but it's always recommended to study the business requirements and create classification policies based on them.

We need to navigate to the Azure information protection policy and target users and add this label.

In the below example we have created a policy for one region's targeted users.

The created labels can be added here.

Additionally, there is an option to select the default label assigned to these users. There are other significant options which need to be chosen based on corporate requirements.

Client behavior:

After the policy is targeted, users will see the document categories from the Azure Information Protection policy applied to them.

Once the client is installed and authenticated on both the sender and recipient side, and a document is shared, we can see the category based on the classification.

When enforcement is not applied but an end user tries to save credit card information in a Word document, a classification suggestion is triggered by AIP.

When end users receive a protected document, they can see their permission level.

This is only the internal user experience. The external user experience is different: external recipients receive a welcome email notifying them that they have received a protected message. When they click the link, they can log in with a one-time passcode delivered in a separate email, or log in with their Google credentials.

To conclude, Azure Information Protection is a remarkable offering from Microsoft that should be implemented after several iterations and careful planning. This is a continuous process where the policies must be revisited and updated regularly in line with local regulatory and business changes. Moreover, stringent policies should not be applied without proper evaluation, since they can disrupt normal business operations. This is just an overview of Azure Information Protection; there are many more features to explore and implement in any environment after vigilant planning.

Thanks & Regards

Sathish Veerapandian

Microsoft Forms error – Sorry, something went wrong

Recently, while accessing Microsoft Forms, users were getting the error "Sorry, something went wrong."

The issue was reported by all users, even though they had the required licenses assigned.

Forms was enabled for the affected user.



Solution:

We have to enable the CollabDB service from the Azure portal, which is required for Microsoft Forms.

Navigate to Azure Portal – Access Azure Active Directory – Select Enterprise Applications – Search for CollabDb Service




Navigate to properties – ensure Enabled for users to sign-in is turned on.

Enabling the sign-in option in the Azure portal fixed the issue.

Delegate resetting Azure MFA to helpdesk through an Azure Automation runbook and Microsoft Flow

When a user with MFA enabled loses their mobile phone, they cannot log in on new devices, or on old devices where the token lifetime has expired.

Currently in this scenario the user has to contact the helpdesk team. Unfortunately, only global admins can force-reset MFA for the user, setting the StrongAuthenticationMethods value to null to clear the old, lost device.

There is a workaround that can be used until a delegated RBAC role becomes available for this action: create an Azure Automation account, create a flow, integrate the two and delegate the flow to helpdesk admins, which reduces the load on the global admins performing this action.

Prerequisites:

  1. Create a new Automation account from the Azure portal. An Azure subscription is required; it provides 500 free minutes every month.
  2. Create a new flow from the global admin account. This action needs to be performed from the global admin account.
  3. Enter the global admin credentials in the created Automation account. It is very important that the account used for execution does not have MFA enabled.
  4. Import the MSOnline module from the gallery.

Create Azure Automation Account –

Proceed to https://portal.azure.com – Create automation account.

Now add the MSOnline module –

Access the Azure Automation account – click Assets > Modules – add the MSOnline module.

We can see the MSOnline module is imported successfully.
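The import can also be sketched in PowerShell; the Automation account and resource group names below are placeholders, and the URI points at the MSOnline package on the PowerShell Gallery:

```powershell
# Import the MSOnline module into the Automation account from the PowerShell Gallery.
# Account and resource group names are placeholders.
New-AzAutomationModule -AutomationAccountName "MfaResetAccount" `
    -ResourceGroupName "Automation-RG" -Name "MSOnline" `
    -ContentLinkUri "https://www.powershellgallery.com/api/v2/package/MSOnline"
```
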

Enter Global Admin Credentials in the Created Automation account –

Click on the Automation account – Credentials – enter the global admin credentials.

These global admin credentials will execute the automation when the workflow is triggered from a delegated helpdesk admin account.

Now add the script which is required to execute this operation.

Param
(
    [Parameter (Mandatory = $false)]
    [String] $UserEmail = ""
)

#'TestDemo' is the name of the credential asset created earlier
$creds = Get-AutomationPSCredential -Name 'TestDemo'
Connect-MsolService -Credential $creds

#This command resets the MFA methods for the user
Set-MsolUser -UserPrincipalName $UserEmail -StrongAuthenticationMethods @()

#This command resets the password with a forced change at next login (optional)
#Set-MsolUserPassword -UserPrincipalName $UserEmail -NewPassword "<temporary password>" -ForceChangePassword $true

After adding the above, publish the runbook.

Now we need to create the flow from the global admin account to execute this action.

Head over to Flow (https://flow.microsoft.com) and provision a new personal flow. Click New flow – Create from blank.

Choose Flow button for mobile – Manually trigger a flow.

Type UserEmail as a text input on the trigger – click New step – Add an action.

Click Choose an action – select Azure Automation – Create job – provide the required credentials and subscription details.

This part is very important: we need to select the input UserEmail as below. This parameter is required for the runbook to execute the operation. After that we can see that the runbook parameter is UserEmail.

Now we will see the flow is connected to Azure automation account

Now navigate to My flows – select the new flow – click Run now.

We can see the flow start successfully and execute the requested operation of resetting the user's MFA methods to null.

For verification, we can view the job runs in the Automation account; they should show as successful.

From the global admin's Flow login, delegate this flow to the helpdesk admins as run-only users.

The actual operation is executed by the global admin account; the helpdesk team only triggers the action through the run-only permissions delegated to them in the Microsoft flow.

Thanks & Regards

Sathish Veerapandian

Configure the access panel in Azure Active Directory

We can enable and provide self-service application access to end users. If an organization uses Office 365 applications and the user is licensed for them, the Office 365 applications will appear on the user's access panel. Microsoft and third-party applications configured with federation-based SSO can also be added to this access panel.

We can create multiple groups, for example HR and Marketing, and publish the required apps, both internal corporate apps and social media apps.

To log on to the access panel, users must be authenticated with an organizational account in Azure AD, either directly or via federated authentication.

For organizations that have deployed Office 365, applications assigned to users through Azure AD will also appear in the Office 365 portal.

The Azure access panel is a web-based portal which provides users with the below features:

1) View and launch cloud apps.
2) Configure self-service password reset.
3) Self-manage groups.
4) See account details.
5) Modify MFA settings.

IT admins can benefit and reduce first-level calls by enabling the below features:
1) Provide an easy portal for users.
2) Launch cloud-based and federated on-premises apps.
3) Links to URLs.
4) Control access to corporate applications.
5) Restrict access by user group, device and location.

The portal can be accessed at https://myapps.microsoft.com. An Azure admin can configure the access panel settings as below:

Login to Azure AD – https://portal.azure.com/

Navigate to URL – Azure AD – Enterprise Applications – All applications.

Select the application which we need to configure – in the below case LinkedIn – click on Self-service.

Below are the options we have at this moment:

Select the option "Allow users to request access to this application". By enabling this option, end users can view and request access to this application.


To which group the users must be added:

Require approval before granting access to this application:

Who is allowed to approve access to this application:

To which role users should be assigned to this application:

We have these options to add an app:

  1. App that you're developing – register an app you're working on to integrate it with Azure AD.
  2. On-premises app (Application Proxy) – configure Azure AD Application Proxy to enable secure remote access.
  3. Non-gallery app – integrate any other application that you don't find in the gallery.
  4. Add from the gallery – there are close to 3,000 apps in the gallery which can be added.

The example below shows the options we have when adding an application:

In the below case we are adding Twitter from the gallery; a custom name can be provided for the application.

Single sign-on mode – we have two options:

  1. Federated SSO – allows users to access apps with their organizational accounts. This applies mostly to on-premises apps published here, applications you are developing, and any application integrated with an on-premises IdP. Only a single login is required.
    After signing in, the user can launch applications from the Office 365 portal or the Azure AD MyApps access panel.
  2. Password-based sign-on – users must remember application-specific passwords and sign in to each application.

Hide application from end user:

This option can be used if we would like to hide an application from end users.

We also have the option to hide the Office 365 apps from the access panel. Doing this allows end users to see Office 365 apps only from the Office 365 portal.

Furthermore, the end-user settings for the access panel can be managed:

For on-premises applications we need to configure federated single sign-on and add them to the access panel.

Navigate to Azure AD – Click Enterprise Applications – Click all Applications – Select the application that needs Single sign on configuration

We have the below options:
SAML – use SAML whenever possible. SAML works when apps are configured to use one of the SAML protocols. For SAML we need to provide the sign-on URL, user attributes, claims and the signing certificate.

Then we need to provide the Azure URLs in the application to link it with Azure AD. Here we are creating a relying party trust between the application and Azure AD for the SAML configuration to work.


Linked – can be used for cloud and on-premises apps. We can use this when the application already has single sign-on implemented using another service, such as Active Directory Federation Services or another IdP solution.

Disabled – use this option if your application is not yet ready or integrated for SSO. Users will need to enter the username and password every time the application is launched.

End-user experience from the browser –

Users can navigate to https://myapps.microsoft.com/

The default Office 365 apps will be shown if they are not hidden.

After clicking Add app, users can explore the apps added by the admin. In our case it shows only LinkedIn, since we added only LinkedIn.

If an approval process is required per the admin configuration, the request goes for approval; after approval the application becomes available to the requesting user.

As per a recent update, Microsoft recommends using the Intune Managed Browser MyApps integration for mobile scenarios.
This integration supports additional features such as home screen bookmark integration and Azure AD Application Proxy integration.

The access panel will definitely help end users access all their Office and corporate applications in one place without confusion, and will reduce the burden of first-level end-user access requests.

Thanks & Regards

Sathish Veerapandian
