Azure Files – Part 2 – Creating an SMB Share

smb-icon

Hello everyone! As promised in the first post about Azure Files, today I will demonstrate how to create an Azure Files SMB share. First, though, it is worth explaining that when we implement SMB shares with Azure there are two basic scenarios. The first is server-to-server and/or application access, in which case you can use the storage account name and access key. The second is using your Active Directory domain identities with Azure Files, and for that you will need to extend your domain to Azure (this can be done in two ways), that is, add the domain service in the Microsoft cloud. Only in this second scenario can you integrate your storage account with identities, so that each of your users can access the share with their own domain account and file permissions.

That said, let’s get down to the minimum requirements for using Azure Files on Windows machines (macOS and Linux are also supported, but they’re not in the scope of this post).

image

Let’s get started!

Initially, to create an Azure Files share you need to create a new Storage Account; if you try to search for Azure Files when creating a new resource, you will notice that nothing is found.

image

Of course you can use an existing Storage Account, but for this post I will create a new one.

The important steps here are creating a resource group and the storage account itself; everything else you can customize according to your needs or leave at the defaults (if you don’t know how to create a Storage Account, go to this post).

image

Hit ‘Review + Create’ and within 2 or 3 minutes you will have everything you need for your Azure Files. Then click ‘Go to resource’. Once you have your new storage account open, hit the ‘File shares’ blade in the vertical menu on the left side.

image

Just as an observation, at the top of the screen above it says that Active Directory is not configured, i.e. in this scenario I could not use identity-based access without first enabling the domain service in Azure.

Continuing with our configuration, hit ‘+ File share’, type a name, set the amount of GiB needed, then select the access tier you need. For this post I selected the cheapest one for demonstration purposes, but you should select it according to your needs (you can access here the Microsoft link explaining each tier and its pricing).
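If you prefer to script this step, the share can also be created with Azure PowerShell. This is only a minimal sketch, assuming the Az.Storage module is installed; the resource group, storage account and share names below are placeholders:

# Minimal sketch: create a 100 GiB file share in the 'Cool' access tier.
# Resource group, storage account and share names are placeholders.
New-AzRmStorageShare -ResourceGroupName "rg-files-demo" `
    -StorageAccountName "mystorageaccount" `
    -Name "myshare" `
    -QuotaGiB 100 `
    -AccessTier Cool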

image

Now that the share has been created, navigate to it and you will see that there aren’t many options here; the main one is ‘Connect’.

Hit the ‘Connect’ option and you will see that Azure provides a script for Windows, Linux and macOS.

Basically you will need to choose the operating system on which the drive will be mapped, the drive letter (for Windows only) and which authentication method will be used.

image

To finish, the only thing left to do is to run this PowerShell script on the machine where you want the drive mapped, the only requirement being that port 445 is open for communication with Azure. The script provided by Azure already contains the account and key used to access the resource, and there is no need to elevate your PowerShell session to run it.
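For reference, the generated script looks roughly like the sketch below. This is a simplified version; the storage account name, share name, drive letter and key are placeholders, and the exact script the portal provides may differ:

# Simplified sketch of the connect script; account, share, drive letter and key are placeholders.
$connectTest = Test-NetConnection -ComputerName "mystorageaccount.file.core.windows.net" -Port 445
if ($connectTest.TcpTestSucceeded) {
    # Store the credentials so the mapping survives reboots
    cmdkey /add:"mystorageaccount.file.core.windows.net" /user:"localhost\mystorageaccount" /pass:"<StorageAccountAccessKey>"
    # Map the share to the Z: drive
    New-PSDrive -Name Z -PSProvider FileSystem -Root "\\mystorageaccount.file.core.windows.net\myshare" -Persist
} else {
    Write-Error "Unable to reach the storage account on port 445. Check that the port is open."
}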

image

The expected result after executing the script is the drive mapped as in the example below.

image

It is also possible to add the mapping manually; you just need to follow the steps below.

1 – On the machine where you want to map the drive, open Windows Explorer and hit ‘This PC’, then right-click on the white space and select ‘Add a network location’. After that, hit ‘Next’ twice and you will end up at a window where you need to specify the address of the location you want to add.

2 – Go back to Azure Files in the Azure portal and hit ‘Properties’.

3 – Copy the URL without the ‘https://’ prefix and paste it into the Windows Explorer window, but don’t forget to prefix it with ‘\\’ and change the remaining forward slashes to backslashes ‘\’; for example, a share at https://mystorageaccount.file.core.windows.net/myshare would be entered as \\mystorageaccount.file.core.windows.net\myshare. The result should be this:

image

4 – Hit ‘Next’, give your network location a name, and hit ‘Next’ again.

5 – Finally, it should ask for the user name and password to access the network location; to get these, go back to the Azure portal and grab them from your Azure Files storage account.

image

The credential format is:

User: localhost\StorageAccountName

Password: StorageAccountAccessKey
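As an alternative sketch, the same credentials can be used from a command prompt with net use; the drive letter, storage account and share names below are placeholders:

net use Z: \\mystorageaccount.file.core.windows.net\myshare <StorageAccountAccessKey> /user:localhost\mystorageaccount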

Now you will be able to put in your files, and they will be automatically synced to the cloud or to your on-premises environment (depending on where you create the file).

image

I would also like to demonstrate Azure Files snapshots and how backup works, but this post is already long enough, so I will reserve these subjects for the next ones. If you have any questions, leave them in the comments. See you soon!

Joao Costa

Az-Predictor – How to install and use it

In day-to-day IT work we often need to manipulate a massive number of resources, and this task can become difficult, especially if you are not very familiar with PowerShell cmdlets; after all, Azure PowerShell has approximately 4,000 cmdlets. With that in mind, the Microsoft team created this incredible tool that is undoubtedly here to stay: Az-Predictor.

What is Az-Predictor?

Az-Predictor is an intelligent command completion module for Azure PowerShell. It helps Azure developers and administrators find the cmdlet they are looking for efficiently, identify the required parameters quickly, and experience fewer errors.

Of course, it is possible to use the auto-complete feature already built into PowerShell (by pressing TAB) so that the command is completed automatically, but Az-Predictor goes a few steps beyond that: the tool will not only complete the cmdlet but also suggest all the parameters that need to be configured. In other words, Az-Predictor basically does all the work, based on predictions built from the documentation pages already available and from how users actually use PowerShell.

The idea behind this amazing tool is to help you get the syntax of commands and parameters right, based on AI predictions that suggest the commands to be used, including cmdlets from your history, i.e. cmdlets you have already used previously.

Getting started with preview 3 (keep in mind that at the time of this post we are talking about a preview version, so by the time you read this there may already be a new update or a different way to install the product):

  • If you have installed the first preview:
    • Close all your PowerShell sessions
    • Remove the Az.Tools.Predictor module (see the sketch below)
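If you need to remove the module from the first preview, something like this should do it, assuming it was installed from the PowerShell Gallery:

Uninstall-Module -Name Az.Tools.Predictor -AllVersions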

Install PowerShell 7.2-preview 3 – Go to: https://github.com/PowerShell/PowerShell/releases/tag/v7.2.0-preview.3

Select the binary that corresponds to your platform in the assets list.

Launch PowerShell 7.2-preview 3 and install PSReadLine 2.2 beta 2 with the following:

Install-Module -Name PSReadLine -RequiredVersion 2.2.0-beta2 -AllowPrerelease

More details about PSReadline: https://www.powershellgallery.com/packages/PSReadLine/2.2.0-beta2

Install Az.Tools.Predictor preview 3

Install-Module -Name Az.Tools.Predictor -RequiredVersion 0.3.0

More details about Az.Tools.Predictor: https://www.powershellgallery.com/packages/Az.Tools.Predictor/0.3.0

Enable Az Predictor

Enable-AzPredictor -AllSession (This command will enable Az-Predictor in all further sessions of the current user.)

Enable the list view mode (Optional) – Believe me, you want to enable this!

Set-PSReadLineOption -PredictionViewStyle ListView

If you want to load Az-Predictor every time you start PowerShell (and trust me, you want that), you can add the last three commands to your PowerShell profile.
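As a minimal sketch, one way to do that is to append the commands to your profile; adjust the list below to the exact commands you want to run at startup:

# Create the profile file if it does not exist yet, then append the setup
# commands so they run in every new session.
if (-not (Test-Path $PROFILE)) { New-Item -Path $PROFILE -ItemType File -Force | Out-Null }
Add-Content -Path $PROFILE -Value @'
Import-Module Az.Tools.Predictor
Set-PSReadLineOption -PredictionViewStyle ListView
'@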

image

OK, now that we know how to make this amazing tool work, let’s have some fun.

Start by typing known commands and see how Az-Predictor starts trying to predict what you want to achieve based on your history and documentation already available. It’s pretty cool!

See the example below: I just typed “set”, and look how many possibilities Az-Predictor suggested. Cool, isn’t it?

image

You can use the up and down arrow keys to choose the cmdlet you want, as if you had a drop-down menu. If you press the right arrow, the cmdlet will be completed by Az-Predictor based on the one you selected. Finally, if you want or need to change the parameters of the suggested cmdlet, press Alt+A and you will be able to select each changeable parameter of the command.

See in the command below what happens if I press Alt+A

image

You can explore at will and try many other possibilities. I hope you enjoyed it, see you soon!

Joao Costa

Creating a storage account on Azure

SA_logo

Today I’m going to show you how to create a storage account in the Microsoft Azure portal. So, straight to the point, let’s get started: first log on to your Azure portal, then go to the search bar and type “Storage Accounts”, select Storage Accounts and finally click “Create”.

image

Now let’s fill in the necessary information for the storage account, remembering that each organization will configure its storage according to its own needs. I will detail each required configuration:

Basics tab

  • Subscription – Select the subscription for the new storage account.
  • Resource group – Create a new resource group for this storage account, or select an existing one. For more information, see Resource groups.
  • Storage account name – Choose a unique name for your storage account. Storage account names must be between 3 and 24 characters in length and may contain numbers and lowercase letters only.
  • Region – Select the appropriate region for your storage account. Not all regions are supported for all types of storage accounts or redundancy configurations.
  • Performance – Select Standard performance for general-purpose v2 storage accounts (default). This type of account is recommended by Microsoft for most scenarios. Select Premium for scenarios requiring low latency. After selecting Premium, select the type of premium storage account to create: block blobs, file shares, or page blobs.
  • Redundancy – Select your desired redundancy configuration. Not all redundancy options are available for all types of storage accounts in all regions. If you select a geo-redundant configuration (GRS or GZRS), your data is replicated to a data center in a different region. For read access to data in the secondary region, select Make read access to data available in the event of regional unavailability.
Advanced tab

Networking tab

  • Connectivity method – By default, incoming network traffic is routed to the public endpoint for your storage account. You can specify that traffic must be routed to the public endpoint through an Azure virtual network. You can also configure private endpoints for your storage account. For more information, see Use private endpoints for Azure Storage.
  • Routing preference – The network routing preference specifies how network traffic is routed to the public endpoint of your storage account from clients over the internet. By default, a new storage account uses Microsoft network routing. You can also choose to route network traffic through the POP closest to the storage account, which may lower networking costs. For more information, see Network routing preference for Azure Storage.

    Then click Create.
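If you prefer the command line, roughly the same result can be achieved with Azure PowerShell. A minimal sketch, where the resource group name, account name, region and SKU are placeholders to be replaced with the choices described above:

# Minimal sketch (Az module): create a resource group and a general-purpose v2
# storage account. Names, location and SKU are placeholders.
New-AzResourceGroup -Name "rg-storage-demo" -Location "eastus"

New-AzStorageAccount -ResourceGroupName "rg-storage-demo" `
    -Name "mystoragedemo001" `
    -Location "eastus" `
    -SkuName "Standard_LRS" `
    -Kind "StorageV2"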

    image

After creation, you can check it under your Storage accounts; by clicking through the settings you can see all the parameters used for the storage account.

    image

    Thanks guys and see you on the next post!

    How to authenticate AzCopy on Azure

AzCopy should now be downloaded to your computer (if you don’t know how to do this, go back to the previous post here). But before you can perform any tasks, you first need to authenticate to your Azure subscription to access Azure Storage.

There are two ways to authenticate AzCopy to your Azure storage accounts – Azure Active Directory or a Shared Access Signature (SAS) token. In this article, we’ll focus on using Azure AD.

    The most common method to authenticate AzCopy is via Azure AD. When using Azure AD, you have several options. Some of these options are:

    • Interactive Login – User is prompted to log in using the browser.
    • Service Principal + password – For non-interactive login. Recommended for automation and scripting.
    • Service Principal + certificate – For non-interactive login. Recommended for automation and scripting.

In this article, you will learn how to authenticate via interactive login. To do so, first open a command prompt or PowerShell and run the command below. The --tenant-id parameter is optional but recommended, especially if your login account is associated with more than one Azure tenant.

    image
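For reference, the command boils down to the azcopy login subcommand; a minimal example, where the tenant ID is a placeholder:

azcopy login --tenant-id "<your-tenant-id>"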

    Once executed, you will be asked to open a browser and navigate to https://microsoft.com/devicelogin and enter the displayed code. You can see what that will look like below.

Enter the code from AzCopy into the browser

    Once you’ve entered the code into the browser, click Next and proceed to sign in to your account.

image

    When sign-in is done, you should see the status shown in the browser and in the terminal similar to what’s shown in the screenshot below.

image

Now that you have all this knowledge, you should be ready to put AzCopy into action! See you soon, folks!

How to Download and Install the AzCopy Tool

    Azure-Command-line-Tool-for-Data-Transfer

This article was motivated by a question from one of our readers, who asked us to explain more about AzCopy; he needed to copy files to Azure Storage and was having issues (I had already helped him solve the issue using AzCopy).

    AzCopy is a command-line utility that you can use to copy blobs or files to or from a storage account. It’s a great command-line utility that can automate and streamline the process but requires some setup.

In this article, you’re going to learn how to prepare your system to use AzCopy, which includes downloading and installing it. I will divide this topic into two posts, starting with just the download and installation of AzCopy. In the next article, I’ll focus on how to authenticate AzCopy against Azure Storage and how to copy files.

    The latest and supported version of AzCopy as of this writing is AzCopy v10. AzCopy is available for Windows, Linux, and macOS. In this article, only the Windows AzCopy utility is covered.

    Downloading AzCopy: The Manual Way

There are a couple of different ways to download AzCopy. Let’s first do it the manual way. You might use this method if you don’t intend to install AzCopy on many computers at once.

Navigate to this download link and it should initiate a download of the zip file. Once downloaded, extract the zip file to C:\AzCopy or a folder of your choice.

Lastly, add the installation directory to the system path (refer to the article here if you need to know how to do that). Adding the folder path to the Windows PATH allows you to call the azcopy executable from any working directory at the command line.
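If you prefer to do that from PowerShell rather than through the System Properties dialog, something along these lines works; run it from an elevated session and adjust the folder if you extracted AzCopy elsewhere:

# Append the AzCopy folder to the machine-wide PATH (requires an elevated session).
$machinePath = [Environment]::GetEnvironmentVariable('Path', 'Machine')
[Environment]::SetEnvironmentVariable('Path', "$machinePath;C:\AzCopy", 'Machine')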

    Downloading AzCopy via PowerShell Script

If you intend to install AzCopy on many machines, or simply need to provide instructions for someone else to install it, you can also use PowerShell. Using PowerShell simplifies the whole process down to a single script.

Create a new PowerShell script and copy/paste the contents below into it. You can get an idea of what each section of the script does by inspecting the in-line comments.

By default, the script below will place AzCopy in the C:\AzCopy folder. If you’d like to change that, use the InstallPath parameter when running the script, or simply change the default path in the script itself.

Function Install-AzCopy {
    [CmdletBinding()]
    param(
        [Parameter()]
        [string]$InstallPath = 'C:\AzCopy'
    )

    # Cleanup destination
    if (Test-Path $InstallPath) {
        Get-ChildItem $InstallPath | Remove-Item -Confirm:$false -Force
    }

    # Zip destination
    $zip = "$InstallPath\AzCopy.zip"

    # Create the installation folder (e.g. C:\AzCopy)
    $null = New-Item -Type Directory -Path $InstallPath -Force

    # Download the AzCopy zip for Windows
    Start-BitsTransfer -Source "https://aka.ms/downloadazcopy-v10-windows" -Destination $zip

    # Expand the zip file
    Expand-Archive $zip $InstallPath -Force

    # Move the extracted files up to $InstallPath
    Get-ChildItem "$($InstallPath)\*\*" | Move-Item -Destination "$($InstallPath)\" -Force

    # Cleanup - delete the zip and the now-empty subfolder
    Remove-Item $zip -Force -Confirm:$false
    Get-ChildItem "$($InstallPath)\*" -Directory | ForEach-Object { Remove-Item $_.FullName -Recurse -Force -Confirm:$false }

    # Add $InstallPath to the system Path if it is not already there
    $path = $env:PATH -split ';'
    if ($path -notcontains $InstallPath) {
        $path += $InstallPath
        $env:PATH = ($path -join ';') -replace ';;', ';'
        [Environment]::SetEnvironmentVariable('Path', $env:PATH, [System.EnvironmentVariableTarget]::Machine)
    }
}
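To use it, load the function into your session (paste it in or dot-source the .ps1 file) and then call it, for example:

Install-AzCopy -InstallPath 'C:\AzCopy'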

Once the script has run, you can confirm that AzCopy was downloaded successfully. While still in the PowerShell console, list the files in the install path by running Get-ChildItem -Path $InstallPath, replacing the path with whatever folder you used.

    If everything went well, you should see the azcopy.exe utility and a license text file.

    You can also confirm that the installation path is added to the system path variable by running $env:Path -split ";" and noticing that the install folder shows up at the bottom of the list.

    In the example below, C:\AzCopy is listed which means that the location was added successfully.

    image

That’s everything for today, guys! In the next post I will talk about how to authenticate to Azure Storage and how to effectively copy files using AzCopy.