Friday, August 30, 2019

Store Azure AD Audit Logs in an Azure Storage Table using an Azure Runbook


In previous posts, I showed how to retrieve Azure AD logs using PowerShell. We are now going to go further and store the Azure AD audit logs in an Azure Storage Account table. We are going to do all this using PowerShell only, opening the Azure Portal only to check the results of our scripts.


The PowerShell script will:
  • create resources:
    • a Storage Account,
    • a Storage Table in the Storage Account,
    • an Automation Account,
    • a PowerShell RunBook in the Automation Account.
  • add credentials and variables to the Automation Account.
  • import the code of the runbook from a gist.
  • execute the runbook to import the Azure AD Audit logs from Azure Active Directory and store them into the Azure Storage Table.
  • display the result of the runbook job.


To follow this tutorial you must:
  • Have access to an Azure tenant and to an Azure subscription of that tenant.
  • Have a Global Administrator account for that tenant.
  • Have non-empty Azure AD audit logs (you can manually create a user in Azure AD if needed, or use this post to do it using PowerShell).
  • Have a local PowerShell environment with the new Az module installed and working properly.
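To quickly verify the last prerequisite, you can run the following (a minimal check; it assumes the Az module was installed from the PowerShell Gallery):

```powershell
# Check that the Az module is installed and can be loaded
Get-InstalledModule -Name Az
Import-Module Az
# Confirm the main cmdlets are available
Get-Command Connect-AzAccount
```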


This is a tutorial. Never, ever do what we are going to do in a real IT department:
  • We are going to use the AzureADPreview module, which is not supported for production use.
  • We are going to use the Global Administrator credentials in our runbook, which is a strictly bad idea: in a real company, if a malicious person gained access to the runbook and changed its code (which is quite easy), they could cause catastrophic damage to the company's Azure environments.
I will post an article later showing how the same use case is handled in a real company.
Now that you are aware of this, let's start the tutorial.


1. Connecting to Azure, setting the Subscription, AzContext, ResourceGroup

We first need to connect to Azure. Then we need to define the containers for our resources (Automation Account, runbook, Storage Account, Storage Table).
  • The Azure Subscription is THE container in Azure. It gathers all the costs of the resources at a first level, allows users to find the resources, and allows administrators to easily define permissions. It is the main container for storing resources in Azure.
  • The Azure Resource Group is a sub-container within the subscription. It allows users and administrators to gather resources linked by a common project, topic, etc. Most of all, if you remove a resource group, you remove all the resources inside it. This is very useful for creating a bunch of resources, testing them, and deleting them to perform another test, without any risk of touching other resources in the subscription.
  • The Azure Context is like an invisible link to a subscription in our PowerShell session. With certain PowerShell cmdlets, we will not necessarily be asked to specify the target subscription each time: the cmdlet will sometimes be smart enough to guess the subscription to use because it is linked to the context. That's why the context is a very important thing to check, and to define if needed.

# You will be prompted to sign in via a browser
Connect-AzAccount
# Check the subscriptions available and choose one to work with
Get-AzSubscription
# Define the subscription you want to work with
$subscriptionId = "your subscription id"
# Check what your current AzContext is
Get-AzContext
# Use Set-AzContext if you need to change subscriptions
Set-AzContext -Subscription $subscriptionId

Notice that, after my connection, my Azure Context was set to my PaaS subscription. As I want to perform this tutorial on the IaaS one, I have to change the Context to point to the IaaS subscription.

#New Resource Group
$Location = "francecentral"
$resourceGroupName = "azureADAuditLogs"
New-AzResourceGroup -Name $resourceGroupName -location $Location

If I go to the Portal, a Resource Group with this name already exists, but in another subscription.

After the cmdlet success

We can see the new Resource Group in the Azure Portal, within the correct subscription.

We have all our first-level containers now, so we can create the resources within them.

2. Creating the Automation Account

An Automation Account is both a resource and a container for one or several runbooks that can execute code on demand or on a schedule. The Automation Account can provide the runbook(s) with several items:
  • Variables that the runbook(s) can call and use.
  • Credentials (encrypted, of course) that the runbook(s) can call and use to authenticate to Azure, Azure AD, etc.
  • Code modules that the runbook(s) can import and use.
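Inside a runbook, these items are retrieved with the Automation-internal cmdlets. As a quick preview (a sketch only; the names myVariable and myCredential are placeholders, not resources created in this tutorial):

```powershell
# Inside a runbook, Automation Account items are read with internal cmdlets.
# "myVariable" and "myCredential" are placeholder names for this sketch.
$value = Get-AutomationVariable -Name "myVariable"        # reads a variable
$cred  = Get-AutomationPSCredential -Name "myCredential"  # reads an encrypted credential
Import-Module AzureADPreview                              # loads an imported module
```

These internal cmdlets only work inside the Azure Automation sandbox, not in a local session.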
Let's create the Automation Account.
#New Automation Account
$automationAccountName = "adlogs-automationAccount"
New-AzAutomationAccount -ResourceGroupName $resourceGroupName -Name $automationAccountName -Location $Location -Plan Free

We get a success confirmation in the PowerShell window, and we can check in the portal that the Automation Account has been created successfully:

3. Adding modules to the Automation Account

The runbook that we will create later within the Automation Account will have 2 tasks:
  • Connect to Azure Active Directory to import the audit logs. There is a PowerShell module, AzureADPreview, that does this very quickly with a single cmdlet! Unfortunately, a standard Automation Account doesn't come with this module loaded, so we have to import it. This module is in preview, so as written previously, Microsoft does not recommend using it in production yet.
  • Then, export all the logs, line by line, to an Azure Storage Table. Here again, creating rows in an Azure Storage Table requires a specific module: AzureRMStorageTable. We have to import this module into our Automation Account as well.

3.1 Adding module AzureADPreview to the Automation Account

#Importing AzureADPreview module into the Automation Account
$ModuleADPreview = "AzureADPreview"
# RepositorySourceLocation may or may not end with '/', so normalize it
$uriModuleADPreview = (Find-Module $ModuleADPreview).RepositorySourceLocation.ToString().TrimEnd('/') + '/package/' + $ModuleADPreview
New-AzAutomationModule -Name $ModuleADPreview -ContentLinkUri $uriModuleADPreview -ResourceGroupName $resourceGroupName -AutomationAccountName $automationAccountName

Above are the cmdlets to import the AzureADPreview module; below is their result in the PowerShell window:

As soon as the cmdlet receives a response, we can see in the portal that the module is in a state "Importing":

And soon imported and ready to use:

3.2 Adding module AzureRMStorageTable to the Automation Account

Regarding this module, a somewhat tricky explanation is needed. This module is one of the oldest Azure modules, since we have needed to store data in Storage Tables since the beginning of the portal. However, Azure has evolved and the cmdlet names with it: we had AzureRM, and now Az, but for storage the names stayed the same. Furthermore, notice that all our PowerShell cmdlets are based on the new Az module, but the current (2019-08-26) out-of-the-box Azure runbook created within an Automation Account still uses the AzureRM module!
So we now have 2 versions of AzureRMStorageTable:
  • The latest version (2.0) is compatible with the new Az module and is not compatible with the current (2019-08-26) out-of-the-box Azure runbook, which still uses AzureRM. There is, by the way, no module named AzStorageTable.
  • The previous version is compatible with the AzureRM module. As we are going to use AzureRMStorageTable within a runbook, we have to import that previous version, because the out-of-the-box Azure runbook at this date (2019-08-26) is still based on AzureRM.
So here are the cmdlets to do it:
#Importing AzureRmStorageTable module into the Automation Account
$ModuleStorageTable = "AzureRmStorageTable"
# Normalize the repository URI; a specific version can be appended after the
# trailing '/' (the gallery package URI convention is .../package/<name>/<version>)
$uriModuleStorageTable = (Find-Module $ModuleStorageTable).RepositorySourceLocation.ToString().TrimEnd('/') + '/package/' + $ModuleStorageTable + '/'
New-AzAutomationModule -Name $ModuleStorageTable -ContentLinkUri $uriModuleStorageTable -ResourceGroupName $resourceGroupName -AutomationAccountName $automationAccountName

And the result at the execution:
In PowerShell

and in the portal

4. Adding the credentials to the Automation Account

As written before, we are going to add credentials to the Automation Account so they can be used by the runbook; these will be the Azure tenant Global Administrator credentials. As also written before, you cannot do this in a real company because it is very unsafe, but this is only a tutorial in a demo tenant, so for the sake of speed we are going to do it here.

#New Credentials for the Automation Account
$automationCredentialsName = "azureADConnectAccount"
New-AzAutomationCredential -Name  $automationCredentialsName -ResourceGroupName $resourceGroupName -AutomationAccountName $automationAccountName -Value (Get-Credential)

When running the cmdlet, you are prompted for credentials. Enter the Global Administrator ones.

Then you can check the creation in the PowerShell window and in the Azure Portal:

5. Adding variables to the Automation Account

The code executed by the runbook will be loaded later from a Gist. In order to perform their tasks successfully, the runbook and its code need the following information:
  • The credentials name
  • The subscription where the Storage Table is
  • The Resource Group of this subscription
  • The Storage Account where the Storage Table is
  • The Storage Table name
We are going to pass all these values to the runbook through variables within the Automation Account. We first create the variables whose values we already know; we leave the variables for the Storage Account and the Storage Table for later, after having created them.
Fill in the $subscriptionName value with the name of the subscription you work with, then execute the code to create the variables.
#New variables for the Automation Account
$subscriptionName = ""  #the name of the subscription where you want to create automation account and storage account
New-AzAutomationVariable -AutomationAccountName $automationAccountName -Name "subscriptionName" -Encrypted $False -Value $subscriptionName -ResourceGroupName $resourceGroupName
New-AzAutomationVariable -AutomationAccountName $automationAccountName -Name "resourceGroupName" -Encrypted $False -Value $resourceGroupName -ResourceGroupName $resourceGroupName
New-AzAutomationVariable -AutomationAccountName $automationAccountName -Name "automationCredentialsName" -Encrypted $False -Value $automationCredentialsName -ResourceGroupName $resourceGroupName

6. Creating the Storage Account and the Storage Table

6.1 Storage Account creation

A storage account name has to be unique worldwide, because its name defines its URL. To ensure the uniqueness of the name for anybody doing this tutorial, I use this trick:
your tenant domain name (which is unique worldwide) + a suffix + an increment number. Everything has to be in lower case.
So fill in the $tenantDomain variable with your tenant domain (lowercase) and run the PowerShell cmdlets:
#New Storage Account
$tenantDomain = "" #fill in with your tenant domain
$storageAccountNameSuffix = "adlogs2" #increment the number each time you perform a new test (unless you delete the storage account after each test)
$storageaccountname = $tenantDomain + $storageAccountNameSuffix
$StorageAccount = New-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageaccountname -Location $location -SkuName Standard_RAGRS -Kind StorageV2

No response appears in PowerShell when the account is created (the result is stored in $StorageAccount),...

But it appears in the portal.

6.2 Storage Table creation

To create the table, run the following cmdlets:
#New Storage Table
$tableName = "azureADAuditLogs"
$ctx = $storageAccount.Context
New-AzStorageTable -Name $tableName -Context $ctx

The table is created, and its unique URL appears in PowerShell. This is why the storage account name has to be unique worldwide.

This is the table view in portal:

6.3 Adding the Storage Account and the Storage Table variables to the Automation Account

Now that we have the names of the Storage Account and the Storage Table, we can pass them to the runbook:

New-AzAutomationVariable -AutomationAccountName $automationAccountName -Name "storageAccountName" -Encrypted $False -Value $storageaccountname -ResourceGroupName $resourceGroupName
New-AzAutomationVariable -AutomationAccountName $automationAccountName -Name "tableName" -Encrypted $False -Value $tableName -ResourceGroupName $resourceGroupName

7. Runbook creation

The last resource to create is the runbook; we will create it while importing its code from my Gist.
You can see what the runbook code does:
  • Import the AzureADPreview PowerShell module
  • Get the credentials from the Automation Account
  • Connect to Azure AD
  • Get the Azure AD audit logs in 1 cmdlet
  • Then connect to the Azure tenant
  • Retrieve the Azure Storage Table
  • Create a row per AD log entry and store the log in it
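Based on the steps above, the runbook body looks roughly like the following. This is only a sketch under my own assumptions, not the exact Gist content: the cmdlets are AzureRM-era ones, because the out-of-the-box runbook sandbox is still based on AzureRM, and the row properties are simplified.

```powershell
# Sketch of the runbook body (not the exact Gist code; simplified on purpose)
Import-Module AzureADPreview

# Read the items provided by the Automation Account
$credentialsName = Get-AutomationVariable -Name "automationCredentialsName"
$cred            = Get-AutomationPSCredential -Name $credentialsName
$rgName          = Get-AutomationVariable -Name "resourceGroupName"
$storageName     = Get-AutomationVariable -Name "storageAccountName"
$tableName       = Get-AutomationVariable -Name "tableName"

# Connect to Azure AD and get the audit logs in one cmdlet
Connect-AzureAD -Credential $cred
$logs = Get-AzureADAuditDirectoryLogs

# Connect to the Azure tenant and retrieve the Storage Table (AzureRM cmdlets)
Login-AzureRmAccount -Credential $cred
$storageAccount = Get-AzureRmStorageAccount -ResourceGroupName $rgName -Name $storageName
$table = Get-AzureStorageTable -Name $tableName -Context $storageAccount.Context

# Create one table row per audit log entry
foreach ($log in $logs) {
    Add-StorageTableRow -table $table -partitionKey "azureADAuditLogs" -rowKey $log.Id -property @{
        "activity"         = $log.ActivityDisplayName
        "activityDateTime" = $log.ActivityDateTime.ToString("o")
    }
}
```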

Copy and paste the first line and replace the value with a path that exists on your local machine.
Then, copy and paste the other lines into your PowerShell window and hit Enter.

#Importing the runbook code from a public Gist
$runbookCodeFileTempPath = "C:\dev\" #set a path that really exists on your local machine
$runbookName = "exportAzureADAuditLogs"
$runbookCodeFileName = "Test-exportAzureadauditlogs.ps1"
$runBookContentUri = "" #fill in with the raw URL of the Gist
Invoke-WebRequest -Uri $runBookContentUri -OutFile ($runbookCodeFileTempPath + $runbookCodeFileName)

#New runbook for the Automation Account
$params = @{
    'Path'                  = $runbookCodeFileTempPath + $runbookCodeFileName
    'Description'           = 'export Azure AD Audit logs in an Azure Storage Account Table'
    'Name'                  = $runbookName
    'Type'                  = 'PowerShell'
    'ResourceGroupName'     = $resourceGroupName
    'AutomationAccountName' = $automationAccountName
    'Published'             = $true
}

Import-AzAutomationRunbook @params

8. Starting the job, importing the logs, filling the table, reading the output

To finish, we just have to start the job, wait for its completion, and check the output remotely.
Just copy and paste the following code into your PowerShell window and hit Enter.

#Starting the runbook job
$job = Start-AzAutomationRunbook -Name $runbookName -ResourceGroupName $resourceGroupName -AutomationAccountName $automationAccountName

# Waiting for job completion
$timeCount = 0
do {
    Start-Sleep -s 1
    $timeCount++
    Write-Output ("waited " + $timeCount + " second(s)")
    $job2 = Get-AzAutomationJob -JobId $job.JobId -ResourceGroupName $resourceGroupName -AutomationAccountName $automationAccountName
    if ($job2.Status -ne "Completed") {
        Write-Output ("job status is " + $job2.Status + " and not completed")
    }
    else {
        Write-Output ("job status is " + $job2.Status + ". Writing job information and output for checking...")
    }
} while ($job2.Status -ne "Completed")

$job2 = Get-AzAutomationJob -JobId $job.JobId -ResourceGroupName $resourceGroupName -AutomationAccountName $automationAccountName
if ($null -eq $job2.Exception) {
    Write-Output ("job completed with no exceptions")
}
else {
    Write-Output ("job exceptions: " + $job2.Exception)
}

# Full job output
$jobOutPut = Get-AzAutomationJobOutput -AutomationAccountName $automationAccountName -Id $job.JobId -ResourceGroupName $resourceGroupName -Stream "Any" | Get-AzAutomationJobOutputRecord
$jobOutPut = ($jobOutPut | ConvertTo-Json) | ConvertFrom-Json
$index = 0

foreach ($item in $jobOutPut) {
    Write-Output "---------------------------------"
    Write-Output ("output " + $index)
    Write-Output ($item.Value)
    $index++
}

You should obtain this:
  • Job starting
  • Loop waiting for job completion
  • Output with the Azure Active Directory Audit Logs, and trace of the Azure Storage Table lines creation and filling.

You can then go to the Azure Portal to check that a runbook job has completed in the runbook,

retrieve the output you saw in the PowerShell window,

and most of all, check that the Azure Storage Table has been successfully filled!

Doing the same in one shot

You can find all of this in one script in my GitHub repo: store-AzureAD-auditLogs-RunBook
If you want to execute that script in one shot, remove the resource group first:
Remove-AzResourceGroup -Name "azureADAuditLogs"

and run the script after having changed the variable values at the beginning of the script.

Thursday, August 29, 2019

Change Visual Studio Code syntax highlighting

This how-to shows how to change the syntax highlighting in Visual Studio Code. This is the basic VS Code PowerShell highlighting. I would have liked, at least, a different color for the variables.

I wanted custom colors for the PowerShell variables. After a few searches, I found the trick:
Open the settings of VS Code:

In Settings Editor, switch from "ui" to "json"

These are the native parameters you should have after opening the settings of VS Code in JSON mode:

This is the trick for changing the color of variables, keywords (if, else, do, while, etc.) and functions (PowerShell cmdlets). Paste this into the existing text, without forgetting the comma after the existing settings. Put in the colors you like; you will notice that VS Code offers a picker for choosing the color you want. Awesome!

    "editor.tokenColorCustomizations": {
        "variables": "#9ad7fa",
        "keywords": "#fff",
        "functions": "#7096ff"

This is the screenshot of my final config:

As soon as you save your settings.json file you can switch to an already open PowerShell source code file and enjoy the new syntax highlighting.


Checklist for configuring a local development environment for Azure

As often, I am writing this post as a memo, but it can also be useful to anybody.
As an Azure specialist contractor, I often arrive at a new client's place and receive a laptop, from the client company, to work on. One of the first things to do is to configure the new local Azure environment to start working. Here is a checklist of actions to perform in order to work comfortably:
  • Install Chocolatey (because you are sure to get secure software for Windows from it, and the installation process is simplified)
  • Install GitExtensions from Chocolatey (because it is a fine tool to manage all the Git complexity when joining an Azure DevOps team and, most of all, you don't have to re-authenticate for each push)
  • Install VS Code
  • Configure VS Code for Azure PowerShell (I like PowerShell ISE too, but with VS Code you can work on several files, change the syntax coloring, etc.)
  • Install the Azure Resource Manager Tools extension for VS Code
  • Install the Azure App Service extension for VS Code
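The Chocolatey installs above can be scripted; for example, from an elevated PowerShell prompt (package IDs as published on the Chocolatey community feed):

```powershell
# Run from an elevated PowerShell prompt, once Chocolatey itself is installed
choco install gitextensions -y
choco install vscode -y
```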

Monday, August 26, 2019

Get Azure AD audit and sign-in Logs using PowerShell and AzureADPreview module

I randomly ran into a Microsoft documentation page exposing PowerShell cmdlets to quickly get Azure AD logs.
As I already had the AzureAD module installed on my computer, I tried to use them, but they were not recognized.
I understood that they were actually part of another Azure AD PowerShell module: AzureADPreview.
The AzureADPreview module can be useful to quickly get Azure AD audit logs, but you cannot run it if you already have the AzureAD module installed.
I had to uninstall the AzureAD module to get the AzureADPreview cmdlets working, as explained in this forum.
Furthermore, it is, of course, not recommended by Microsoft to use the preview module for production matters.

Anyway, these are the steps to follow in order to make the preview module work:

1. Check that only the AzureADPreview module is installed and available

Use the
Get-Module -ListAvailable
cmdlet to check that only the preview module is available.

2. Connect to Azure AD

Use the cmdlet Connect-AzureAD.
If you have the AzureAD module installed, the AzureAD module will be loaded and will perform the connection, and you won't be able to use the AzureADPreview cmdlets afterwards.
That's why the AzureAD module has to be uninstalled.
You can see on my screenshot that neither the AzureAD module nor the AzureADPreview module had been loaded before the connection.

3. Check that the module AzureADPreview has been loaded

To be sure that the Azure AD connection has been done by the AzureADPreview module, use the cmdlet Get-Module:

You can notice that the AzureADPreview module has been loaded, and thus that it is actually that module that connected the PowerShell session to Azure AD.

4. Get Azure AD audit logs with a PowerShell cmdlet

Use the cmdlet Get-AzureADAuditDirectoryLogs to get the Azure AD logs:
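For example (the -Filter value is an OData filter; the date below is just an illustration):

```powershell
# All available Azure AD audit logs (run after Connect-AzureAD)
$auditLogs = Get-AzureADAuditDirectoryLogs
# Only events after a given date, using an OData filter
Get-AzureADAuditDirectoryLogs -Filter "activityDateTime gt 2019-08-20"
```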

To get the Azure AD sign-in logs, you can use the cmdlet Get-AzureADAuditSignInLogs.
However, you must have a premium Azure AD subscription to be allowed to consult the sign-in logs.

Sunday, August 25, 2019

Request Graph in a PowerShell script to get Azure AD Logs


Azure Active Directory provides IT teams with two kinds of important reports (MS documentation):
  • Audit logs
    These logs track all the activity regarding Azure Active Directory performed by users and administrators.
    Audit logs provide system activity information about user and group management, managed applications, and directory activities.
  • Sign-ins
    These logs keep track of all the sign-ins to Azure. This is information about the usage of managed applications and user sign-in activities.
Here is the retention duration of all these logs in Azure Active Directory (official MS documentation here):

Report            Azure AD Free   Azure AD Basic   Azure AD Premium P1   Azure AD Premium P2
Audit logs        7 days          7 days           30 days               30 days
Sign-ins          N/A             N/A              30 days               30 days
Azure MFA usage   30 days         30 days          30 days               30 days

It is thus useful, and sometimes mandatory (for highly secure activities: army, health, banks, etc.), to keep these logs longer. Microsoft and Azure specialists offer several approaches to do it (MS Documentation 1 and MS Documentation 2).
You can download the logs manually from the portal. You can also use PowerShell scripts to get the data and then store them as files, in databases, or in any storage solution, either in your private cloud or in Azure. In this post, I will detail the last approach. It is not the newest one (using a certificate seems to be the newest way to get the logs programmatically, and linking the logs to a storage account the simplest way to save them), but this post will show you how to request the Graph API using PowerShell and the credentials of an Azure App (Service Principal). Thus, we will:
  • Configure Azure to create an App and get the required App credentials to request the Graph API using PowerShell
  • Use the App client ID and secret to get the audit logs from Azure AD

1. Configuration actions in Azure Portal to create the App

Here is the official Microsoft documentation for reference. The portal UI has changed and the documentation is not up to date yet; that is one reason why I published this post.

1.1. Prerequisites

To register an App that can be used in PowerShell to get Azure AD logs, you have to be:
  • Azure Administrator
  • Global Administrator
Here is a documentation page covering the main security and admin roles in Azure.

1.2. App creation

Sign in to the Azure Portal and click on the Active Directory button in the left menu.

Then click on App Registration and Add.

In the "Register Application" pane, let default configuration. Just type the name of your App and for redirect url, you can use https://localhost.

When the App is registered, you are redirected to the App pane summarizing the main information. Notice that the Client ID has been defined.

1.3. Setting permission Microsoft Graph AuditLog.ReadAll for the app

We are now going to give permissions to the App in order to be able to access the Azure AD Audit and Sign-ins logs data using the App credentials in a PowerShell script.
Click on the API Permission item on the left menu.

Then click on "Add a permission" button, the "Request API Permissions" pane is opening, click on the Microsoft Graph panel.

Click on Application permissions because we want to use the App credentials and permissions within a PowerShell script. Actually, we have registered an Azure App, but we are only interested in the Service Principal of the App (roughly, a service account), because we need credentials and permissions for a PowerShell script. Here is the Microsoft documentation about the relationship between an Azure App and a Service Principal, if you want to know more on this topic.

Then, check AuditLog.ReadAll and click on "Add permissions" button.

You are led back to the "API Permissions" pane and notice that the added permission needs administrator approval; so, as you are an administrator, click on the "Grant admin consent" button.

A confirmation dialog opens, because granting permissions to an App can have dangerous security consequences.

When the admin consent is done, you can see that the granted permission is effective.

1.4. Setting permission Azure Active Directory Graph Directory.ReadAll for the app

Do exactly the same as above, except...
that you choose the Azure Active Directory Graph pane...

and check Directory.ReadAll

You should obtain this screen at the end.

1.5. Setting a secret for the App (Service Principal)

Now that we have the App, with its Service Principal client ID and the required permissions for requesting the Azure Active Directory logs, all that is missing is a secret (password) to authenticate to Azure in a PowerShell script.
Click on the "Certificates & secrets" item in the left menu, then on the "New client secret" button.

Give the secret a name and a duration.

When it's done, you get a new secret. Carefully copy and paste it into a file, because the secret will be hidden in the portal later for security reasons.

At this point, you have all you need to make the PowerShell script work: the App (Service Principal) with an ID, a secret, and the required permissions.

2. Requesting Azure AD Logs in PowerShell with the created App

Thanks to Bachoang, here is a GitHub script ready to use:
Save the script in a .ps1 file and open it with your favorite PowerShell editor.
Replace the client ID and client secret with those of the created App, and change the tenant domain name.
I also updated the Graph API URL on line 20:
$url = "`$filter=eventTime gt $2daysago"

You can also update the file path on line 26 if you want to save the Azure AD logs to a file in JSON format.
Last, the new secret format is no longer 44 characters, as written on line 3, but 32.

Then, you can run the script and see the results in your PowerShell editor output.
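For reference, here is a minimal sketch of what such a script does under the hood: a client-credentials token request, then a call to the Graph audit logs endpoint. This is my own simplified version (the v2.0 token endpoint and the output file path are assumptions; the original script may differ):

```powershell
# Minimal sketch of the client credentials flow (placeholders, not real values)
$tenantDomain = "yourtenant.onmicrosoft.com"
$clientId     = "<client ID of the created App>"
$clientSecret = "<secret of the created App>"

# 1. Request an OAuth2 access token for Microsoft Graph
$body = @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $clientSecret
    scope         = "https://graph.microsoft.com/.default"
}
$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantDomain/oauth2/v2.0/token" -Body $body

# 2. Call the audit logs endpoint with the bearer token
$headers = @{ Authorization = "Bearer " + $tokenResponse.access_token }
$logs = Invoke-RestMethod -Method Get `
    -Uri "https://graph.microsoft.com/v1.0/auditLogs/directoryAudits" -Headers $headers

# 3. Save the result as JSON (the path is an example)
$logs.value | ConvertTo-Json -Depth 10 | Out-File "C:\temp\azureADAuditLogs.json"
```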

3. Using Graph Explorer

I will finish this post by showing a way to track down issues when requesting Graph, if needed: you can use Graph Explorer.

Sign in with your tenant admin account, paste the Graph Azure AD request, and run the query. This way you can check that everything is all right regarding the Graph URL, data availability, and correctness.