The Technology That Shaped Me – Finding A Passion

I haven’t always worked in the tech industry. I’ve travelled a less well-trodden path.

An economics degree, voluntary work with disabled teenagers and then teaching eventually led to a technology opportunity, and I’ve been firmly entrenched in the sector ever since. But I’m grateful for that unique path: although those experiences were a long time ago, I use them to bring a different perspective and skill set to my everyday job.

While it might have taken time for me to find a career in the industry, I have always had a passion for technology. There were signs: early obsessions, connections and influences that hinted at a future to come. So strong were some of these connections that I still own much of the tech. At home this is called my museum (I am Lord President Business and the kids are not allowed to touch it). I don’t have it all, but when I go through it, unbox it and examine it, I still feel a thrill, that joyful feeling of discovery from years ago.

Atari 2600

Christmas 1980 (for the record, I was seven years old). I can still remember my Dad handing my Mum a large boxed present in the afternoon, long after the standard Wade household present-opening period. He had even waited until after the Queen’s speech! My mind flipped when it was unwrapped; this thing plugged into the TV and ran the same games we played at the arcade while on our annual family holiday. The arcades on Cromer Pier were now in my living room, and my brothers and I didn’t need 10p to play. Looking back I realise how lucky we were to be given this Christmas gift, but at seven I knew little different. I got so good at Defender the score went off the charts. All I wanted was to know how it all worked; I had to find out.

ZX Spectrum

Christmas 1982 (I’m now nine years old, for those counting) and I was beginning to think that someone might share my passion, or at least be fuelling it, as a ZX Spectrum arrived for the family. A whole new world of games (Manic Miner, anyone?), but this time with a keyboard, a console I could command; I could make it do things, I could make things happen. I remember spending all day working out how to print the Union Jack on screen. It took ages, but it was incredibly satisfying.

Amstrad PC 1512

Christmases came and went, but who cares about Christmas when your Dad has an office with a computer? (I’m now eleven years old.) He had an Amstrad PC 1512, which was total science fiction as far as I was concerned, and I was in heaven. Interestingly, my passion for this machine was the disks and applications that came with it and moving data about, not the MS-DOS operating system. MS-DOS certainly had a part to play, but that was to come; it was software and hardware that became interesting.

From there computers came and went, time passed, and as a teenager I continued investigating. The Sony Walkman revolutionised my life (I still have mine, in the museum) and I could tell you a lot of stories, but because this list is about the technology that truly shaped my future, there is room for just one more.

Intel i486DX2-66 Processor

I was absolutely determined to get an Intel i486DX2-66 processor (I was twenty years old by now). It was the king at the time. It came in a whole range of machines, and every one of them was in a beige box. I wanted one of those beige boxes, and in the early 90s I got one. It came with Windows 3.11. My best mate gave me a copy of an MS-DOS handbook and I was away. That computer went a long, long way. I performed a lot of hardware upgrades on that machine: I clearly remember installing a CD-ROM drive, upgrading it to Windows 95, adding extra memory, trying different disk configurations and spending countless hours entering commands and code. I kept going with it and it kept me going for years.

Investigating and experimenting with that machine gave me the confidence to share the thrill of discovery at work. On those first small Microsoft networks I built, NetBEUI became a good friend. I probably had too much NetBEUI in my life, but by then I had found my passion and wasn’t looking back.

Five Steps to Getting AI Projects Right

AI projects are not IT projects

You might be relying heavily on code, developers, cloud services and data stored in systems managed by IT, but an AI project is a business project: the focus should be on solving a problem, not investigating the technology.

Momentum is important

Momentum makes these projects successful. You’ll need to be bold and disruptive to make progress. The best leaders give people the freedom to be creative; find one of these leaders to be your sponsor.

Building the team is vital

Good AI projects enable a team, not replace one. For success, you should use your existing experience to find new results across larger data sets, opening up scale that never existed before.

I use this equation to build out the team:

Success = Platform + Expertise + Data + Domain Knowledge

  • Platform – an extensible, secure and open public cloud
  • Expertise – developers, data scientists and a cloud architect
  • Data – from the organisation or a third-party source (note: you don’t need it all to get started)
  • Domain Knowledge – a field expert

Success should come quickly

If you have put the right team together, you should be getting results or proving the validity of the project in just a few weeks. In IT we call these ‘sprints’: one or two sprints and you should be making progress. I’ve seen results much faster than this too.

Investment should follow success

Finally, major investment should not be provided until success has been shown, but it should then come quickly. Losing momentum dampens enthusiasm, lets rot set in, and then projects fail. Failure is not a bad thing; it helps us learn. But failing to move a successful project forward because it has nowhere to go means we’ve got the wrong sponsor or are not solving the right problem. Reset and go again!

OMS Query – Patching Status for Meltdown and Spectre

This is a short article to show you how to use OMS Log Analytics to query the status of patches on Microsoft Windows Server platforms.

Please note: official guidance and advice can be found here: Protect your Windows devices against Spectre and Meltdown. This article is just one example of how to monitor patch status using the super cool OMS Log Analytics tools.

If you have not used OMS or Log Analytics, it is well worth spending some time investigating. You have the option of paid, trial and free tiers, and a whole range of interesting preconfigured packs to play with.

Where Log Analytics gets interesting is when you start to increase the amount of information you are gathering and then use custom queries to dig for information, provide proactive notifications and automated actions, and train and develop models to display insights into your environment. Just imagine a machine learning model applied to data from your syslog server to map out network activity and threats.

For this article I am assuming that you already have OMS enabled and are collecting data, but may never have looked into Log Analytics. You’ve probably clicked the Advanced Analytics button a few times and made some progress, or gone “Whoa dude, strange things are afoot at the Circle K!” (That last bit might just be me. :-))

Let’s get cracking:

Head to the OMS Workspace that hosts the Log Analytics service for the VMs you want to monitor. At this stage it’s worth noting that there are a number of architectural options when considering your OMS Workspace design. This article does not go into the patterns you can adopt, but as long as you have some on-premises VMs being monitored and the data being collected, you’ll be able to continue.

Select Log Search, then open up Advanced Analytics and “Hold On!”

When the Advanced Analytics page has loaded, open a new tab and paste in the query you need for the results you are after. To test it, select Run.


The query runs against the Update data type, so this needs to be your first input. From there you are extracting data and narrowing down what you are looking for. Once narrowed down, you need to decide how you want the data displayed; this is your summary. Finally, we place all this information into a table.

If this is the very first time you have tried a free-form query, try the topmost line first. It’s likely you will get a lot of records, but you will see all the data and then be able to narrow it down to what you are after.

I have copied the query below for you to use. As with everything, if you know a better way of doing things, please share; I’d certainly be interested!

Update
| where KBID == "4056898" or KBID == "4056890"
| where UpdateState == "Installed" or UpdateState == "Needed"
| summarize hint.strategy=partitioned arg_max(TimeGenerated, *) by Computer, SourceComputerId, UpdateID
| summarize dcount(Computer) by Computer, UpdateState
| render table


Disclaimer:  Please note although I work for Microsoft the information provided here does not represent an official Microsoft position and is provided as is.

Auditing Azure VMs Add Results to Azure Tables | Azure PowerShell

Having read Paulo Marques’ article Working with Azure Storage Tables from PowerShell, I decided to edit my auditing scripts and push the results into Azure Tables, giving me a repository I can keep and one that gives me more options. Moving forward, we can update or pull this information out on demand, or use it as the basis of a comparison. I find it quite useful to have an independent record of the starting and end state of an environment pre and post any work undertaken.

There are a number of ways you can audit an Azure environment. With most of my customers I have implemented OMS, often using a combination of paid and free tiers to achieve the reporting they need to meet their own requirements and standards.

I’m a big fan of OMS; this script represents only one way to gather information and a chance to try something new in PowerShell.

To get started you’ll need to follow the instructions in Paulo’s article to install the correct module, and from there I suggest following his guide, as this will give you a good understanding of how the commands operate. Once completed, it is a straightforward process to integrate this into any auditing script you currently have. The example below already has a table created.
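If you still need to create the table first, here is a minimal sketch using the standard Azure storage cmdlets, assuming the Azure.Storage and AzureRM.Storage modules are loaded and reusing the variable names from the script below:

# Create the Azure Table if it does not already exist
$storageContext = (Get-AzureRmStorageAccount -ResourceGroupName $resourceGroup -Name $storageAccount).Context
if (-not (Get-AzureStorageTable -Name $tableName -Context $storageContext -ErrorAction SilentlyContinue))
{
    New-AzureStorageTable -Name $tableName -Context $storageContext
}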


# Variables for the environment set up
# PLEASE NOTE the Azure Table has already been set up
$subscriptionName = "Subscription Name"
$resourceGroup = "Resource Group"
$storageAccount = "Storage Account"
$tableName = "Table"
$PartitionKey = "Partition Key"
$table = Get-AzureStorageTableTable -resourceGroup $resourceGroup -tableName $tableName -storageAccountName $storageAccount

# Call the status of each server and upload the result into the Azure Table
$rowcounter = 1
$RGs = Get-AzureRmResourceGroup

foreach ($RG in $RGs)
{
    $VMs = Get-AzureRmVM -ResourceGroupName $RG.ResourceGroupName
    foreach ($VM in $VMs)
    {
        $VMDetail = Get-AzureRmVM -ResourceGroupName $RG.ResourceGroupName -Name $VM.Name -Status

        # The last DisplayStatus in the list is the power state of the VM
        foreach ($VMStatus in $VMDetail.Statuses)
        {
            $VMStatusDetail = $VMStatus.DisplayStatus
        }

        Add-StorageTableRow -table $table `
            -partitionKey $PartitionKey `
            -rowKey ([guid]::NewGuid().ToString()) `
            -property @{"ResourceGroup"=$RG.ResourceGroupName;"computerName"=$VM.Name;"status"=$VMStatusDetail}
        $rowcounter++
    }
}

This second example updates the values in the Azure Table. To do this we pull each computer’s existing row out of the table and update its status with the information collected as before.

# Variables for the environment set up
# PLEASE NOTE the Azure Table has already been set up
$subscriptionName = "Sub Name"
$resourceGroup = "Resource Group Name"
$storageAccount = "Storage Account Name"
$tableName = "Table Name"
$PartitionKey = "Key"
$table = Get-AzureStorageTableTable -resourceGroup $resourceGroup -tableName $tableName -storageAccountName $storageAccount

# Call the status of each server and update the existing row in the Azure Table
$RGs = Get-AzureRmResourceGroup

foreach ($RG in $RGs)
{
    $VMs = Get-AzureRmVM -ResourceGroupName $RG.ResourceGroupName
    foreach ($VM in $VMs)
    {
        $VMDetail = Get-AzureRmVM -ResourceGroupName $RG.ResourceGroupName -Name $VM.Name -Status

        # The last DisplayStatus in the list is the power state of the VM
        foreach ($VMStatus in $VMDetail.Statuses)
        {
            $VMStatusDetail = $VMStatus.DisplayStatus
        }

        # Creating the filter and getting the original entity
        # (the column name must match exactly, with no trailing space)
        [string]$filter = [Microsoft.WindowsAzure.Storage.Table.TableQuery]::GenerateFilterCondition("computerName",[Microsoft.WindowsAzure.Storage.Table.QueryComparisons]::Equal,$VM.Name)
        $computer = Get-AzureStorageTableRowByCustomFilter -table $table -customFilter $filter

        # Changing the value
        $computer.status = $VMStatusDetail

        # Updating the content
        $computer | Update-AzureStorageTableRow -table $table

        # Getting the entity again to check the changes
        Get-AzureStorageTableRowByCustomFilter -table $table -customFilter $filter
    }
}
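Once a run has completed, you can pull everything back to check the contents of the table. A short usage sketch, assuming Paulo’s AzureRmStorageTable module, which also provides Get-AzureStorageTableRowAll:

# List every row in the table to verify the audit results
Get-AzureStorageTableRowAll -table $table | Format-Table computerName, status, ResourceGroup -AutoSize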

Remember, there is always a better way to do things, and the only way we find out is if we share. I look forward to your versions and updates. Happy scripting!

Disclaimer:  Please note although I work for Microsoft the information provided here does not represent an official Microsoft position and is provided as is.

Azure Disk Encryption and Azure Backup

If you are looking to use Azure Disk Encryption together with Azure Backup, you need to follow a couple of steps in addition to the standard encryption procedure.

The official documentation can be found below:

How it works

There are two types of encryption keys to consider:

  • BEK – BitLocker Encryption Key
  • KEK – Key Encryption Key

The encryption service uses Key Vault to manage the secrets. To do this we need an application in Azure AD that has permissions (set by a Key Vault access policy) to operate inside the Key Vault.

This application is required whether you are just using a BEK or setting up a KEK for Azure Backup support.

For a KEK, a key must be imported into or created in the Key Vault. You reference this key when running the commands.

Finally, the Backup Management Service needs permissions to access the Key Vault and the keys.

Image 1: Example of Secrets inside of Key Vault

Procedure

Please note: You will need a Key Vault before you can complete this procedure. The Key Vault must be in the same region as the VM that will be encrypted.

1. Set up an Azure AD Application

In Azure Active Directory, select App registrations and create a new app registration. Enter a name, select Web app / API and assign a sign-on URL (you will not use this, so a default entry is adequate).


Image 2: App Registration in Azure Active Directory

Make a note of the Application ID, then create and take note of the application key. Please note that the key is only displayed once, immediately after it is saved; after that it is hidden.
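If you prefer to script the registration, here is a minimal sketch using the AzureRM.Resources cmdlets. This assumes a module version where New-AzureRmADApplication accepts a SecureString -Password; the display name and identifier URI are placeholders:

# Register the application and create its service principal (names are placeholders)
$appPassword = Read-Host -AsSecureString -Prompt "Application key"
$app = New-AzureRmADApplication -DisplayName "DiskEncryptionApp" -IdentifierUris "http://diskencryptionapp" -Password $appPassword
New-AzureRmADServicePrincipal -ApplicationId $app.ApplicationId

# This is the Application ID to note down
$app.ApplicationId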

2. Configure the permissions in the Key Vault for the new Azure AD Application

In the Key Vault set up an Access Policy for the new application.

Image 3: Setting up permissions in the Key Vault (an Access Policy)

Key permissions need to be set to Wrap Key, and secret permissions to Set.


Image 4: Setting the Key Vault Access Policy for the Azure AD Application
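The same access policy can be set from PowerShell. A minimal sketch, assuming the AzureRM.KeyVault module and using the Application ID from step 1 ($VaultName and $AADClientID match the variables used in the commands later in this post):

# Grant the AAD application Wrap Key on keys and Set on secrets
Set-AzureRmKeyVaultAccessPolicy -VaultName $VaultName -ServicePrincipalName $AADClientID -PermissionsToKeys wrapKey -PermissionsToSecrets set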

3. Create a Key in Key Vault

This will be the key used to wrap the BEK; it is also known as the KEK.


Image 5: Creating the KEK
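The KEK can also be created from PowerShell. A minimal sketch, again assuming the AzureRM.KeyVault module ($VaultName and $keyName match the variables used later):

# Create a software-protected key in the vault to act as the KEK
Add-AzureKeyVaultKey -VaultName $VaultName -Name $keyName -Destination Software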

4. Set permissions for the Backup Management Service

Select Access policies and, from the template, select Azure Backup. The principal will be Backup Management Service.

Image 6: Creating the Access Policy for the Backup Management Service

5. Check the Advanced access policies to enable access to Azure Disk Encryption for volume encryption.


Image 7: Setting the Advanced Access policies for Disk Encryption
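This setting can also be applied from PowerShell. A minimal sketch:

# Allow the platform to access the vault for volume encryption
Set-AzureRmKeyVaultAccessPolicy -VaultName $VaultName -ResourceGroupName $RGName -EnabledForDiskEncryption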

PowerShell commands for an existing VM


$subscriptionName = "SUBSCRIPTION NAME"
$RGName = "RESOURCE GROUP NAME"
$VMName = "VM NAME"
$AADClientID = "AZURE AD APPLICATION ID"
$AADClientSecret = "AZURE AD APPLICATION SECRET"
$VaultName = "KEY VAULT NAME"
$keyName = "KEY NAME"

$keyEncryptionKeyUri = Get-AzureKeyVaultKey -VaultName $VaultName -KeyName $keyName
$KeyVault = Get-AzureRmKeyVault -VaultName $VaultName -ResourceGroupName $RGName
$DiskEncryptionKeyVaultUrl = $KeyVault.VaultUri
$KeyVaultResourceId = $KeyVault.ResourceId

Set-AzureRmVMDiskEncryptionExtension -ResourceGroupName $RGName -VMName $VMName `
    -AadClientID $AADClientID -AadClientSecret $AADClientSecret `
    -DiskEncryptionKeyVaultUrl $DiskEncryptionKeyVaultUrl -DiskEncryptionKeyVaultId $KeyVaultResourceId `
    -KeyEncryptionKeyUrl $keyEncryptionKeyUri.Id -KeyEncryptionKeyVaultId $KeyVaultResourceId

Disclaimer:  Please note although I work for Microsoft the information provided here does not represent an official Microsoft position and is provided as is.

Adding a Public IP to an Existing Azure ARM VM

If you are not running a jump host in your environment, you may find from time to time that you need to add a public IP address to a NIC so you can connect to a virtual machine.

PowerShell is by far the easiest way to complete this task. The small script below outlines how to do this.

# New-AzureRmPublicIpAddress creates the new public IP - run this first
New-AzureRmPublicIpAddress -Name testip -ResourceGroupName wpbackup -AllocationMethod Static -Location "Southeast Asia"

# Set the variables by getting the properties you need
$nic = Get-AzureRmNetworkInterface -ResourceGroupName "NameOfResourceGroup" -Name "NameOfNIC"
$pip = Get-AzureRmPublicIpAddress -ResourceGroupName wpbackup -Name testip
$nic.IpConfigurations[0].PublicIpAddress = $pip

# Finally, apply the updated configuration to the NIC
Set-AzureRmNetworkInterface -NetworkInterface $nic
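Once the NIC has been updated, you may want to grab the address that was allocated so you can connect. A short usage sketch, reusing the names from the script above:

# Retrieve the allocated public IP address
(Get-AzureRmPublicIpAddress -Name testip -ResourceGroupName wpbackup).IpAddress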

Disclaimer:  Please note although I work for Microsoft the information provided here does not represent an official Microsoft position and is provided as is.

Audit number of VHDs per Storage Account | Azure

Time for some code.  I was recently asked by a customer to help them audit the number of active VHDs in a storage account.

As ever, with a little digging around and some slight adjustment I was able to provide what they were after. The trick is that an attached VHD holds an infinite lease on its underlying blob, which is what the script checks for.

The original came from the very accomplished John Savill and was posted at Windows IT Pro.

$FindStorage = Get-AzureRmStorageAccount
$out = @()

foreach ($Storage in $FindStorage)
{
    $Name = $Storage.StorageAccountName
    $ResourceGroupName = $Storage.ResourceGroupName
    $Location = $Storage.Location

    # Find the .vhd blobs in the 'vhds' container of each storage account
    $AllBlobs = Get-AzureRmStorageAccount -Name $Name -ResourceGroupName $ResourceGroupName |
        Get-AzureStorageContainer | Where-Object {$_.Name -eq 'vhds'} |
        Get-AzureStorageBlob | Where-Object {$_.Name.EndsWith('.vhd')}

    $VHDsinAct = 0

    # An attached VHD holds an infinite lease on its blob
    foreach ($Blob in $AllBlobs)
    {
        if ($Blob.ICloudBlob.Properties.LeaseState -eq 'Leased' -and $Blob.ICloudBlob.Properties.LeaseDuration -eq 'Infinite')
        {
            $VHDsinAct++
        }
    }

    $props = @{
        StorageAccount = $Name
        VHDs = $VHDsinAct
        ResourceGroup = $ResourceGroupName
        Location = $Location
    }
    #Write-Output "Total of $VHDsinAct VHDs in $Name"
    $out += New-Object PSObject -Property $props
}

$out | Format-Table -AutoSize -Wrap StorageAccount, VHDs, ResourceGroup, Location
$out | Out-GridView -PassThru
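If you also want a single figure across the whole subscription, the same $out collection can be summed. A short usage sketch:

# Total active VHDs across all storage accounts
($out | Measure-Object -Property VHDs -Sum).Sum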

Disclaimer:  Please note although I work for Microsoft the information provided here does not represent an official Microsoft position and is provided as is.

Successfully Working from Home

I’ve learnt quite a bit about working from home in the last ten years and thought it was about time I shared one of the secrets to my success.

It will take longer to get used to than you will first admit. 

Working from home has some obvious benefits: no travel time, no interruptions, working all day in your pyjamas. It’s a breeze, right? When I look back I can honestly say it took me over a year to get into the correct rhythm. I had started a new job, I had a new baby (our first), I was sent a laptop, a filing cabinet (don’t know why I got that), a chair, a printer and tech toys, and I was away. I knocked off what I thought was a day’s work by morning tea and was a very happy man. But how do you get by with no interaction with anyone at work? Monday is great, but by Wednesday, outside of the odd phone call and customer conversation, who do you have the work chat with? What happens if you get frustrated at work and the next person you see is your new child or sleep-deprived partner? You suddenly need to slip out of work mode and into home mode, then back again. You think Superman makes a fast change in a phone box? It’s nothing compared to the mental gymnastics of the accomplished home worker.

As you get used to the transition you’ll be telling everyone how great life is, but sometimes you’ll be doing this to convince yourself more than anyone else. Working from home can be very rewarding and productive, though. It took me a while to work this out, as I am not someone who has had much interest in psychology, but you need to train your brain.

What did I do? I decided I had to identify in my mind where and when I was at work. I picked a space and made sure everything was the same each time I started. I created a routine of work, emails, calls and customer visits that I stuck to. I even cleaned and tidied the space every week and set it up for Monday. I mentally told myself that when I left this spot I was no longer at work, I was at home. I moved a chair by the door and said to myself, work goes there when I leave this room. Over about twelve months I began to surprise myself with how quickly I was able to mentally switch roles. I could stride through the house being Dad, walk into my workspace, sit down and get straight back into it. It was at this point that working from home truly became great and productive.

Without knowing it I was taking my brain through a series of mental exercises.  My brain was getting a workout and learning how to flip modes very quickly.

I have switched companies now, and at Microsoft I have the flexibility to work at the office or at home. I can spend weeks in the office environment or at customer sites and then a period at home, and the mental flexibility is still there. All I have to do is remember to get dressed when I go into the office.

This is a skill I’m sure anyone can learn. I’d be interested in what makes working from home a success for you. I always say we all learn by sharing, so if you have found another way, don’t be shy; let the world know.

Audit Azure ARM Networks

Consultants love to audit environments and there is no better use of a script than for this purpose.

This script lists out the virtual networks and subnets in a subscription.

Remember, there is always a better way to do things, and if you have one, don’t forget to share.

$FindNetworks = Find-AzureRmResource | Where-Object {$_.ResourceType -like "Microsoft.Network/virtualNetworks"}
$out = @()

foreach ($Network in $FindNetworks)
{
    $VNetDetail = Get-AzureRmVirtualNetwork -Name $Network.Name -ResourceGroupName $Network.ResourceGroupName

    $props = @{
        VNetName = $Network.Name
        ResourceGroup = $Network.ResourceGroupName
        Location = $Network.Location
        # Join the prefixes and subnet names so they display cleanly in a table
        AddressSpace = $VNetDetail.AddressSpace.AddressPrefixes -join ', '
        Subnets = ($VNetDetail.Subnets | ForEach-Object { $_.Name }) -join ', '
    }
    $out += New-Object PSObject -Property $props
}

$out | Format-Table -AutoSize -Wrap VNetName, AddressSpace, Subnets, ResourceGroup, Location
$out | Out-GridView -PassThru
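If you need to hand the results over, the same collection can be pushed out to a file. A short usage sketch (the output path is a placeholder):

# Export the audit to CSV for sharing
$out | Export-Csv -Path .\vnet-audit.csv -NoTypeInformation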

Disclaimer:  Please note although I work for Microsoft the information provided here does not represent an official Microsoft position and is provided as is.

Hybrid Use Benefit from Image | Azure

Please see post Hybrid Use Benefit HUB | Azure for details on the Microsoft HUB process.

I have been using a slight edit of the process described there, so I thought I would place the code I have been using below.

Please note: HUB images are now available in Azure, so a generalised image is no longer required.

# Log in to Azure and select the right subscription
Login-AzureRmAccount
Get-AzureRmSubscription
Select-AzureRmSubscription

# Upload the HUB file
$RGName = "Resource Group Name"
$urlOfUploadedImageVhd = "https://storageaccountname.blob.core.windows.net/container/imagename.vhd"
Add-AzureRmVhd -ResourceGroupName $RGName -Destination $urlOfUploadedImageVhd -LocalFilePath "C:\Source\imagename.vhd"

# Create a VM using the image
$Cred = Get-Credential # Don't forget the password needs to be complex
$vmName = "Name of VM"
$StorageAccount = Get-AzureRmStorageAccount -ResourceGroupName $RGName -Name "Storage Account Name"
$OSDiskName = "$vmName-C-01"
$nicname = "Nic01-$vmName-Prod"
$OSDiskUri = $StorageAccount.PrimaryEndpoints.Blob.ToString() + "vhds/" + $OSDiskName + ".vhd" # Name and path of the new VHD
$URIofuploadedImage = $StorageAccount.PrimaryEndpoints.Blob.ToString() + "imagecontainer/image.vhd" # Location of the template VHD
$Location = "Azure location"

# Networking
$Vnet = Get-AzureRmVirtualNetwork -Name "Virtual Network Name" -ResourceGroupName $RGName
$SubnetProduction = Get-AzureRmVirtualNetworkSubnetConfig -Name "Sub-1" -VirtualNetwork $Vnet
$Nic = New-AzureRmNetworkInterface -ResourceGroupName $RGName -Name $nicname -Subnet $SubnetProduction -Location $Location

# Define the VM configuration
$VMConfig = New-AzureRmVMConfig -VMName $vmName -VMSize "Standard_DS2" |
    Set-AzureRmVMOperatingSystem -Windows -ComputerName $vmName -Credential $Cred -ProvisionVMAgent -EnableAutoUpdate |
    Set-AzureRmVMOSDisk -Name $OSDiskName -VhdUri $OSDiskUri -CreateOption FromImage -SourceImageUri $URIofuploadedImage -Windows |
    Add-AzureRmVMNetworkInterface -Id $Nic.Id -Primary

# Create the VM with the Hybrid Use Benefit license type
New-AzureRmVM -ResourceGroupName $RGName -Location $Location -LicenseType "Windows_Server" -VM $VMConfig

# Check the license type of the new VM
Get-AzureRmVM -ResourceGroupName $RGName -Name $vmName | Format-Table -AutoSize Name, LicenseType, Location, ProvisioningState

Disclaimer:  Please note although I work for Microsoft the information provided here does not represent an official Microsoft position and is provided as is.