Thursday, November 23, 2017

Change UPN Script for Azure AD Operations

In many Azure deployments, synchronizing identity plays a key part in the overall delivery.  It allows users to log in to Azure with their on-premises accounts and passwords, and makes overall management easier.

In some client environments, the internal Active Directory forest uses a domain name that is not routable on the internet.  This poses a problem, particularly with Office 365 integration.  Clients want to log in with their .com address, let's say, but Azure AD Connect ends up syncing a .local address instead.

The fix can be quite simple, and generally involves setting each user's UPN to the correct external suffix.  Doing this on an existing domain can be a little painful.  There are some scripts out there that do this, but I wanted to write my own and include some PowerShell features such as -WhatIf.

Here is the script, enjoy!


[CmdletBinding(SupportsShouldProcess=$true)]
param(
    [Parameter(Mandatory=$true,HelpMessage="The old suffix to look for")]
    [string]$oldSuffix,
    [Parameter(Mandatory=$true,HelpMessage="The new suffix to set")]
    [string]$newSuffix,
    [Parameter(Mandatory=$true,HelpMessage="The ou to filter for")]
    [string]$ou,
    [Parameter(Mandatory=$true,HelpMessage="The server to target")]
    [string]$server
)

Import-Module ActiveDirectory

"Old suffix: $oldSuffix"
"New suffix: $newSuffix"
"OU: $ou"
"Server: $server"

$users = Get-ADUser -Filter * -SearchBase $ou -Server $server

if (-not $users){
    "Found no users with specified filter"
    exit    
}

foreach ($user in $users){
    "===== Processing {0}" -f $user.UserPrincipalName
    
    if (-not ($user.UserPrincipalName -like "*$oldSuffix")){
        "User's UPN suffix does not match the old suffix, skipping..."
        continue
    }

    if ($user.UserPrincipalName -like ("*$newSuffix")){
        "User is already set correctly, skipping..."
        continue
    }
    
     
    if ($PSCmdlet.ShouldProcess($user.UserPrincipalName, "Change UPN suffix to $newSuffix")){
        # Replace only the trailing suffix; a blanket .Replace() would also
        # rewrite an old suffix that happens to appear in the local part of the UPN.
        $newUpn = $user.UserPrincipalName.Substring(0, $user.UserPrincipalName.Length - $oldSuffix.Length) + $newSuffix
        $user | Set-ADUser -Server $server -UserPrincipalName $newUpn
        "Changed suffix"
    } else {
        "Would have replaced suffix"
    }
}
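Here is a quick illustration of the per-user check and the suffix swap the script performs on each account (the UPN and suffixes below are made up):

```powershell
$oldSuffix = "contoso.local"
$newSuffix = "contoso.com"
$upn = "jdoe@contoso.local"

# Same guard the script applies: only touch accounts still carrying the old
# suffix, and skip anything already on the new one
if (($upn -like "*$oldSuffix") -and -not ($upn -like "*$newSuffix")) {
    # Swap only the trailing suffix, leaving the local part untouched
    $upn = $upn.Substring(0, $upn.Length - $oldSuffix.Length) + $newSuffix
}
$upn
```

Invoked for real it would look something like `.\Change-Upn.ps1 -oldSuffix contoso.local -newSuffix contoso.com -ou "OU=Staff,DC=contoso,DC=local" -server dc01 -WhatIf` (script name and values are hypothetical); drop -WhatIf once the reported changes look right.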


Saturday, November 11, 2017

CryptoAnchors, Azure Key Vault, and Managed Service Identity

There has been a lot of fallout from the recent breaches in the news, and the public punishment dealt to the executives (or former executives) over at Equifax and Yahoo.  Through all of this, the security lead at Docker released an interesting post on the concept of crypto anchors.  You can read the original article here.

The idea is a simple one.  Currently, most modern web frameworks store passwords in hashed format, with the salt alongside them in the database.  An attacker who finds a hole in the application is free to download the database and subsequently crack all the passwords offline to their heart's desire.  Salting helps with this, ensuring that the attacker has to crack each password individually.  The power of the cloud, however, has turned even that into something that can be done on demand.

Crypto anchors, in this case, ensure that decrypting the data can only be done from within the environment itself.  That is to say, when you exfiltrate the database, not all of the components you need to decrypt the data come with it.
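As a minimal sketch of the idea (PowerShell here for brevity; every name is illustrative, and a plain variable stands in for the secret that would actually live in a vault or HSM):

```powershell
# The "anchor": a secret that exists only in the environment (e.g. a key
# vault), never in the database.  Hard-coded here purely for illustration.
$anchorSalt = [System.Text.Encoding]::Unicode.GetBytes("secret-that-never-touches-the-db")

# Derive the stored hash from the password plus the anchored secret.
# A stolen copy of the database alone lacks this input, so offline
# cracking of the dumped hashes cannot even begin with the right inputs.
$kdf = [System.Security.Cryptography.Rfc2898DeriveBytes]::new("P@ssw0rd!", $anchorSalt, 12341)
$storedHash = [Convert]::ToBase64String($kdf.GetBytes(32))
$storedHash
```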

From my own history, I feel like this was the way we wrote code way back in the dark ages.  I remember configuring a "salt" in my configuration scripts, which would be used to encrypt the passwords in the database.  Of course, there are problems with this approach (code check-ins, et al.), which I think is the main reason frameworks went down the path of putting all that data in the database.

I'd say there are limitations to this technique in the real world.  Recent attacks against web server memory show that no matter where the "key" is actually stored, as long as the compute happens on the web server, the key can be exfiltrated.  When the author refers to an HSM to store the key, what we really need is a trusted compute environment to execute the function itself, so that the key is never exposed to the web application at all.  Probably a really good use case for Confidential Computing.

In any event, enough preamble.  The goal of this post is to mock up a proof of concept showing this in action.  I decided to use a .NET Core web application along with Azure Key Vault and Managed Service Identity, which is currently in preview.

Rather than going through the trouble of creating and implementing my own UserManager, I opted to simply add code to the functions that act on a password.  The code looks something like the following:


// Requires the Microsoft.Azure.Services.AppAuthentication, Microsoft.Azure.KeyVault,
// and Microsoft.AspNetCore.Cryptography.KeyDerivation packages.
private async Task<string> PreHashPassword(string password)
{
    // Acquire a Key Vault token via the app's Managed Service Identity --
    // no client id/secret in configuration.
    var azureServiceTokenProvider = new AzureServiceTokenProvider();

    var keyVaultClient = new KeyVaultClient(
        new KeyVaultClient.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback));

    // Fetch the salt from Key Vault; it never lives in config or the database.
    var secret = await keyVaultClient.GetSecretAsync("https://xxxxx.vault.azure.net/secrets/salt").ConfigureAwait(false);

    var saltInBytes = System.Text.Encoding.Unicode.GetBytes(secret.Value);

    // Derive a 256-bit hash of the password using the Key Vault salt.
    var hashedPassword = KeyDerivation.Pbkdf2(
        password: password,
        salt: saltInBytes,
        iterationCount: 12341,
        numBytesRequested: 256 / 8,
        prf: KeyDerivationPrf.HMACSHA512);

    return Convert.ToBase64String(hashedPassword);
}


I simply added the above as a function in my AccountController.cs.  It is called to "pre-hash" passwords before they are passed to the UserManager.  Keep in mind that the UserManager still hashes the password into the database and stores it alongside its own salt.  When the database is exfiltrated, it will be missing the Key Vault salt.

I think the key part of this architecture is using Managed Service Identity to access your Key Vault.  In the past, you would have to provide a client id/secret stored in configuration, which of course could be exfiltrated from the web app itself if it were compromised.  Since Key Vault is a publicly accessible service, anyone with the id/secret could access the vault.

Managed Service Identity solves this by having the platform grant an identity to the app container itself, something that cannot be transferred by simply knowing a client id and secret.  Pretty cool.  Setting this up was super simple; follow the instructions here.

The last part was setting up Azure Key Vault, which only takes a few minutes.  Ensure that you grant access to the managed service identity you created for your app.

In conclusion, we talked a little bit about crypto anchors and how they can be an effective pattern for protecting data.  I then showed a quick setup in .NET Core using Azure Managed Service Identity and Azure Key Vault.  Happy encrypting!