Symbolic Link to an Azure File Share

We recently used a symbolic link to an Azure file share to transparently replace a local folder in which data sets are cached for download. That means the existing service now copies the data sets to an Azure file share without any code changes. With a small adaptation of the code, we can also provide download links to the data in the Azure file share, so this process is transparent for the clients downloading the data sets as well.

You can already guess the reason for this exercise. We did this to fix a bandwidth issue on-premises by creating an easy workaround with minimal code changes. As more and more clients download more and more data sets, this service consumes too much bandwidth. This means we have to throttle the service and/or apply QoS to it. While this helps the other services using that internet connection, it does nothing to improve download speeds for the clients. This is just an example and is not meant as architectural or design advice; it is an interim fix for an existing problem. This trick is also used with AKS, for example.

How to add a symbolic link to an Azure file share

Create an Azure file share

Create a storage account and create a file share.

Our Azure file share.
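If you prefer scripting over clicking through the portal, the sketch below shows one way to create the storage account and file share with the Az PowerShell module. The resource group name, location, and quota are assumptions for this example.

#Minimal sketch, assuming the Az module is installed and Connect-AzAccount has been run
#Resource group name, location, and quota are example values
New-AzResourceGroup -Name 'rg-datasets' -Location 'westeurope'
New-AzStorageAccount -ResourceGroupName 'rg-datasets' -Name 'datasets' -Location 'westeurope' -SkuName 'Standard_LRS' -Kind 'StorageV2'
New-AzRmStorageShare -ResourceGroupName 'rg-datasets' -StorageAccountName 'datasets' -Name 'fscache' -QuotaGiB 1024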

Handling credentials

Dealing with the credentials needed for this is easy. All we need to do is add the information to the Credential Manager as a Windows credential: the user, the password, and the file share UNC path. Note that here the password is our storage account key.

Go to the connect settings of your file share.

Grab the info you need from the “connect” settings for your Azure file share. We will not map the file share to a drive, so there is no need to run this PowerShell script.

So in this example that is:
Internet or network address: \\datasets.file.core.windows.net\fscache
User name: localhost\datasets
Password: real2Nonsense4Showing8AfakeStorage28Accountkey/goobledeGookStuffa/AndSomeMoreNonsentMD==

We will add these credentials to the Credential Manager as Windows Credentials.

Click on “Add a Windows credential”.
Add the file share UNC path, the user name, and the storage account key.

That is it. If you entered everything correctly, this will work.
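If you prefer to script this step, cmdkey can add the same Windows credential from the command line. A quick sketch using the example values from above:

cmdkey /add:datasets.file.core.windows.net /user:localhost\datasets /pass:<your storage account key>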

Creating the symbolic link

Once you have added the credentials, creating the symbolic link is very easy.

mklink /d "E:\Download\Cache" "\\datasets.file.core.windows.net\fscache"

You do need to take care that you create the symbolic link in the right place in your folder structure. Also note that mklink needs an elevated command prompt unless your user has the right to create symbolic links. But other than that, that is all you need to do.

The symbolic link to the Azure file share.

The symbolic link is available and can be used transparently by the service/application.
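A quick way to verify the link from PowerShell is to look at the LinkType and Target properties of the folder:

#Verify the symbolic link points at the Azure file share
Get-Item 'E:\Download\Cache' | Select-Object FullName, LinkType, Target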

To test the file share in Azure, you can upload or download data via AzCopy or Azure Storage Explorer. The download functionality in our case is handled in the code, but here is a quick example of how to do a download via AzCopy using a shared access signature (SAS).

azcopy copy "https://datasets.file.core.windows.net/fscache/DataSetSatNavSouthernUtah.zip?sv=2020-02-10&ss=bfqt&srt=sco&sp=rwdlacuptfx&se=2021-06-25T06:06:02Z&st=2021-06-24T22:06:02Z&spr=https&sig=%2FA%9SOrrY4KFAKEikPKeysOycLb4neBLogpPostpAQ624%3D" "E:\MyDataSetDownloads" --recursive

Pro tip: if you need to remove the symbolic link but keep the data, use rmdir “E:\Download\Cache” and not del “E:\Download\Cache”, or you will delete the data. That might not be what you want.

Next steps

Mind you, this was the easy and quick fix for a problem this service was facing; it is not a design or architecture. We are considering replacing the symbolic link solution with Azure File Sync. With a bandwidth cap and QoS on-premises, we would offer the primary download link in the cloud, where clients can get all the bandwidth Azure can offer. Next to that, we would have an alternative link, marked as slow, that still points to the on-premises version of the data. This means the current implementation remains fully functional even when the Azure file share has an issue. Sure, the local copy comes with significantly reduced performance, but it provides a failsafe.

The future

Well, the future lies in turning this into a solution running 100% in the cloud. Due to a large number of dependencies on various on-premises data sources, however, this is a long-term effort. We decided not to let perfection be the enemy of the good and fixed their biggest pain point today.

Conclusion

For sure, the use of a symbolic link to access an Azure file share is not something that will amaze people who have been working in the cloud for a while. It is, however, a nice example of how the use of Azure combined with on-premises services can result in a hybrid solution that solves real-world problems.

This particular scenario enables them to distribute their data sets without having to worry about bandwidth limitations on-premises. That means they do not have to invest in a bigger internet pipe and a firewall with more throughput, or port their service and all its dependencies to a full-blown Azure solution.

Sometimes successful and cost-effective solutions come in the form of little tweaks that allow us to fix pain points easily.

How to survive in the ever-changing IT world


How to survive in the ever-changing IT world is something that all IT professionals and developers want to figure out. No matter how junior or senior you are, no matter what level of expertise you have, it is a challenge for everyone, at least for those who are honest with themselves and others.

The IT world changes at a very fast pace, and longevity of technology seems to be a distant pipe dream. The only thing that has tenure in IT seems to be considered legacy and tech debt. But is that always the case? There is a flip side to fast-paced changes: the faster things change, the shorter they last.

The faster things change, the shorter they last

Today’s IT world is changing extremely rapidly in terms of technologies used, hardware and software lifecycle management, trends, and hypes. I have always said IT looks a lot like fashion-driven show business.

Everyone strives to keep up with these changes, both individuals and organizations. Can we keep up in the end, or will we outrun ourselves? And if we can, at what cost? Quality, longevity, minimum viable products, bugs, journeys instead of products, etc.

The skill sets needed to make this happen are grown and groomed, not produced at will. Meanwhile, investing in education and training often gets only lip service: because the requirements change so fast, the willingness to invest diminishes. And as such, we make our problems even bigger.

Join us for a chat

Register here: Upcoming Webinar: How to Survive in The Ever-Changing IT World – VirtualMetric – Infrastructure Monitoring Blog

Find out in a chat with Yusif Ozturk (Co-Founder and Chief Software Architect at @VirtualMetric) and me how we view and think about these challenges. Companies must ensure their software uses up-to-date technologies and must find the best talent experienced with those technologies. You need to adapt pretty fast so that you can survive. When a company creates a new software product and develops a new solution, everything changes in a short time. There are new programming languages, new frameworks, and new technologies coming to the market. Among the challenges that organizations face are these fast technology shifts, the new skills needed, new experienced staff, etc. Maybe you will find some insights on how to survive in the ever-changing IT world.

SecretStore local vault extension

What is the SecretStore local vault extension

The SecretStore local vault extension is a PowerShell module extension vault for Microsoft.PowerShell.SecretManagement. It is a secure storage solution that stores secret data on the local machine. It is based on .NET cryptography APIs and works on Windows, Linux, and macOS thanks to PowerShell Core.

The secret data is stored at rest in encrypted form on the file system and decrypted when returned to a user request. The store file data integrity is verified using a cryptographic hash embedded in the file.

The store can be configured to require a password or to operate password-less. Requiring a password adds to defense-in-depth, since password-less operation relies solely on file system protections. Password-less operation still encrypts the data, but the encryption key is stored in a file and is accessible. Another configuration option is the password timeout, which is 15 minutes by default. For automation purposes, you can use Unlock-SecretStore to enter the password for the current PowerShell session for the duration of the timeout period.
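As a sketch, configuring the store for such an automation scenario could look like this; the one-hour timeout is just an example value.

#Require a password, allow no interactive prompting, and set a one-hour password timeout
Set-SecretStoreConfiguration -Authentication Password -PasswordTimeout 3600 -Interaction None
#Unlock the store once so the rest of the session can read secrets without prompts
Unlock-SecretStore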

Testing the SecretStore local vault extension

Below you will find a demonstration script where I register a vault of the type SecretStore. This is a local vault extension that creates its data and configuration files in the currently logged-in user scope. You specify the vault type to register via the ModuleName parameter.

$MySecureVault1 = 'LocalSecVault1'
#Register Vault1 in secret store
Register-SecretVault -ModuleName Microsoft.PowerShell.SecretStore -Name $MySecureVault1 -DefaultVault

#Verify the vault is there
Get-SecretVault

#Add secrets to Vault 1
Set-Secret -Name "DATAWISETECH\serverautomation1in$MySecureVault1" -Secret "pwdserverautom1" -Vault $MySecureVault1
Set-Secret -Name "DATAWISETECH\serverautomation2in$MySecureVault1" -Secret "pwdserverautom2" -Vault $MySecureVault1
Set-Secret -Name "DATAWISETECH\serverautomation3in$MySecureVault1" -Secret "pwdserverautom3" -Vault $MySecureVault1

#Verify secrets
Get-SecretInfo

Via Get-SecretInfo I can see the three secrets I added to the vault LocalSecVault1.

The three secrets I added to vault LocalSecVault1.

The configuration and data are stored in separate files. The file location depends on the operating system. For Windows this is %LOCALAPPDATA%\Microsoft\PowerShell\secretmanagement\localstore. For Linux and macOS it is $HOME/.secretmanagement/localstore/.

The localstore files.

As you can see, this happens under the user context. Support for an all-users or machine-wide scope is a planned future capability, but it is not available yet. Access to the SecretStore files is restricted via NTFS file permissions (Windows) or access control lists (Linux), limiting access to the specific user/owner.
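You can inspect those files yourself. On Windows, for example:

#List the SecretStore files in the current user scope (Windows)
Get-ChildItem "$env:LOCALAPPDATA\Microsoft\PowerShell\secretmanagement\localstore"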

Multiple Secret stores

It is possible in SecretManagement to register an extension vault multiple times. The reason for this is that an extension vault may support different contexts via the registration VaultParameters.

At first, it might seem that this means we can create multiple SecretStores, but that is not the case. The SecretStore vault currently operates under the scope of the currently logged-on user at a very specific path. As a result, it confused me when I initially tried to create multiple SecretStores: I could see all the secrets of the other stores. At least, that is what I initially thought happened, and consequently I had a little security scare. In reality, I had just registered different vault names to the same SecretStore, as there is only one.

$MySecureVault2 = 'LocalSecVault2'
$MySecureVault3 = 'LocalSecVault3'

#Register two more vaults to secret store
Register-SecretVault -ModuleName Microsoft.PowerShell.SecretStore -Name $MySecureVault2 -DefaultVault
Register-SecretVault -ModuleName Microsoft.PowerShell.SecretStore -Name $MySecureVault3 -DefaultVault

#Note that all vaults contain the secrets of Vault1
Get-SecretInfo
 
#Add secrets to Vault 2
Set-Secret -Name "DATAWISETECH\serverautomation1in$MySecureVault2" -Secret "pwdserverautom1" -Vault $MySecureVault2
Set-Secret -Name "DATAWISETECH\serverautomation2in$MySecureVault2" -Secret "pwdserverautom2" -Vault $MySecureVault2
Set-Secret -Name "DATAWISETECH\serverautomation3in$MySecureVault2" -Secret "pwdserverautom3" -Vault $MySecureVault2

#Note that all vaults contain the secrets of Vault1 AND Vault 2
Get-SecretInfo
Note that every registered local store vault basically sees the same SecretStore, as they all point to the same files.

Now, if you think that multiple SecretStores per user scope would be a good idea, there is an open request to support this: Request: Multiple instances of SecretStore · Issue #58 · PowerShell/SecretStore (github.com).

KeePass SecretManagement extension vault


SecretManagement and SecretStore can work with SecretManagement extension vault modules. These can be found in the PowerShell Gallery using the “SecretManagement” search tag. Examples include extension vaults for KeePass, LastPass, and Azure Key Vault.
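You can search for them straight from PowerShell as well:

#Find SecretManagement extension vault modules in the PowerShell Gallery
Find-Module -Tag 'SecretManagement' -Repository PSGallery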

I use KeePass and as such, the KeePass SecretManagement extension vault is the one I will demonstrate. First of all, install the module. Note that I chose to use the most recent beta version, which is 0.9.2-beta0008 at the time of writing this blog post.

Install-Module -Name SecretManagement.KeePass -AllowPrerelease

Naturally, if you haven’t installed the SecretManagement and SecretStore modules yet, you need to do that first to be able to play along.

Install-Module Microsoft.PowerShell.SecretManagement, Microsoft.PowerShell.SecretStore

With that taken care of, we can start testing the KeePass SecretManagement extension vault.

Using the KeePass SecretManagement extension vault

I created a demo KeePass .kdbx file in which I stored some example user names with their passwords. This file has a master password. You can also use a key or the Windows user account if you want to do so.

Our demo .kdbx file

Now I will register the KeePass file as a vault.

Register-KeePassSecretVault -Name 'WorkingHardInITKeePassVault' -Path 'C:\SysAdmin\Authentication\workinghardinit.kdbx' -UseMasterPassword
Register the KeePass vault.

As you can see, this prompts you for the KeePass master password.

Enter the Keepass Master Password for: C:\SysAdmin\Authentication\workinghardinit.kdbx
Password for user Keepass Master Password:

Now that is done, I will unlock the KeePass secret vault so I can use it in automation without being prompted for it. By default, it remains unlocked for 900 seconds (15 minutes). This is configurable.

Unlock-KeePassSecretVault -Name 'WorkingHardInITKeePassVault'
Unlock the KeePass vault by entering the store password and, if not opened yet, the KeePass master password.
$FCcreds = Get-Secret -Name 'FC Switch 01' -Vault 'WorkingHardInITKeePassVault'
$FCSwitchUser = $FCcreds.GetNetworkCredential().UserName
$FCSwitchPwd = $FCcreds.GetNetworkCredential().Password
Write-Host -ForegroundColor Green "FC Switch 01 username $FCSwitchUser has $FCSwitchPwd for its password"
We grab the username and password for the FC Switch 01 entry in the KeePass secret vault.

Note that the entry for the secret comes back as a credential object. As a result, we can use the properties of its network credential to obtain the username and password in plain text. That is to say, we can (and should) use the credentials directly; you do not need to show or use the password in plain text. I did this here only to show you that we got the correct values back.

Credentials ready to use.
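In practice, you would hand the credential object straight to whatever cmdlet accepts one. A sketch, assuming the Posh-SSH module and a hypothetical management address for the switch:

#Use the credential directly, no plain-text password needed (Posh-SSH module assumed)
New-SSHSession -ComputerName 'fcswitch01.datawisetech.corp' -Credential $FCcreds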

Updating and adding secrets

Currently, updating existing secrets with Set-Secret is not supported.

Let’s hope that they allow updating, and document using the hash table to enter metadata better, in the future.

For now, we need to remove the existing secret first and re-enter the information. We’ll see how this evolves.

Remove-Secret -Name 'FC Switch 01' -Vault 'WorkingHardInITKeePassVault'
$FCcreds = Get-Credential -UserName 'fcadmin'
Set-Secret -Name 'FC Switch 01' -Secret $FCcreds -Vault 'WorkingHardInITKeePassVault'

Finally, the good news is that there is also a PowerShell KeePass module that you can use for that sort of work, so you have the means in PowerShell to do so. See Getting Started · PSKeePass/PoShKeePass Wiki (github.com).
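As a sketch of what that could look like with PoShKeePass (the profile name is hypothetical; see the wiki for the exact parameters):

#Minimal sketch, assuming the PoShKeePass module; the profile name is hypothetical
Import-Module PoShKeePass
New-KeePassDatabaseConfiguration -DatabaseProfileName 'WorkingHardInIT' -DatabasePath 'C:\SysAdmin\Authentication\workinghardinit.kdbx' -UseMasterKey
Get-KeePassEntry -DatabaseProfileName 'WorkingHardInIT' -AsPlainText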

Conclusion

That was fun, was it not? The SecretManagement and SecretStore modules are going places. I hope this helps and happy scripting!