I have had the distinct pleasure of being one of the first people to implement SMB over QUIC, in a proof of concept I did with Windows Server 2022 Azure Edition while it was in public preview.
That was a fun and educational exercise, and I learned a lot. As a result, I decided to write a lab and test guide, primarily for my own reference, but also to share my experience with others.
I am convinced it will fill a need for people that require remote access to SMB file shares without a VPN. Next to that, the integration with the KDC proxy service makes it a Kerberos-integrated solution. In addition, the KDC proxy service has the added benefit of allowing for remote password changes.
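To give you an idea of how simple the client side is, below is roughly what mapping a share over QUIC looks like. Consider it a minimal sketch, assuming a hypothetical file server fs01.contoso.com that already has a certificate mapped for SMB over QUIC on the server side (via New-SmbServerCertificateMapping).

# Map a drive over QUIC from a capable client such as Windows 11.
# TCP/445 can stay blocked at the edge; SMB over QUIC uses UDP/443.
New-SmbMapping -LocalPath 'Z:' -RemotePath '\\fs01.contoso.com\data' -TransportType QUIC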
If you are interested in Microsoft and QUIC, I have some good news for you. Recently a new article, SMB over QUIC Technology | StarWind Blog (starwindsoftware.com), was published. It is the first in a series about what Microsoft is working on in regards to QUIC. While not without some controversy, QUIC addresses a number of issues that connectivity over “the internet at large” has been dealing with:
It leverages UDP instead of TCP.
TLS 1.3 is built into the protocol.
It reduces round trips (RTT) during connection and encryption setup.
It handles and optimizes flow control and loss recovery.
Over the internet, and with mobile clients, this is a big deal. Since QUIC is secure by default, people really started thinking about where it can be used to improve things for all involved.
I think QUIC is going to become more and more important, and this article positions QUIC in the Microsoft ecosystem. So head over there, read it, and let me know what you think.
TLS 1.3, QUIC, HTTP/3, and SMB 3.1.1 are shaking things up a bit by challenging TCP. Microsoft dropped QUIC into Windows Server 2022 Azure Edition. That went into public preview last week, and I dove into the lab to figure out what I could do with it.
As a technologist, I am having a lot of fun testing this out in the lab. Last weekend I was busy with SMB over QUIC and QUIC in IIS. I learned a lot and have made up my mind that I can use this in the real world to solve requirements. I will share my findings and musings with you in the near future. But today, start with an introduction in SMB over QUIC Technology | StarWind Blog (starwindsoftware.com).
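For reference, this is the kind of toggle I was experimenting with for QUIC in IIS. Take it as a sketch, assuming Windows Server 2022, where HTTP/3 support in http.sys is controlled by the EnableHttp3 registry value and advertised to browsers via an Alt-Svc response header.

# Enable HTTP/3 in http.sys (a reboot is required for this to take effect)
New-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\HTTP\Parameters' `
    -Name 'EnableHttp3' -Value 1 -PropertyType DWord -Force

# Advertise HTTP/3 on an IIS site; browsers connect over TCP first and
# switch to QUIC once they see the Alt-Svc header.
Import-Module WebAdministration
Add-WebConfigurationProperty -PSPath 'IIS:\Sites\Default Web Site' `
    -Filter 'system.webServer/httpProtocol/customHeaders' `
    -Name '.' -Value @{ name = 'Alt-Svc'; value = 'h3=":443"; ma=86400' }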
We recently used a symbolic link to an Azure file share to transparently replace a local folder in which data sets are cached for download. That means that the existing service transparently copies the data sets to an Azure file share without having to change anything in the code to do so. With a small adaptation of the code, we can now provide download links to data in the Azure file share so this process is also transparent for the clients downloading the data sets.
You can already guess the reason for this exercise. We did this to fix a bandwidth issue on-premises by creating an easy workaround with minimal code changes. As more and more clients download more and more data sets, this service consumes too much bandwidth. This means we have to throttle the service and/or apply QoS to it. While this helps the other services using that internet connection, it does nothing to improve download speeds for the clients. This is just an example and is not meant as architectural or design advice. It is an interim fix to an existing problem. This trick is also used with AKS, for example.
How to add a symbolic link to an Azure file share
Create an Azure file share
Create a storage account and create a file share.
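With the Az PowerShell module, that boils down to something like the following sketch. The resource group, location, and SKU are made up for illustration; the datasets account and fscache share match the example values used below.

# Create a resource group, a storage account, and a file share
New-AzResourceGroup -Name 'rg-datasets' -Location 'westeurope'
New-AzStorageAccount -ResourceGroupName 'rg-datasets' -Name 'datasets' `
    -Location 'westeurope' -SkuName 'Standard_LRS'

# Grab the storage context and create the file share in it
$ctx = (Get-AzStorageAccount -ResourceGroupName 'rg-datasets' -Name 'datasets').Context
New-AzStorageShare -Name 'fscache' -Context $ctx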
Handling credentials
Dealing with the credentials needed for this is easy. All we need to do is add the information to Credential Manager as a Windows credential: the user, the password, and the file share UNC path. Note that the password here is our storage account key.
Grab the info you need from the “connect” settings for your Azure file share. We will not map the file share to a drive, so there is no need to run this PowerShell script.
So in this example that is:
Internet or network address: \\datasets.file.core.windows.net\fscache
User name: localhost\datasets
Password: real2Nonsense4Showing8AfakeStorage28Accountkey/goobledeGookStuffa/AndSomeMoreNonsentMD==
We will add these credentials to the Credential Manager as Windows Credentials.
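You can do this via the Credential Manager GUI, but cmdkey gets it done from a prompt just as well. A sketch using the example values above (the storage account key shown is fake):

# Store the Azure file share credentials as a Windows credential
cmdkey /add:datasets.file.core.windows.net `
       /user:localhost\datasets `
       /pass:real2Nonsense4Showing8AfakeStorage28Accountkey/goobledeGookStuffa/AndSomeMoreNonsentMD==

# Verify the credential was stored
cmdkey /list:datasets.file.core.windows.net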
That is it. If you entered everything correctly, this will work.
Creating the symbolic link
Once you have added the credentials, creating the symbolic link is very easy.
You do need to take care that you create the symbolic link in the right place in your folder structure. But other than that, that is all you need to do.
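In our case, that looks like the sketch below. The E:\Download\Cache path is the one from the pro tip further down; mklink /D from an elevated command prompt does the same job.

# Create a directory symbolic link that points to the Azure file share
# (run elevated; rename or remove the original local folder first)
New-Item -ItemType SymbolicLink -Path 'E:\Download\Cache' `
    -Target '\\datasets.file.core.windows.net\fscache'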
The symbolic link is available and can be used transparently by the service/application.
To test the file share in Azure, you can upload or download data via azcopy or Azure Storage Explorer. The download functionality in our case is handled in the code, but here is a quick example of how to do a download via azcopy using a shared access signature (SAS).
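A sketch with a made-up file name and a truncated SAS token; generate the SAS in the portal or, for example, with New-AzStorageShareSASToken:

# Download a data set from the Azure file share using a SAS token
azcopy copy "https://datasets.file.core.windows.net/fscache/dataset01.zip?sv=...&sig=..." "E:\Temp\dataset01.zip"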
Pro tip: if you need to remove the symbolic link but keep the data, use rmdir “E:\Download\Cache” and not del “E:\Download\Cache”, or you will delete the data. That might not be what you want.
Next steps
Mind you, this was the easy and quick fix for a problem this service was facing. This is not a design or an architecture. We are considering replacing the symbolic link solution with Azure File Sync. With a bandwidth cap and QoS on-premises, we would offer the primary download link to the cloud, where the clients can get all the bandwidth Azure can offer. Next to that, we would have an alternative link, marked as slow, that still points to the on-prem version of the data. This means the current implementation is still fully functional even when the Azure file share has an issue. Sure, the local copy comes with significantly reduced performance, but it provides a failsafe.
The future
Well, the future lies in turning this into a solution running 100% in the cloud. Due to a large number of dependencies on various on-premises data sources, this is a long-term effort. We decided not to let perfection be the enemy of the good and fixed their biggest pain point today.
Conclusion
For sure, the use of a symbolic link to access an Azure file share is not something that will amaze people who have been working in the cloud for a while. It is, however, a nice example of how the use of Azure combined with on-premises services can result in a hybrid solution that solves real-world problems.
This particular scenario enables them to distribute their data sets without having to worry about bandwidth limitations on-premises. That means they do not have to invest in a bigger internet pipe and a firewall with more throughput, or port their service and all its dependencies to a full-blown Azure solution.
Sometimes successful and cost-effective solutions come in the form of little tweaks that allow us to fix pain points easily.
How to survive in the ever-changing IT world is something that all IT professionals and developers want to figure out. No matter how junior or senior you are, no matter what level of expertise you have, it is a challenge for everyone. At least, if they are honest with themselves and others.
The IT world changes at a very fast pace, and longevity of technology seems to be a distant pipe dream. The only thing that has tenure in IT seems to be considered legacy and tech debt. But is that always the case? There is a flip side to fast-paced change: the faster things change, the shorter they last.
The faster things change, the shorter they last
Today’s IT world is changing extremely rapidly in terms of technologies used, hardware and software lifecycle management, trends, and hypes. I have always said IT looks a lot like fashion-driven show business.
Everyone strives to keep up with these changes, both individuals and organizations. Can they keep up in the end, or will we outrun ourselves? And if we can, at what cost? Quality, longevity, minimum viable products, bugs, journeys instead of products, etc.
The skill sets needed to make this happen are grown and groomed, not produced at will. All this while investing in education and training often gets only lip service: because the requirements change so fast, the willingness to invest diminishes. And as such, we make our problems even bigger.
Join us for a chat
Find out in a chat with Yusif Ozturk (Co-Founder and Chief Software Architect at @VirtualMetric) and me how we view and think about these challenges. Companies must ensure their software uses up-to-date technologies and must find the best talent experienced with those technologies. You need to adapt pretty fast so that you can survive. When a company creates a new software product and develops a new solution, everything changes in a short time. There are new programming languages, new frameworks, and new technologies coming to the market. Among the challenges that organizations face are these fast technology shifts, the new skills needed, finding experienced staff, etc. Maybe you will find some insights on how to survive in the ever-changing IT world.