Pollution of the Gene Pool: A Real-Life "FTP over SSL" Story

Imagine you get asked to implement a secure temporary data exchange solution for known and authenticated clients, as fast as possible. You're told to use what's available already, so no programming, buying products or using services. The data size can be a few KB to hundreds of megabytes, or even more. At that moment they were already using FTP, both anonymous and with clear text authentication, but obviously that's very insecure. You're told they need the solution a.s.a.p., meaning by the end of the week. So what do you do? You turn to FTP over SSL in Windows 2008 (IIS 7.0, Release To Web -RTW- download) or Windows 2008 R2 (IIS 7.5, integrated), as the one thing the company did allow for was the cost of a commercial SSL certificate, and they had Windows 2008. If you want to read up on configuring that, have a look at the following entries http://learn.iis.net/page.aspx/304/using-ftp-over-ssl/ and http://learn.iis.net/page.aspx/309/configuring-ftp-firewall-settings/ where you'll find lots of practical guidance.

You set it all up and test it: user folder isolation, NTFS permissions regulated with domain groups, virtual directory links for common data folders shared between users, etc. It all looks pretty good and is very cost effective. Customers start using it, and if they have a problem they are helped out by the service desk. Good, mission accomplished you'd think. Except for someone who is not having any of that insecure, firewall-breaching FTP over SSL and starts kicking and screaming. The gross injustice of being forced to open some ports in their firewall is unacceptable. That same someone has been using clear text authentication for FTP downloads for many years and never even blinked at it, but has now discovered "security".

FTP in a Security-Conscious World

We live, for all practical purposes, in a NAT/PAT and firewall world. These things became necessities of life after the FTP protocol was invented. You see, IPv4 has come a long way since its creation, as have the protocols used over it. But originally, by design, it was not meant to provide security, just communications. Security in those early days was armed military personnel guarding physical buildings where you had access to the network, and if you didn't belong there they'd just shoot you. As a result TCP/IP is a lot like a flower power love child living in a very secure universe where everyone loves everyone. Fast forward 30 years and that universe looks more like something out of a post-apocalyptic movie like Doomsday or Mad Max. If you don't have security you become road kill, and rather fast. So we built security on top of TCP/IP and we retrofitted it to the stack (a lot of the security in IPv6 was back ported to IPv4). We also invented firewalls acting like the walls of medieval castles. To add some more complexity, there was not enough IPv4 love (i.e. public IP addresses) to go around, which makes addresses expensive and/or unavailable. Network Address Translation came to the rescue. So we ended up where we are today: hundreds of millions of private IP range networks that are connected to the internet through NAT/PAT and are protected by firewalls. The size of these private networks ranges from huge corporate entities in the Fortune 500 list to all those *DSL and cable modem/routers in our homes.

All of this makes the FTP protocol go "BOINK". FTP needs two connections and quite liberal settings to work. But as the security story above indicates, the internet world has moved from free love to the AIDS era, so that doesn't fly anymore. We need and have protection. But we also need to make FTP work.

Let's first look at the basics. FTP client software needs two connections between the client and the server. One is the control channel (port 21 on the server side), the other is the data channel (port 20 on the server side). On the client side dynamic ports are used (1024-65535). These two connections present a problem for firewalls.

So port 21 needs to be allowed through the firewall on the FTP server side. That's pretty easy, but it's not enough. Port 21 is the control channel that we use to connect, authenticate and even delete and create directories if you have the correct file system permissions. To view and browse/traverse folder structures and to exchange data, we need that data channel to pass through the firewall as well. That's a dynamic port on the client that the server needs to connect to from port 20. Firewall admins and dynamic ports don't get along very well. You can't say "open range 1024 to 65535 for me will you?" to firewall administrators without being escorted out of the building by physical security people.
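
To see which operations ride the control channel and which ones need that separate data connection, here's a minimal sketch using Python's standard ftplib module; the host name, credentials and file name are made up for the example.

```python
from ftplib import FTP

# Hypothetical server and credentials, purely for illustration.
ftp = FTP("ftp.example.com")           # control channel: TCP connection to port 21
ftp.login("someuser", "somepassword")  # control channel: USER/PASS commands
ftp.mkd("incoming")                    # control channel only: create a directory
ftp.cwd("incoming")                    # control channel only: change directory

# Listing and transferring files opens a second, separate data connection.
names = ftp.nlst()                     # data channel: directory listing
with open("report.zip", "rb") as f:
    ftp.storbinary("STOR report.zip", f)  # data channel: upload the file contents
ftp.quit()
```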

But still FTP seems to work, so how does that happen? For that purpose a lot of firewall/NAT devices make life a bit more secure and a lot easier by proactively looking at the network traffic for FTP packets and opening the required dynamic port automatically for the duration of the connection. This is called stateful FTP inspection. It is the default behavior of a lot of SOHO firewall/NAT devices, so most people don't even realize it is happening. You do not need to define rules that punch holes in the firewall. Instead the firewall punches them transparently when needed for FTP traffic. This is a risk, as it happens without the users even being aware of it, let alone knowing what ports are being used. It isn't very pretty, but it works quite well.

Here’s an illustration of Active FTP in action

[Figure: Active FTP connection flow]

You see, initially there was only Active FTP, which is very client-side firewall unfriendly because it means opening up dynamic ports on the client side for traffic initiated by a remote FTP server. This needed to be fixed. That fix is Passive FTP, described in RFC 1579 "Firewall Friendly FTP". Here it is the server that listens passively on a dynamic port and the client that connects actively to that port. So Passive FTP makes the automatic punching of holes for incoming FTP traffic in the firewall/NAT devices more secure on the client side. With passive FTP the server does not initiate the data connection, the client does. When the client contacts the FTP server on port 21 it gets a response, then the client asks for passive FTP using the PASV command. The FTP server responds by setting up a dynamic port to which the client can connect, and tells the client about it in the "227 Entering Passive Mode" reply to that PASV command. Outgoing traffic initiated on the client from a dynamic port to a port on the FTP server is more firewall friendly (i.e. more secure) for the clients and thus more easily accepted by the security administrators. On the server side it is somewhat less secure.

[Figure: Passive FTP connection flow]
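
To make the difference concrete, here's a small sketch, again with Python's ftplib against a hypothetical server, that switches between active and passive mode; with the debug level turned up you can watch the PORT or PASV exchange go over the control channel.

```python
from ftplib import FTP

ftp = FTP("ftp.example.com")     # hypothetical host, for illustration only
ftp.set_debuglevel(1)            # print the raw control-channel dialogue
ftp.login("someuser", "somepassword")

ftp.set_pasv(False)              # active FTP: client sends PORT, server connects back
ftp.retrlines("LIST")            # watch for "PORT h1,h2,h3,h4,p1,p2" in the output

ftp.set_pasv(True)               # passive FTP (the ftplib default): client sends PASV
ftp.retrlines("LIST")            # watch for the "227 Entering Passive Mode (...)" reply
ftp.quit()
```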

Be aware that there are FTP clients which you need to explicitly configure for passive FTP (internet browsers, basic FTP client software). Some old or crappy clients don't even support it, but that should be rare nowadays. When the client software automatically tries both active and passive mode to connect, the user often doesn't even know what's being used, which can lead to some confusion. Also keep in mind that often multiple firewalls are involved, both on the hosts and at the edge of both the client and FTP server networks, and they all need the proper configuration.

As an example of client-side stuff to keep in mind: configuring Internet Explorer to use Passive FTP and making sure FTP can also be used in Windows Explorer.

[Screenshot: configuring Internet Explorer to use Passive FTP]

[Screenshot: enabling FTP use in Windows Explorer]

Improving FTP Security

One of the ways to reduce the number of ports that are used and as a result must be opened on the firewalls involved is to use a small predefined range of dynamic ports. Good FTP servers allow for this and so do IIS 7 and IIS 7.5. This reduces the number of ports to be allowed through and thus the conflicts with the security people enormously.
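
In IIS 7.0/7.5 this range is exposed as the Data Channel Port Range under FTP Firewall Support. As a rough cross-platform illustration of the same idea (not IIS), here's a minimal server sketch using the third-party pyftpdlib package; the port range, account and home directory are just examples.

```python
from pyftpdlib.authorizers import DummyAuthorizer
from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer

authorizer = DummyAuthorizer()
authorizer.add_user("someuser", "somepassword", "/srv/ftp/someuser", perm="elradfmw")

handler = FTPHandler
handler.authorizer = authorizer
# Restrict passive data connections to a small, well-known range so the
# firewall only has to allow 50000-50049 instead of 1024-65535.
handler.passive_ports = range(50000, 50050)

server = FTPServer(("0.0.0.0", 21), handler)
server.serve_forever()
```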

Now when we use FTP over SSL it becomes a practical necessity to use a small pre-defined range of dynamic ports. Sniffing the packets for FTP commands so the dynamic ports can be opened automatically just doesn't work anymore, because the traffic is encrypted. Opening thousands of ports is not an option either; those would become targets of attacks. Another hiccup you can trip over is that some firewalls by default block SSL/TLS traffic on any port other than 443 (HTTPS).

So what do we need for FTP over SSL/TLS:

· Use Passive FTP and port 21 (Explicit SSL) or 990 (Implicit SSL)

· Select a small range of dynamic ports to define on the firewall and communicate that range to your clients. It needs to be opened in the outgoing rules of the clients that want to connect and in the incoming/outgoing rules on the server side. Both the FTP server and the FTP clients need to respect this range.

· Use an FTP client that supports FTP over TLS. I used passive FTP with Explicit SSL to maintain the default port 21 for the control channel (see the client sketch after this list). If the client doesn't negotiate data encryption we refuse the connection. See FTPS on http://en.wikipedia.org/wiki/FTPS for more information on this.

· Buy a commercial SSL certificate from a trusted source (VeriSign, Comodo, GoDaddy, Thawte, Entrust, …)
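
Here's the client sketch referred to in the list above: a minimal passive, explicit SSL session using Python's ftplib (FTP_TLS). The host, credentials and file name are assumptions for the example; the point is that the control channel is upgraded to TLS before the credentials are sent, and that PROT P is issued so the data channel is encrypted as well.

```python
from ftplib import FTP_TLS

# Hypothetical server and credentials, for illustration only.
ftps = FTP_TLS("ftp.example.com")       # explicit SSL: plain connect on port 21
ftps.login("someuser", "somepassword")  # login() upgrades the control channel (AUTH TLS) first
ftps.prot_p()                           # PROT P: encrypt the data channel too
ftps.set_pasv(True)                     # passive mode, so the client opens the data connection

ftps.retrlines("LIST")                  # directory listing over the encrypted data channel
with open("report.zip", "wb") as f:
    ftps.retrbinary("RETR report.zip", f.write)
ftps.quit()
```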

By using a commercial SSL certificate that securely identifies and verifies the FTP server, by limiting the communication through the firewall to some well-defined ports and by only allowing that traffic between a limited number of hosts, the risks are reduced immensely. The risks avoided are connecting to falsified hosts, password sniffing and data theft. The traffic that is allowed is far less risky and dangerous than anonymous or, what they used to do and allow, clear text authentication to non-verified servers on the internet. But still some people insisted that the FTP over SSL solution was introducing a serious security risk. Really? And this isn't the case with passive FTP without SSL? Sure it is, you just don't realize that it happens, and you allow FTP traffic to a wide range of dynamic ports and unknown hosts. So frankly, crying wolf about properly configured FTP over SSL is like using "coitus interruptus" for birth control because you've read that condoms are not 100% failsafe. You'll end up pregnant and infected with AIDS. That kind of logic is pure gene pool pollution. It's also proof of an old saying: "never argue with an idiot, they drag you down to their level and beat you with experience".

Beware of NAT/PAT

As we mentioned in the beginning, NAT has its own issues to deal with, so we still have to touch on the subject of NAT/PAT with FTP servers. Let's first look at what is needed to make this work. You have already seen how the basics of a passive FTP data connection work. The client sends a PASV command and the server responds by entering passive mode and telling the client what port to use.

Now with NAT/PAT devices the IP address needs to be swapped around. To do this, these devices sniff the network traffic for the PASV exchange to find out what port is used and turn the FTP server's response from "227 Ok, Entering Passive Mode (192,168,1,32,203,8)" into "227 Ok, Entering Passive Mode (193,211,10,27,203,8)".

As you can see, the private IP address (the first four numbers) is swapped for the public IP address on which the FTP server is reachable, while the port to use is retained. The last two numbers encode the port number as follows: 203*256+8 = 51976. When the client connects, the reverse process takes place and the public IP is swapped for the private one.

[Figure: a NAT/PAT device rewriting the passive FTP 227 reply]
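
As a sanity check on that arithmetic, here's a tiny sketch that pulls the address and port out of a 227 reply the same way a client (or a NAT helper) has to:

```python
import re

reply = "227 Ok, Entering Passive Mode (193,211,10,27,203,8)"

numbers = re.search(r"\((\d+,\d+,\d+,\d+,\d+,\d+)\)", reply).group(1).split(",")
ip = ".".join(numbers[:4])                      # first four numbers form the IP address
port = int(numbers[4]) * 256 + int(numbers[5])  # last two numbers encode the port

print(ip, port)   # -> 193.211.10.27 51976  (203*256 + 8 = 51976)
```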

You can already see where this is going with SSL. The NAT/PAT device cannot sniff the traffic for the PASV and PORT exchanges to see on what dynamic port the client should establish the data channel, and due to the encryption it cannot alter the 227 reply to swap around the IP addresses.

The best solution to this is to specify a firewall helper address for passive FTP, which we can set to the public IP address of our FTP server. Your FTP server must support this; you'll find that IIS 7.0 and IIS 7.5 do.
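
In IIS 7.0 and IIS 7.5 that helper address is the "External IP Address of Firewall" setting under FTP Firewall Support. As an illustration of the same concept outside IIS, here's a hedged sketch with the third-party pyftpdlib package (which needs PyOpenSSL for TLS and calls the helper a masquerade address); the addresses, certificate path and account are made up.

```python
from pyftpdlib.authorizers import DummyAuthorizer
from pyftpdlib.handlers import TLS_FTPHandler
from pyftpdlib.servers import FTPServer

authorizer = DummyAuthorizer()
authorizer.add_user("someuser", "somepassword", "/srv/ftp/someuser", perm="elradfmw")

handler = TLS_FTPHandler
handler.authorizer = authorizer
handler.certfile = "/etc/ssl/ftp-server.pem"   # the commercial certificate + private key
handler.tls_control_required = True            # refuse clients that won't encrypt the control channel
handler.tls_data_required = True               # refuse clients that won't encrypt the data channel
handler.passive_ports = range(50000, 50050)    # the small, firewall-friendly range
# Advertise the public IP in PASV replies, since the NAT device can no
# longer rewrite them inside the encrypted control channel.
handler.masquerade_address = "193.211.10.27"

server = FTPServer(("0.0.0.0", 21), handler)
server.serve_forever()
```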

Other possible solutions and workarounds are:

· FTP clients that "guess" the address to use when the IP address in the PASV reply doesn't work (that would be an internal private range IP address). They then try to use the public IP address to establish the connection, which can work, as the chance is that it is the public IP address of the FTP server or the public IP address of the NAT/PAT device. No guarantees are given that this will work.

· NAT/PAT devices sometimes allow for specified ranges to be forwarded to a specific IP address. So you could configure this to be the case for the small range of dynamic ports you defined for Passive FTP.

· Some FTP servers support the EPSV command (Extended Passive Mode), which sends only the port; the IP address used is the one from the control connection (see the sketch below).
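
For completeness, here's the sketch mentioned in that last bullet: the 229 reply to EPSV carries only the port, and the client simply reuses the IP address of the existing control connection.

```python
import re

# EPSV reply: only the data port is sent; the IP address is implicitly
# the one already used for the control connection.
reply = "229 Entering Extended Passive Mode (|||51976|)"

port = int(re.search(r"\(\|\|\|(\d+)\|\)", reply).group(1))
print(port)   # -> 51976
```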

Be Mindful of Load Balancing on Server and/or Client Side

If load balancers are in play, we must make sure that the communication always goes via the same node and IP address when using SSL, or you'll break SSL. If multiple IP addresses are used to route certain traffic via a certain device, make sure the FTP client doesn't switch to another IP address for the data connection, as this will fail. Both the control and the data channel must use the same IP address or passive FTP will fail, even without using SSL. Also don't forget some customers use load balancers to route traffic based on purpose, cost, redundancy, etc. So this is also a concern on the client side. In the IIS log you'll see that it complains about IP addresses that do not match. I've had this happen at 2 customer sites; both were easily fixed but took some intervention by their IT staff. Luckily they both had a competent SMB IT consulting firm looking after their infrastructure.

Table with FTP risks and mitigations

RISK                                   MITIGATION                                   RESULT
Server connects to client              Use passive FTP                              Client initiates the connection
Dynamic ports in use                   Select a smaller fixed range of ports        Fewer ports to open on the firewall
Server not verified                    Use a commercial SSL certificate             Server can be verified
Authentication not encrypted           Use SSL for authentication                   Authentication encrypted
Data not encrypted                     Use SSL for data transport                   Data transport encrypted
Connections from & to unknown hosts    Allow only trusted clients and/or servers    No more FTP from/to any host

Direct Access Step By Step Guide Version 1.2 released

I’m about to start work on a Windows 2008 R2 / Windows 7 Direct Access project and while gathering some resources (I played with it in the lab last fall) I noticed the Step by Step guide has been updated to version 1.2 which was published on June 18th 2010. It’s a great kick start for demoing Direct Access in a lab for management. Grab it here. http://www.microsoft.com/downloads/en/confirmation.aspx?familyId=8d47ed5f-d217-4d84-b698-f39360d82fac&displayLang=en. If you’re hooked and need more info, check out the Direct Access pages on TechNet: http://technet.microsoft.com/en-us/network/dd420463.aspx

Some people complain Direct Access is (overly) complicated. Well, it's not a simple wizard you can run or some SOHO NAT device that you plug in, but come on people, we're IT Pros. We did and do more complicated stuff than that. As a matter of fact I remember some feedback John Craddock got last year at Tech Ed Europe (2009). Some consultancy firm employees told him he should not make it look that easy. Organizations need consultancy to get it right. Really? Some will, some won't. I have nothing against consulting, when done right and for the right reasons. I even consult myself from time to time with partners who need a helping hand. But take note that the world does run on people, and consultants are people (really!). What they can learn, you can learn. Just put in the effort. So go have fun setting up Direct Access and giving your road warriors and IT Pros some bidirectional and transparent connectivity to company resources. To me Direct Access was one of the big selling points for Windows 7 / Windows 2008 R2. Better together indeed 🙂

The SP1 Beta Wave – E2K10 & W2K8R2

News from Tech Ed 2010 North America rolls in, and we have the announcement of the Windows 2008 R2 SP1 Beta for July 2010. The Exchange 2010 SP1 Beta became available today! I'm grabbing it 🙂

I wouldn't be surprised to see the final releases of these products announced at Tech Ed 2010 EMEA. Now I also wouldn't mind if they came sooner, given the new and improved feature set both service packs offer, but I'm not really counting on that.

Bob Muglia's live-streamed keynote at Tech Ed 2010 North America is nearing its end by now and he's pretty upbeat about lots of subjects: Visual Studio 2010, Azure, System Center, Cloud, Exchange, OCS 14, Windows Phone 7, SQL Server 2008 R2, Office 2010, SharePoint 2010, Bing Maps & SDK, Avatar as a cloud case study and collaboration with Microsoft, etc.

Cloud is omnipresent, but they talk about hybrid. Making sure hybrid is cost effective is important to me. I don't need more work and costs, I need less.

Why I Find Value In A Conference

For those of you who are attending a high-quality conference, I'm going to share some tips. It's great to be able to attend a conference. Not because it's in a great city and you'll get to dine out at night, but because of the opportunities it provides us to learn and grow.

So what is the value of such a conference? Well, it is about the communication with peers and gurus. The conversations you'll have and the exchange of thoughts and ideas. The technical information you'll gather, the products and techniques you'll see. All of this will help you to direct and focus the way in which you approach your job, run your projects, plan and realize your visions and ideas. A conference, if done well, is nothing more or less than a technical education in the business of ICT: how to do things and make them work, results versus effort. On top of that you get to interact and share ideas with your peers. You can't ask for anything more.

Conferences cost a lot of money, time and energy. After all, while you're attending you have no income, or the boss pays you while you're not in the office creating value. On top of that you have to pay for the hotel, flight, daily expenses and the conference fee. So why would we do this? There's a global crisis, there's a European crisis, there's an XYZ crisis and there is economic doom and gloom all over the place. Not to mention all the results of that downturn: cutbacks, redundancy, foreclosures, failing businesses, unemployment, etc. The conference scene is not immune to a recession. Conferences are canceled, scaled down, attendance drops. The competition from blogs, online free or subscription based content, previews, betas, webcasts and computer based training can be felt as well, as has been the case for many years. Conferences need to fight to maintain high standards of technical content for an ever more demanding and skilled public in a very rapidly changing world and IT scene. But still, a conference done right is an investment worth making. It is an investment in knowledge. In return we get everything mentioned in the previous paragraph, and that is why I attend them. They make me a better "Technical Architect". That is why I really put in the effort to get the funds, create the opportunity and reserve the time to go. The benefits of a conference, if done right, cannot be denied.

What is the right way? Well, first here's the wrong way. Don't go there just to learn how to use delegates in .NET or to build a LINQ to WMI query; it's not an ordinary classroom. Do not go there to nag about some issues you had or have because you're too lazy to do research and read help files, readme texts, TechNet or MSDN. Do not go to complain about how hard it is to find information and study. Do not go because of the great location, you will see nothing of it 🙂 unless you use it as a vacation. Maybe some USA people deserve to do this but heck, I'm from Europe and I get my fair share of holidays. Also, take any frustration, denial or ignorance about the lies of instant gratification and career success, the ones you might have believed when you bought product X and it didn't just improve your life after clicking "Next", somewhere else. Results and successes come from an enduring effort, which is a fancy word for hard work. They are not a right, a perk, let alone guaranteed (yeah, some vendors and people lied to you, get over it). If you cannot search, study and learn on your own, get out of ICT now. If you need support for every issue or new item and expect someone to be there to help every step of the way, you're in trouble. It takes time, dedication and a great deal of effort to become and stay proficient in IT. Even then you'll know about failure, setbacks, troubles and mistakes. Life & work are not a commercial.

Go to a conference for the big picture, the architecture, the networking with peers, the possibilities and the dreams. Expand your knowledge and views on how to make the pieces work and interact. The focus of the conference is on tying it all together, learning new and better ways, discovering possibilities, which all adds up to yet more stuff to learn and more work to do. At the booths manned by industry experts or representatives, do give feedback about the product and offer to send some more details about certain real problems you might have come across. But be nice and polite, don't be a jerk. Would you feel compelled to help a jerk? Techies are people, really!

Don't run to sessions like a madman without a plan. Know why you are there and how to get what you're after. If this is your first conference, everything will be new and fabulous (I hope). You can't attend enough sessions. You are torn between the choice of sessions and tracks. You're full of new ideas immediately and overwhelmed with even more just after that. So for all you newbies, get your act together and prepare a bit or it will turn into chaos. Write down ideas, insights and possibilities to pursue. What about you conference veterans? Have you been there, done that? Have you seen it all before? Not really, and we all know it. It's all about lifelong learning. Conferences are about being in a stimulating environment where you "marinate" in the professional IT community for a week. Learn from and with the people attending, and not only during sessions. Lunch with them, have a coffee with them. Immerse yourself in this explosion of IT and business. Think about the new stuff, use your imagination, and write down ideas. Cross check your plans. Calibrate your insights.

The role of the conference is to get you thinking about stuff and to give you a chance to talk to each other about that stuff. Interact! It is the essence of a conference. Ask questions both in public and in private. Talk to attendees, to experts, to vendors. See what they are into, what they need, what they have tried, where they succeeded and where they failed. Find out what they have to offer. Talk shop, talk IT life, and make that connection with the others attending in whatever role. There is a wide world out there, much bigger, larger and perhaps tougher than your own little world, that is driven by results and built on efforts. Those folks have professional and business experience in all of the subjects being discussed. Pick their brains! Get some new insights and ideas. Yes, I know some vendors act like 2nd hand car dealers and yes, I've met people who don't know their own products. But that doesn't rule out the need for and benefits of communication and interaction; those were just the wrong people, just move along. Oh, and don't forget to bring a bunch of those extra business cards you got out of the drawer. Also keep in touch with people you meet. Send a follow-up mail, a tweet, a blog mention. That's how you expand your knowledge network. If you get to go to a conference, enjoy it and make sure you arrive early and leave late. There is no value in missing a part of the experience to save some time or some €/$ bills.

I don't implement all the ideas I return home with from conferences. But I have them written down in mails, scraps of paper, txt files, OneNote scribbles, etc. I tend to pour them into a Word document during and after the conference. From that I distill my plans, my vision and my roadmap. I will present those to management, colleagues, partners and customers and offer them solutions based on my perception of what business needs we can satisfy with technology. The stuff I'm working on now was born as ideas 12 to 18 months ago. Those ideas are tested and checked over time, they ripen and then they become plans. I find a conference a great place to do that.