IT Strategies from Windows NT 3.51 to the Cloud Era – Part 2

How do we get strategic IT?

That’s a hard question to answer. I can only give you my take on the subject. It’s far from complete but it addresses the way I look at it and how I try to do my part in achieving this. This is part 2 of a series on IT Strategies. Just some musings. You can find Part 1 here: https://blog.workinghardinit.work/2010/08/02/it-strategies-from-window-nt-3-51-to-the-cloud-era-part-1/

Merely leveling the playing field or a value adding instrument?

Knowing what you have learned already, take a good look at the Total Cost of Ownership (TCO) and the Return on Investment (ROI) of the technology you own. You might wonder if any of those expensive IT systems are ever going to live up to all their potential. Can you easily pinpoint some real and substantial business benefits from your technology investments? We sincerely hope you can. Or do you have the “feeling” that most of that capital computing power is just sitting there idle and is hopelessly underused?

Use IT or lose IT!

Even in this day and age, it never ceases to amaze us how many routine and repetitive tasks are still done manually. IT can do two major things for your business. One is to improve physical operations by making them more efficient and the other is to reduce logical operations by automating as much of them as possible. If IT is not doing that, something is wrong. Why on earth are people reduced to mere (and rather slow and unreliable) biological processing units within the IT systems they work with? This not only wastes productivity but also leads to many avoidable errors that are not only time consuming to clean up but are often untraceable. The point is that you don’t want it to be like this and you need to fix it. This is not a question of wanting your technology to deliver value to your operations but of needing it to do so. Not using your IT’s potential is literally squandering vast amounts of money. No business can afford to do that, so you’ll either have to deal with that issue or risk losing funding for IT, which means you’ll be even worse off. Any chance of ever recuperating the money you invested in technology is gone and funds to compete and get ahead of your competitors can only be dreamt of. This is not a good situation to be in and an even harder one to get out of.

Let’s face the fact that IT very often sucks at delivering value. That’s not technology’s fault directly; it’s how we use it, or rather, abuse it. Business doesn’t know a thing about what IT can or can’t do and what the real cost of that is. IT is often run by IT managers and CIOs who are busier entrenching themselves in vast layers of expensive and often overly complex hardware and software than delivering value. The IT industry itself is a willing accomplice. They really like to sell all that stuff and those services as they love the color of money. They are partners in crime and hide that painful fact behind change management, methodology religion and an air of “we are keeping the sky from falling down”. But how do they explain that it takes 14 days to restore a file server to service, or that it costs 3 to 6 times more for enterprise IT to do something than the market will bear? I’ll give you a hint: partially because of the need to hire consultants to do their jobs. Why does it take over 4 years to consolidate to a new mail system in big government agencies? What happened?

Look, in corporate life a lot of what’s being done is bullshit and exists only to create career opportunities, to inflate one’s importance and (they hope) one’s wages. They build extremely expensive, large, complex systems that are hell to touch, let alone change, improve or replace. People all follow their own time-limited agendas and build up an extensive technology debt that will wreck a business sooner or later. But no one, not in business or in technology, wants to acknowledge that. When you do bring it up and try to evangelize better ways you’re a bit of an annoying person rocking the boat. Hey, what, we don’t need 10 consultants, a 20 person project team and 2 years to add a subdomain to Active Directory? What do you say? Our home-built MIS that took 2 years and a couple of million € could have been bought for € 3,000.00 with 80% of the features? That must be some really important report content that only our system can produce, I bet! You think I’m joking? I hope you think that because you’re all doing it right and are not in an industry that will collapse by doing what I described above.

Fast forward 10 to 15 years. What will “the cloud” have done to your little empire? OK, perhaps you’ll be retired by then, but what if you’re 45 and a fake IT professional, developer or, heaven forbid, a fake consultant? Yeah, you’ll call it age discrimination at the unemployment office, I’m sure, but what good will that really do you? But even then the true fakers will thrive. Never, ever forget that if the organization is large enough, perception outperforms reality and, guess what fakers are experts at? Right! The most “reliable” consultants seem to work for organized crime these days (consigliere). Make a mistake in that field of endeavor or don’t deliver and you find yourself standing on a piece of industrial plastic in the boss’s office looking down the biggest and blackest hole you’ll ever see. And that shiny sliver of light at the end is most likely a Winchester Silvertip® Jacketed Hollow Point. They don’t buy crap services, only results.

Now What?

Now given all this, how the hell do we make IT a strategic asset? Well, let’s first make sure we get a functional IT environment and then go for tactical, operational and strategic value. Just focusing on the strategy is not enough and even dangerous. The operational aspect is very important as well and you really do need both. To achieve strategic IT you really need an IT that’s functional. In the end you also need context, knowledge, skills, hard work and a “getting things done” attitude. Please do not think you can buy this as a product or a service. You can only achieve an IT strategy through hard work and dedication. Trying to use recipes or methodologies without context is doomed to fail. It just doesn’t work. Unfortunately many organizations really do that. Why? Because too many bad consultancy firms are milking those tools for the cash cows they have become.

Look, consultancy firms live by cheap personnel (cheap to them, not to you) that get their jobs done by following guidelines, boilerplate documents and such. They couldn’t survive without them; the methodologies and processes are their lifeline. So when those don’t work they’ll sell you more of the same because that’s all they can handle and know. They don’t really care about your business. Essentially they are what gives methodologies a bad name. Those are used to let mediocre firms with mediocre or even incompetent personnel survive. Stay away from them. They have a hammer and they’ll use that tool everywhere for every job, so basically they are frauds. This often continues way too long because so much money is involved that no one even dares to tell the truth anymore. Have you ever wondered why most over-certified “BIG” corporations are hated and despised by their customers? They are over-organized, with too many weird processes. On paper everything is close to perfect, but in reality it doesn’t work anymore. This is probably the reason why we’re losing so many jobs in the West. We’re too expensive for the limited productivity we have and are burdened by the cost of dysfunctional methodologies. Are the other areas better? Nope, they are cheaper, hungry for a better life and willing to make a real and prolonged effort.

Invest In Solutions & People

Basically what we’re saying is to stop buying technology and instead start investing in solutions. Solutions are built with technology. But there is more to it. Used correctly, technology can be a lot more than just a means to an end. The solutions built with technology must become an instrument that provides operational, tactical and strategic advantages to your organization. This means that IT will then realize its true potential.

So here is where your ICT staff (external or internal) can and must really differ from other commercial ICT providers or personnel. That piece of hardware and the operating system running on it is more to us than just a means to sell even more services, hardware and software. To us, it is the very basis of working solutions to real business issues. That’s the attitude we have towards addressing your problems. This requires knowledge, experience and dedication to finding the best approach for your organization.

This means that often you don’t have to buy yet more or new technology and software to solve problems. The best hidden secret in IT seems to be the value and opportunities that are available within most systems. It’s time for you to find them and be pleasantly surprised by what you already have but are not using. Use the money saved here to differentiate yourself in the market and be competitive. To achieve this you need knowledgeable people who care about the business.

Select your vendors and consultants with care. They should have one goal and that is to help you solve existing problems cost effectively and as a result deliver value to your company. They should not have the goal to sell only products or time to fix – perhaps even only perceived – problems. Think about it, you do not want to buy consulting hours, a dual CPU server or a body to fill an empty cubicle. You want to buy a solution to an existing need. Of course we do work with vendors and other consultants in order to help you, but they have to agree and comply with our approach. If not, that’s a collaboration terminating event. It’s this attitude that helps us provide outstanding services. Yes, we keep up to date on technology advances in our field of expertise, and yes, some of us are certified, but we do not sell you acronyms and titles instead of skills. We are certified because we are experts, not vice versa. Most of all we are interested in learning about your business needs and matching them to satisfying solutions. Good consultants and IT professionals will always seek the best way to help you. They will do this even when this means hiring others to do the job. If not, they are body shops who have only one goal: renting out people at daily rates. Walk away from these. They will never produce the results needed. Likewise, people who just write strategy documents and enterprise architecture designs and practice methodology religion that doesn’t improve or even relate to the realities in the field are nothing more than a waste of money. Those who do that are frauds. Why do I call this a “religion”? Because it only promises things later, when you’ve been a good boy, and when it never delivers you must have sinned against the methodologies? Oh please… enough already! Typical bullshit from people with no real field experience and knowledge, let alone the context to achieve something with methodologies.

A Vision on Realizing Solutions

Today anyone in the world with the right skills, attitude and motivation can provide services that result in high quality information technology solutions for small, medium & large sized businesses around the globe. Fast internet connections in combination with secure remote administration tools have made this possible, especially when you realize that many technology implementations are being replaced by “good enough” services on the internet. What good clients, employees and managers have in common is that they take their automation and computing needs seriously. However, they don’t always have the resources or know-how available to make the most of their investments in technology. By providing the necessary advice, assistance and implementation, technologists can add value to a company instead of delivering just another cost factor that merely levels the playing field. Solutions may range from single server systems, over organization wide infrastructure roll outs, to hybrid cloud implementations, but the principle remains the same. We offer the knowledge and experience needed to make technology work for the business. A properly implemented IT solution can provide tactical, operational and strategic advantages in the market place. It makes you more competitive. To achieve these goals we rely on a number of key elements in any successful professional relationship.

Trust

All long term business relationships are based on mutual trust. Trust is not given away for free; it needs to be earned. You will need to establish a trust relationship between business and ICT so that IT can be your counselor and guide for current and future technology needs.

Integrity

Technology is a tool, a means to an end. Good IT offers to help your company use the tools it has or requires to their fullest potential. The business needs are important and IT should care about finding the right solution for the organization. An IT strategy must show commitment to work according to your needs and budget to find the right level of services and technology for your company. This needs to be done at a pace you can sustain. The goal is to add value to your business, not to extract money from it and waste time performing unnecessary tasks. On the other side, IT advisors must be honest and be able to walk away from organizations and managers who are not capable of being professional and have no real desire to improve. Only time and money can be wasted in such a situation. The money can be replaced. Our time cannot, so it should never go to waste.

Reliability

All technology can and will fail at some point in time. Such events can result in lost data, lost time and lost revenue. Providing a business with a well-designed infrastructure, however, will help you avoid such risks. This, in combination with excellent troubleshooting skills, facilitates recovering from such “disasters”. Keeping IT systems running smoothly requires a proactive approach consisting of maintenance and periodic auditing of the systems. The result of this is the reliability needed to prevent unscheduled down time. In the cloud era the parties responsible for all of that change, but the essence remains the same. Be advised, however, that everything can and will fail. Unbreakable does not exist. The differentiator is how we prevent it and how we deal with it when it occurs.

Attitude

You have a lot of choices when it comes to choosing someone to help you with your computing needs. Experience is one of the distinguishing factors, as is commitment, know-how and the ability to develop a vision about a company’s IT needs. This attitude provides guidance to organizations, so they can enjoy the benefits of well-designed computing systems without draining vast amounts of capital from their financial reserves.

Technology

Understanding technology is paramount to providing a great service. You cannot design and build something you do not understand. The correct mix of hands on experience, knowledge of complementing technologies and design skills, combined with continuous training, is what enables IT Pros to optimize your technology investment.

Isn’t this going to be very or even prohibitively expensive?

This remark always provides an opportunity to convince the person of the real value ICT can add. I have always made a genuine effort to position the IT solutions I design as qualitatively superior in both our approach and in the results we deliver. But that doesn’t mean they are overly expensive. Within our area of expertise, we will always seek to find the best and most cost effective solution that is viable for that particular business need. We will customize that solution to “tailor fit” the environment, but only if and when required. If IT is prohibitively expensive, somebody is not doing their job right.

This, in my opinion, is a great way to provide outstanding services and results, the value of which is reflected in the investment costs. How can you expect strategic IT and its implementation to be cheap? Good value for money, yes, but cheap, no. Now, some companies will never pay the rates and prices needed to achieve this. Sometimes, in many ways, they are totally correct in not doing so. The most likely and best reason is that they do not need our services, i.e. it is just not worth it. For them IT is nothing more than a necessary and valuable commodity. The unfortunate reason is that our services can’t help them. Some simplified examples might help to clarify this. A lawyer who uses his laptop for writing reports and printing them out is not going to pay us 150 euros per hour for setting up his free e-mail in Outlook, and he’s absolutely correct in not doing so. However, an organization that needs an effective, secure and rock solid virtualization solution will appreciate the benefits of good design and is willing to make that investment to achieve results. Until the day arrives that the cloud does that better and more cost effectively.

Costs for ICT and rates for personnel are not dictated by the complexity or the lack of complexity of an organization’s environment. Rates are established based on knowledge and the value IT brings to your business. An environment’s complexity only dictates how much time we will spend designing, implementing and/or supporting it, not what that time or those solutions are worth. In the end you are engaging skills and knowledge, not time. Smart managers and IT architects simply refuse to compete with “low cost, low skill” operators who can only follow generic guidelines (if even that) and are stumped when “it doesn’t work”. But since you are reading this there is most likely a reason why you are looking beyond those players. You are interested in high quality strategic, tactical and operational ICT! So there you have it. It is the combination of skills, aptitudes, work ethic and intellectual capital that enables IT professionals to offer quality services that amount to added value and competitive advantage for your business. In the end, isn’t that what you want for your organization?

Think of any solution like a triangle. The three sides of the triangle have the following characteristics, each of which can be a quality by itself or in combination with the others, depending on the circumstances.

  • Cheap
  • Fast
  • Good

However, be advised that all three sides combined almost never materialize in real life. So you’ll have to pick two of them and, more or less, sacrifice the third. Which combination is needed depends on the issue at hand. Do not fool yourself into thinking that Information Technology is something anyone can do by just reading a book, following some courses or owning a PC. It takes a tremendous amount of time, dedication and sustained effort to become and stay competent and ahead in the world of IT, far beyond a 9 to 5 job. So realistically, if you expect this kind of expertise to be available at lowball rates or for free, perhaps it is time to take another look at the triangle and reevaluate your position on this. This is true in a mainframe, client server, web, Service Oriented Architecture and cloud world. The environment, technologies, tools and solutions change and we have to change with them.

Final Thoughts

It is important for both management and employees alike to realize that choosing a dynamic and scalable IT infrastructure solution in combination with a flexible and integrated development platform is very important. It provides one of the pillars for the successful implementation and support of the business strategy. Today this is no longer an opinion or a choice; it is a fact of doing business.

This means that these choices are as much a part of creating competitive advantage as any other strategic decision. The need to provide long term backward compatibility, support current and new technologies, facilitate agile development and allow easy deployment and maintenance throughout the life cycle is paramount. All of this has to be done in an effective, efficient and affordable manner. These considerations are valid whether we are looking at a single line of business application or an entire infrastructure solution. Do realize solutions are fluid. There is no permanent solution, just the best solution at that moment, for a certain period of time, for that particular situation. That situation might be unique or it might be as common as muck. Much care must be taken not to build up an unsustainable technology debt and to keep working on reducing any non-managed debt present.

To end, never make the mistake of thinking that having a strategy is enough. You need the right people, the skills, the attitude and the guts, where and when they are needed, to execute your strategy. In essence it is never easy and it takes a lot of hard work. Plans and ideas that are not executed are worthless. Beware that acknowledging this must result in more than lip service to this vision. You must act upon it, translate it into plans, and provide leadership and guidance to achieve an IT strategy that will produce results. Money and tools are no substitute for solid skills and motivation. But those are subjects for another discussion. So for now, stop faking, study hard, work hard and build really good, solid IT!


IT Strategies from Windows NT 3.51 to the Cloud Era – Part 1

Because hope don’t float!
This is part 1 of a series on IT Strategies. Just some musings. You can find Part 2 here: https://blog.workinghardinit.work/2010/08/14/it-strategies-from-window-nt-3-51-to-the-cloud-era-part-2/
Not many people I meet in businesses seem to be able to define “ICT strategy” without playing some sort of “Bullshit Bingo”, so I’ll give you my opinion. During my years in IT I’ve read and thought a lot about the value of what we design and build. What you’ll find here stems from well over a decade of reading, thinking, working, discussing and helping to develop IT strategies with colleagues, businesses and consultants whilst exploring ways to deliver value through ICT. One thing I should perhaps add is that I have never been in a sales role, so this is not from an account manager’s perspective. But I do recognize and accept that everyone, whatever his or her function, has a “sales” role in order to be allowed to execute one’s proposals. I owe a debt of gratitude to so many people over the years who have helped shape my vision on IT. There are so many voices and opinions, some I agree with and some I disagree with, that have influenced my ideas, that in the end all of what you read here is a collection of all those opinions combined with my interpretation of them. Part 1 is about where IT strategies fit in and why they exist. Part 2 will address my opinion on how to achieve them.

Introduction

The reason you hear more and more about strategic IT and commodity IT in recent years is due to the attention cloud computing is getting in the media. One of the main forces driving the cloud computing business is economic pressure and the need to provide affordable, scalable IT in a commodity market. Combine this with the “discovery” of business and IT alignment by mainstream management and the “strategic” plans will flow abundantly. They’ll almost certainly throw in an “IT needs to learn it’s here for the business needs”. What does that really mean? That management is not capable of using IT for its business needs and allows money to be wasted. Whose responsibility is that? What if they let the same happen between financials and sales? In my humble opinion mainstream business managers urgently need to make the effort to learn about the realistic use and benefits of IT. A divide between business and IT is a man-made artifact and not something natural; it is a result of management failure. Bar the stereotypical nerds, I see more effort from IT managers & architects to think in business terms than from mainstream management in thinking about using IT as a true competitive differentiator. Once the words economy and competition are in play you start talking strategies, just like the military. That’s not a coincidence. Take away the niceties and business is a non-lethal form of warfare. I guess that’s why “The Art of War” was or is such a popular book in top business circles. Just do an internet search for “The art of war business strategy”. The correct definition of what strategy means is out there in plain sight for all of us to read and learn.

So why is it that when talking about strategies, ICT related or otherwise, you rarely get a solid response that truly addresses the subject? People seem to mistake simple long term planning and goals for strategies. Plans are used to realize strategic goals; they do not define a strategy. A strategy is what you will do to outflank your competitors to gain an advantage. That advantage, in today’s world, means being different and good. It is almost certainly not about being the best. What is best depends too much on the unique situation of every organization, its specific needs and circumstances at that moment in time. It’s indeed all rather fluid and dynamic, so “best” is very time limited. Anyway, you’d better have something that differentiates you from the competition in a positive way. Otherwise there is no compelling reason to become your customer.

Why is having a unique approach and being good at it so important? Being the same as anyone else makes you plain, a commodity that’s readily available. If on top of that your customer service sucks, you’ll start losing customers as no one is willing to pay for that. This drives down prices even more and robs you of all potential benefits of any unique selling points you might have. That is far from competitive. Unless your aim is to become king of low priced, bulk delivery for a product that doesn’t require any services whatsoever, that’s a bad strategy, and even then you will have to be better than your competitors in that particular playing field. You have to stand out somehow.

Also, a strategy has to be correct and honest. False assumptions, self-deception, faking and lying as in “methodology religion” will make you lose all professional credibility with your personnel and investors. Once you’ve sunk to that level there is little or no hope of ever recovering from that position. You really cannot get away with faking a strategy.

So what is “my” strategy as an infrastructure guy in a business world to make sure that we are different and good? Well, you already read the appetizer, now read on to find out. And believe me, you need to find out! Way too many business & IT strategies are esoteric boardroom level documents that have little or no correlation with the reality in the trenches. They are made to have some checkboxes ticked on an audit report or are actually just plans with no strategic content whatsoever. Sometimes you really wonder why they even bother making them. At least they could have avoided wasting the time and effort.

Defining how the ICT strategy relates to the business strategy

Before we can define what makes a good ICT strategy we need to talk business. It needs to be a part of the strategic business plan or you shouldn’t even bother having one. Oh, and by the way, if you don’t have a high quality business plan made by and supported in actions by knowledgeable, passionate, driven, motivated and hardworking management, walk away. No good ICT strategy will ever come from such a situation. Buying technology cannot fix organizational problems. Please repeat this last sentence at least three times out loud. You need to hear it and let the message sink in! In such a situation having an IT strategy is the least of your problems.

We already stated that a strategy is about distinguishing you from your competitors. This can mean many different things depending on the circumstances. Better products, the same product but with better services, cheaper but good enough for its purpose, etc. Be brutally realistic. If what you do does not set you apart from your competition in a positive way, you have no strategy or have been ill advised on what constitutes a strategy. The fact that “no one else does what we do” is not a strategy; it will not last! The fact that people are obligated to use your services by law is not a strategy either. It might be a short term advantage, but it creates no goodwill with your customers, especially not if your services or products are mediocre or bad. And please be more than just the odd one out; sure, you’re different, but that’s not the kind of different you’re looking for.

You must also realize that strategies have a limited shelf life. Sooner or later your competitors will realize what your strategy is and if it works they will start copying it. More often than not they will add some improvements, having the benefit of 20/20 vision through hindsight. This means that, over time, what was once a distinguishing solution that gave you a competitive advantage becomes a mere mainstream commodity. Now please realize that being a commodity does not mean irrelevant or useless. Power, heating, fuel, telephones, e-mail, storage, file servers… are all commodities we cannot do without! But in the commodity sector you will compete by being different in pricing, quality of services and added value. Only when technology becomes a blast from the past through significant advances or changes in science does it become economically useless. Think steam engines… but… retro does exist and comebacks do occur. Windmills anyone?

I know the cloud hype recuperates just about everything that can be delivered over the internet and is service oriented, but please realize that not all commodities are or will be services.

[Figure: strategic IT solutions becoming commodities over time]

Given the fact that strategies are far from long lasting entities, what does this mean for an IT strategy? Simply put: speed and agility are of the essence. We must be capable of moving fast and decisively. There is no time anymore for years of thinking, contemplating and testing. The long term vision must still exist and it is extremely important, but it is not the same as a strategy. To reap the benefits of long term thinking one needs to survive long enough to be around! 15 year long term strategies are doomed. These are daydreams. A bit like the Maginot Line, which the French built over more than a decade but which was utterly useless as its concept was out of date by the time World War II broke out. Long term visions are realized through several sequential and adaptive strategies. As you can see in the figure, with time a strategic solution becomes a commodity and a part of the IT infrastructure that needs to be maintained. A good strategy takes this into account so that the strategic solution can evolve into an operational and tactical commodity instead of a very expensive drain on resources.

One could say that all IT can be found somewhere between the following states:

· Strategic: Technology that differentiates us in support of (new) business strategies. This is what makes us more competitive and adds value, because it’s unique and innovative.

· Commodity: Stuff we need but that no longer adds competitive advantage. It does provide tactical or operational benefits and you can’t do without it. Note that strategic commodities exist; oil is a prime example, water another one. So commodity is not a synonym for low value, it just doesn’t add value in itself or no longer serves as a differentiator.

Everything else, whether it is cheap or expensive, subsidized or self-sustaining, is frankly a technology hobby (not in the picture). Where are the value and the profit? Management should avoid this. The best thing that can happen here is that you actually learn something building & maintaining it or use it as a lab for creative innovation. But then it’s not a hobby anymore… that’s a dream job for engineers.

Tell me what an ICT strategy is already!

An ICT strategy supports the business initiatives that provide a competitive advantage to an organization in such a way that it does not become a pain in the ass over time. Only incur technology debt where and when needed and manage it carefully. But how the hell does that materialize in reality, you must be wondering by now? Well, it is the combination of creatively building, deploying, operating and using solutions that deliver value by making you more competitive. Solutions can be realized using standard software and hardware, with custom built applications or a combination of both. Whatever the case… the solution requires very knowledgeable people, serious skill sets, a mind driven by curiosity and the need for results.

People buy results, not services or efforts. This is one of the big mistakes in the thinking of many modern so-called service driven companies. They fail to provide good services, let alone results. They are in effect just low cost / low value operators. If you provide services they need to be there to produce results. Otherwise you are, for all practical intents and purposes, lying, which will come back to bite you. Take note however that too much service is financial suicide. Don’t cater to individual and unique needs unless that is your core business.

Since solutions are custom built on development platforms and infrastructure, it is critical to realize that the choice of platforms and infrastructures can mean success or failure for an organization since it directly relates to its ability to compete. Yes, once again, the reason for having a strategy in the first place!

The most common issue we see when dealing with an IT strategy is that many organizations have no clear picture of what they do, how and why. They just seem to do “stuff” and expect, or rather hope, that hard work and effort will help them realize their goals. But without clear and well defined goals there is no way of achieving them. Efforts and hard work alone will not produce results. Customers do not pay for hard work; they do not reward efforts. That was something that worked in kindergarten but fails in a business environment. Remember, customers pay for results. You cannot buy a product that will deliver these out of the box.

So what must an ICT strategy achieve?

Since we have seen that strategic solutions eventually become commodities, any combination of infrastructure, platforms and solutions must work well during its entire life cycle. Decisions that focus only on strategy might lead to the implementation of the latest and greatest technology. This can lead to very diverse, esoteric and heterogeneous environments with very high integration & support costs. It also incurs the cost of finding, retaining and maintaining good developers and engineers with knowledge of such systems.

On the other side of the equation, one cannot worry only about keeping short- and long-term support costs low. That will lead to missing out on the business benefits new technologies can bring. In the end, finding the right balance between these two is very important, and failing to do so will be very costly in financial repercussions, lost opportunities and failed projects. This ends in the downfall of the organization since it becomes irrelevant in the market and has no more means to support itself.

Custom built solutions do not exist in isolation. They need to run on an infrastructure, connect to other systems, be able to be secured, etc. This is called integration and if this is overlooked it can become a financial burden that negates the added value of an IT solution and makes it a cost instead of an asset. For example, “best of breed” has often failed in the sense that it did not deliver enough value to justify the high cost of acquisition, maintenance and integration. The real killer here is the effort, and thus cost, involved in integrating all of these. Even if you do get it to work it is often in a way that negates good practices, reduces security and incurs a high, cumbersome administrative overhead which is error prone and expensive. It does make a good revenue stream for “consultants”, however.

Pollution of the Gene Pool: a Real Life “FTP over SSL” Story

Imagine you get asked to implement a secure temporary data exchange solution for known and authenticated clients as fast as possible. You’re told to use what’s available already, so no programming, buying products or using services. The data size can be a few KB to hundreds of megabytes, or even more. At that moment they already used FTP, both anonymous and with clear text authentication, but obviously that’s very insecure. You’re told they need the solution a.s.a.p., meaning by the end of the week. So what do you do? You turn to FTP over SSL in Windows 2008 (IIS 7.0, Release To Web -RTW- download) or Windows 2008 R2 (IIS 7.5, integrated), as the one thing the company did allow for was the cost of a commercial SSL certificate and they had Windows 2008. If you want to read up on configuring that please have a look at the following entries http://learn.iis.net/page.aspx/304/using-ftp-over-ssl/ and http://learn.iis.net/page.aspx/309/configuring-ftp-firewall-settings/ where you’ll find lots of practical guidance.

You set it all up and test it: user folder isolation, NTFS permissions regulated with domain groups, virtual directory links used for common data folders between users, etc. It all looks pretty good & is very cost effective. Customers start using it and if they have a problem they are helped out by the service desk. Good, mission accomplished you’d think. Except for someone who is not having any of that insecure, firewall breaching FTP over SSL and starts kicking and screaming. The gross injustice of being forced into opening some ports in their firewall is unacceptable. That same someone has been using clear text authentication for FTP downloads for many years and never even blinked at that, but has now discovered “security”.

FTP in a Security Conscious World

We live, for all practical purposes, in a NAT/PAT & firewall world. These things became necessities of life after the FTP protocol was invented. You see, IPv4 has come a long way since its creation, as have the protocols used over it. But originally, by design, it was not meant to provide security, just communications. Security in those early days was armed military personnel guarding physical buildings where you had access to the network; if you didn’t belong there they’d just shoot you. As a result TCP/IP is a lot like a flower power love child living in a very secure universe where everyone loves everyone. Fast forward 30 years and that universe looks more like something out of a post-apocalyptic movie like Doomsday or Mad Max. If you don’t have security you become road kill, and rather fast. So we built security on top of TCP/IP and retrofitted it to the stack (a lot of the security in IPv6 was back ported to IPv4). We also invented firewalls acting like the walls of medieval castles. To add some more complexity, there was not enough IPv4 love (i.e. public IP addresses) to go around, which makes them expensive and/or unavailable. Network Address Translation came to the rescue. So we ended up where we are today with hundreds of millions of private IP range networks that are connected to the internet through NAT/PAT and are protected by firewalls. The size of these private networks ranges from huge corporate entities in the Fortune 500 list to all those *DSL & cable modem/routers in our homes.

All of this makes the FTP protocol go “BOINK”. FTP needs two connections and quite liberal settings to work. But as the security story above indicates the internet world has moved from free love to the AIDS era so that doesn’t fly anymore. We need and have protection. But we also need to make FTP work.

Let’s first look at the basics. FTP client software needs two connections between the client and the server. One is the control channel (port 21 server side) the other is the data channel (port 20 server side). On the client side dynamic ports are used (1024-65535). These two connections present a problem for firewalls.

So port 21 needs to be allowed through the firewall on the FTP server side. That’s pretty easy, but it’s not enough. Port 21 is the control channel that we use to connect, authenticate and even delete and create directories if you have the correct file system permissions. To view and browse/traverse folder structures and to exchange data we need that data channel to pass through the firewall as well. That’s a dynamic port on the client that the server needs to connect to from port 20. Firewall admins and dynamic ports don’t get along very well. You can’t say “open range 1024 to 65535 for me, will you?” to firewall administrators without being escorted out of the building by physical security people.

But still FTP seems to work, so how does this happen? For that purpose a lot of firewall/NAT devices make life a bit more secure and a lot easier by proactively looking at the network traffic for FTP packets and opening the required dynamic port automatically for the duration of the connection. This is called stateful FTP inspection. It is the default behavior of a lot of SOHO firewall/NAT devices, so most people don’t even realize it is happening. You do not need to define rules that punch holes in the firewall. Instead the firewall punches them transparently when needed for FTP traffic. This is a risk as it happens without the users even being aware of it, let alone knowing what ports are being used. This isn’t very pretty but it works quite well.

Here’s an illustration of Active FTP in action

[Figure: Active FTP connection flow]

You see, initially there was only Active FTP, which is very client side firewall unfriendly because it means opening up dynamic ports on the client side for traffic initiated by a remote FTP server. This needed to be fixed. That fix is Passive FTP, described in RFC 1579, “Firewall-Friendly FTP”. Here it is the server that listens passively on a dynamic port and the client connects actively to that port. So Passive FTP makes the automatic punching of holes for incoming FTP traffic in the firewall/NAT devices more secure on the client side. With passive FTP the server does not initiate the data connection, the client does. When the client contacts the FTP server on port 21 it gets a response, then the client asks for passive FTP using the PASV command. The FTP server responds by setting up a dynamic port to which the client can connect and announces that port to the client in its “227 Entering Passive Mode” reply. Outgoing traffic initiated on the client, from a dynamic port to a port on the FTP server, is more firewall friendly (i.e. more secure) for the clients and thus more easily accepted by the security administrators. On the server side it is somewhat less secure.

[Figure: Passive FTP connection flow]

Be aware that there are FTP clients which you need to explicitly configure for passive FTP (internet browsers, basic FTP client software). Some old or crappy clients don’t even support it, but that should be rare nowadays. When the client software automatically tries both active and passive, the user often doesn’t even know what’s being used, which can lead to some confusion. Also keep in mind that often multiple firewalls are involved, both on the hosts and at the edge of both the client and FTP server networks, and they all need the proper configuration.
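For scripted transfers you can take the guesswork out of this by forcing passive mode explicitly in the client. Below is a minimal sketch using Python’s standard ftplib module; the host name and credentials are placeholders for illustration, not anything from the environment described above.

import ftplib

# Hypothetical host and credentials, for illustration only.
ftp = ftplib.FTP()
ftp.connect("ftp.example.com", 21)   # control channel on port 21
ftp.login("someuser", "somepassword")

# Force passive mode (ftplib defaults to this anyway): the client opens
# the data connection to a port announced by the server, instead of the
# server connecting back to a dynamic port on the client.
ftp.set_pasv(True)

ftp.retrlines("LIST")                # the directory listing travels over the data channel
ftp.quit()

Switching to set_pasv(False) and watching the firewall logs is a quick way to demonstrate why active FTP gets blocked on locked-down client networks.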

As an example of client side stuff to keep in mind: Configuring Internet Explorer to use Passive FTP and making sure ftp can also be used in Windows Explorer.

[Screenshots: Internet Explorer advanced settings enabling passive FTP and FTP folder view in Windows Explorer]

Improving FTP Security

One of the ways to reduce the number of ports that are used and as a result must be opened on the firewalls involved is to use a small predefined range of dynamic ports. Good FTP servers allow for this and so do IIS 7 and IIS 7.5. This reduces the number of ports to be allowed through and thus the conflicts with the security people enormously.

Now when we use FTP over SSL it becomes a practical necessity to use a small pre-defined range of dynamic ports. Snooping around in the packets to figure out which dynamic ports need to be opened just doesn’t work anymore because the traffic is encrypted. Opening thousands of ports is not an option; those would become targets of attacks. Another hiccup you can trip over is that some firewalls by default block SSL/TLS traffic on any port other than 443 (HTTPS).

So what do we need for FTP over SSL/TLS:

· Use Passive FTP and port 21 (Explicit SSL) or 990 (Implicit SSL)

· Select a small range of dynamic ports to define on the firewall and communicate that to your clients. This range needs to be opened in the outgoing rules of the clients that want to connect and in the incoming/outgoing rules on the server side. Both the FTP server and the FTP clients need to respect this range.

· Use an FTP client that supports FTP over TLS. I used passive FTP with Explicit SSL to maintain the default port 21 for the control channel. If the client doesn’t negotiate data encryption we refuse the connection. See FTPS on http://en.wikipedia.org/wiki/FTPS for more information on this. A minimal client-side sketch follows after this list.

· Buy a commercial SSL certificate from a trusted source (VeriSign, Comodo, GoDaddy, Thawte, Entrust, …)
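To give an idea of what this looks like from the client side, here is a minimal sketch of explicit FTP over TLS using Python’s standard ftplib; the host, user and password are placeholders, and it assumes a server set up as described above (Explicit SSL on port 21, encrypted data channel required).

import ftplib

# Placeholders, for illustration only.
ftps = ftplib.FTP_TLS()
ftps.connect("ftp.example.com", 21)  # Explicit SSL: start on the normal control port
ftps.auth()                          # upgrade the control channel to TLS (AUTH TLS)
ftps.login("someuser", "somepassword")
ftps.prot_p()                        # encrypt the data channel as well (PROT P)

ftps.retrlines("LIST")               # the listing now travels over the protected data channel
ftps.quit()

If the server requires SSL on the data channel, a client that skips the PROT P step simply gets its data transfers refused, which is exactly the behavior described above.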

By using a commercial SSL certificate that securely identifies and verifies the FTP server, by limiting the communication through the firewall to some well-defined ports and by only allowing that traffic between a limited number of hosts, the risks are reduced immensely. The risks avoided are connecting to falsified hosts, password sniffing and data theft. The traffic that is allowed is far less risky and dangerous than anonymous or, what they used to do and allow, clear text authentication to non-verified servers on the internet. But still some people insisted that the FTP over SSL solution was introducing a serious security risk. Really, and this isn’t the case with passive FTP without SSL? Sure it is, you just don’t realize that it happens and allow FTP traffic to a wide range of dynamic ports and unknown hosts. So frankly, crying wolf about properly configured FTP over SSL is like using “coitus interruptus” for birth control because you’ve read that condoms are not 100% failsafe. You’ll end up pregnant and infected with AIDS. That kind of logic is pure gene pool pollution. It’s also proof of an old saying: “never argue with an idiot, they drag you down to their level and beat you with experience”.

Beware of NAT/PAT

As we mentioned in the beginning NAT has its own issues to deal with, so we still have to touch on the subject of NAT/PAT with FTP servers. Let’s first look at what is needed to make this work. You have already seen how the basics of passive FTP data connection work. The client sends a PASV command and the server responds by entering passive mode and telling the client what port to use.

Now with NAT/PAT devices the IP address needs to be swapped around. To do this, these devices sniff the network traffic for the PASV command to find out what port is used and turn the FTP server response from “227 Ok, Entering Passive Mode (192,168,1,32,203,8)” into “227 Ok, Entering Passive Mode (193,211,10,27,203,8)”.

As you can see, the private IP address (the first four numbers) is swapped for the public IP address on which the FTP server is reachable, while the port to use is retained. The last two numbers encode that port number as follows: 203*256+8 = 51976. When the client connects, the reverse process takes place and the public IP is swapped for the private one.

[Figure: NAT/PAT rewriting of the passive FTP reply]
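For those who want to check the arithmetic, here is a small sketch that parses a 227 reply the way a client (or a NAT helper) would and computes the data port; the reply string is simply the example used above.

import re

# Example 227 reply as sent by the FTP server behind NAT.
reply = "227 Ok, Entering Passive Mode (192,168,1,32,203,8)"

# The six comma-separated numbers are the four octets of the IP address
# followed by the high and low byte of the data port.
numbers = re.search(r"\((\d+,\d+,\d+,\d+,\d+,\d+)\)", reply).group(1).split(",")
ip = ".".join(numbers[:4])                      # 192.168.1.32 (the private address)
port = int(numbers[4]) * 256 + int(numbers[5])  # 203 * 256 + 8 = 51976

print(ip, port)  # a NAT/PAT device rewrites the IP part to the public address, the port stays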

You can already see where this is going with SSL. The NAT/PAT device cannot sniff the traffic for the PASV & PORT commands to see on what dynamic port the client should establish the data channel, and due to the encryption it also cannot alter the PASV reply to swap around the IP addresses.

The best solution to this is to specify a firewall helper address for passive FTP which we can set to the public IP address of our FTP Server. Your FTP Server must support this; you’ll find that IIS 7.0 and IIS 7.5 do.

Other possible solutions and workaround are:

· FTP clients that “guess” the address to use when the IP address in the PASV reply doesn’t work (that would be an internal private range IP address). They then try to use the public IP address to establish the connection, which can work as the chance is that it is the public IP address of the FTP server or of the NAT/PAT device. No guarantees are given that this will work.

· NAT/PAT devices sometimes allow for specified ranges to be forwarded to a specific IP address. So you could configure this to be the case for the small range of dynamic ports you defined for Passive FTP.

· Some FTP servers support the EPSV command (Extended Passive Mode), which only sends the port; the IP address used is the one from the control connection.

Be Mindful of Load Balancing on Server and/or Client Side

If load balancers are in play we must make sure that the communication always goes via the same node and IP address when using SSL, or you’ll break SSL. If multiple IP addresses are used to route certain traffic via a certain device, make sure the FTP client doesn’t switch to another IP address for the data connection, as this will fail. Both control and data channels must use the same IP address or passive FTP will fail, even without using SSL. Also don’t forget that some customers use load balancers to route traffic based on purpose, cost, redundancy, etc. So this is also a concern on the client side. In the IIS log you’ll see that it complains about IP addresses that do not match. I’ve had this happen at 2 customer sites, which were easily fixed, but it took some intervention by their IT staff. Luckily they both had a competent SMB IT consulting firm looking after their infrastructure.

Table with FTP risks and mitigations

RISK | MITIGATION | RESULT
Server connects to client | Use passive FTP | Client initiates connection
Dynamic ports in use | Select smaller fixed range of ports | Less ports to open on firewall
Server not verified | Use commercial SSL certificate | Server can be verified
Authentication not encrypted | Use SSL for authentication | Authentication encrypted
Data not encrypted | Use SSL for data transport | Data transport encrypted
Connections from & to unknown hosts | Allow only trusted clients and/or servers | No more FTP from/to any host

Direct Access Step By Step Guide Version 1.2 released

I’m about to start work on a Windows 2008 R2 / Windows 7 Direct Access project and while gathering some resources (I played with it in the lab last fall) I noticed the Step by Step guide has been updated to version 1.2 which was published on June 18th 2010. It’s a great kick start for demoing Direct Access in a lab for management. Grab it here. http://www.microsoft.com/downloads/en/confirmation.aspx?familyId=8d47ed5f-d217-4d84-b698-f39360d82fac&displayLang=en. If you’re hooked and need more info, check out the Direct Access pages on TechNet: http://technet.microsoft.com/en-us/network/dd420463.aspx

Some people complain Direct Access is (overly) complicated. Well, it’s not a simple wizard you can run or some SOHO NAT device that you plug in, but come on people. We’re IT Pros. We did and do more complicated stuff than that. As a matter of fact I remember some feedback John Craddock got last year at Tech Ed Europe (2009). Some consultancy firm employees told him he should not make it look that easy. Organizations need consultancy to get it right. Really? Some will, some won’t. I have nothing against consulting, when done right and for the right reasons. I even consult myself from time to time with partners who need a helping hand. But take note that the world does run on people, and consultants are people (really!). What they can learn, you can learn. Just put in the effort. So go have fun setting up Direct Access and giving your road warriors and IT Pros some bidirectional and transparent connectivity to company resources. To me Direct Access was one of the big selling points for Windows 7 / Windows 2008 R2. Better together indeed 🙂