Attending the Global MVP Summit 2019

This week I registered for the Global MVP Summit 2019. My flights are booked, as is the hotel. The train tickets to and from the airport can still wait a bit. I am already looking forward to it. Getting out of the office to interact with so many people with various backgrounds and roles is a great way to evaluate where you stand, what you do and where you are going. That’s why I also attend and present at conferences and am happy to contribute to community programs like Microsoft Most Valuable Professional, Veeam Vanguard, DELL Rockstar, etc. It is time and money well spent.

Being a Microsoft MVP is a great experience. You get to give and receive feedback on Microsoft’s technology and gain a better understanding of the why.

The experience

I for one can say that arriving in Bellevue and at the Redmond campus has always been an educational, stimulating experience. There is always a bit of a buzz and a focus. They’ll gladly pull you into a room with a few program managers to discuss issues, ideas or concepts you brought up and, in some cases, you’ll see the results of that discussion later that year. It is cool and satisfying to help make something a little bit better with our contributions.

This doesn’t mean we agree on everything, far from it, but discussion is good, as challenging things helps find the strengths and weaknesses. There is a reason the words “real-world knowledge”, “independent”, “expertise” and “passion” are used to describe what MVPs do and are. As one manager puts it, “MVPs help keep us honest”. It is one way of preventing tunnel vision and I think this might be the most important part for Microsoft. They don’t need a couple of fanboys showing up, but people with feedback and opinions, and real-world, hands-on experience.

I realize that I do not get to set the course at Microsoft or determine its business plans and strategies. Far from it. But you are a valued contributor, a real-world bio-indicator of how well they are doing in the global IT ecosystem. In that week we work together and learn from each other. I feel appreciated.

Even when I drop in at their offices when passing through Redmond, I always get a warm welcome. Many of my own employers or managers have never made me feel that appreciated and welcome. We can learn something from that. Appreciate people who dare tell you what you need to hear, not what you want to hear. Whatever you do after that feedback is fine, but at least genuinely listen and evaluate it.

Conclusion

I think it is all of the above that keeps me coming back to the MVP Summit. Sure, it costs me: vacation days, travel expenses, out-of-pocket expenses. All costs that I cannot declare or recuperate. I’m traveling for 20 hours inbound and outbound but, with or without my employer’s support, I am going. It is an investment in myself and you cannot let others decide on everything you learn or do. Remain valuable, remain independent and don’t be afraid to do the right thing. With or without a sponsor or your employer’s consent, go! Go and be all you can be.

Kemp LoadMaster template import issue: Command serverinit needs 1 parameter(s)

Introduction

The error Kemp LoadMaster template import issue: Command serverinit needs 1 parameter(s) was encountered during a migration project I was involved in recently. The job was to migrate the virtual services of an aging Kemp LoadMaster HA solution (LM-2200, 32-bit) to newer and more capable versions (running 7.2.43, 64-bit). Even more important, to migrate to a version that is still in support. The LM-2200 series is, like many older models, End of Life (EOL). Kemp still delivers important fixes for critical security issues, as it did this year (7.1.35.5 and 7.1.35.6), but that is the extent of long-term support (LTS). This is not bad; these have been supported for a very long time, but the aging 32-bit hardware has reached the point where replacing it is the only right thing to do.

With Kemp we can select new hardware, virtual machines or appliances in the cloud. So on-premises, hybrid and public cloud needs can be taken care of with the same familiar load balancer / application delivery controller (ADC).

One technique to migrate Kemp workloads is to export the virtual services to templates and use these to configure the new solution. It was during that process that we ran into the error: Kemp LoadMaster template import issue: Command serverinit needs 1 parameter(s)

Kemp LoadMaster template import issue: Command serverinit needs 1 parameter(s)

One of the applications is mission critical and has many different virtual services: FTP/FTPS/HTTP/HTTPS, as well as a bunch of TCP/UDP connectivity over ports that are very application- and industry-specific. A dozen virtual services, all on the same IP address with different ports and configuration needs.

We exported all of them to templates, which we then used to recreate them on the replacement ADC. While doing so we used a different IP address for these newly created virtual services for testing purposes. This test IP address is changed to the original production IP when we are ready to make the move and after disabling the old virtual services. That also allows for a quick exit / rollback if needed.

The process of creating a new virtual service still requires a little work, such as defining a gateway, SSL certificates, adding real servers, … but the bulk of the work is done for you.

But with certain templates we got an error: Syntax error detected in template file: Command serverinit needs 1 parameter(s)

[Screenshot: the template import error message]

When looking at the template in a text editor we see the following:

[Screenshot: the exported template opened in a text editor]
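
The offending directive boils down to a command without its value. As a mock excerpt (illustrative only, not a verbatim copy of a real template, which contains many more lines), it looks like this:

    serverinit

The import parser expects one value after the command, which is exactly what the “needs 1 parameter(s)” message tells us.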

Clearly “serverinit” does not have a parameter set. So it probably needs one, to specify which type of Server Initiating Protocol needs to be set for our generic service.

[Screenshot: the Server Initiating Protocol setting on the virtual service]

Finding a solution

We need to find out what parameter goes with our setting and add it to the template. Finding that value is easy enough, via educated trial and error. On a test virtual service on the current-version ADC we set the Server Initiating Protocol to our value (other server initiating) and then export this virtual service to a template. We look at the template and note the number. That way we map which server initiating protocol corresponds to which value. In our case other server initiating is “3”.

[Screenshot: the exported test template showing serverinit 3]

We edit our exported templates to set the correct parameter value we found this way and try the import again.

[Screenshot: the edited template with the parameter value added]
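
With a dozen virtual services and a pile of exported templates, patching them by hand gets tedious. Below is a minimal Python sketch of how one could patch them in bulk; the folder path, the .tmpl extension and the exact bare-directive format are assumptions based on what our templates looked like, so verify against your own files before running anything like this.

    from pathlib import Path

    # Hypothetical folder holding the exported templates; adjust to your setup.
    TEMPLATE_DIR = Path(r"C:\KempTemplates")
    # "3" is the value that mapped to "other server initiating" in our case.
    SERVERINIT_VALUE = "3"

    # The ".tmpl" extension is an assumption; use whatever your exports have.
    for template in TEMPLATE_DIR.glob("*.tmpl"):
        fixed_lines = []
        for line in template.read_text().splitlines():
            # A bare "serverinit" without a parameter is what trips the import;
            # append the value we discovered via the test export.
            if line.strip() == "serverinit":
                line = f"serverinit {SERVERINIT_VALUE}"
            fixed_lines.append(line)
        template.write_text("\n".join(fixed_lines) + "\n")
        print(f"patched {template.name}")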

That’s all we needed to do to get these exported templates imported with no further issues.

[Screenshots: the edited templates importing successfully]

All that’s left to do is to finish configuring the virtual service and continue our migration.

Conclusion

Don’t panic. Many problems you encounter have a solution, workaround or fix. Maybe this will help someone out there. Take care and until next time.

Upgrading to DELLEMC Unisphere Central for SC Series

To prepare for rolling out SCOS 7.3 (see my blog post SC Series SCOS 7.3 for more information on this version) we upgraded our Dell Storage Manager Data Collector and DELL Storage Manager Client to 18.1.10.171. I am happy to report that it went flawlessly. This means we are ready to work with CoPilot and will upgrade our SANs over the next week. That is always a phased rollout, to minimize risk.

Upgrading to 18.1.10.171 actually means we are upgrading to DELLEMC Unisphere Central for SC Series, which was announced as part of the SCOS 7.3 upgrade benefits.

The upgrade process itself is straightforward and isn’t different from what we are used to. First you upgrade the Storage Manager Data Collector and then the Storage Manager Client. If you have a remote Storage Manager Data Collector you must then upgrade that one as well.

Make sure you have a successful backup and create a checkpoint before you start the upgrade. That way you also have an easy exit plan when things go south.

Upgrading the Storage Manager Data Collector & Client

This needs to be done first. It can take a while, so be patient. Run the Storage Manager Data Collector 18.1.10.171.exe with elevated permissions. It unpacks and asks you to select a language.

[Screenshot: language selection]

Click OK to continue and just follow the wizard.

[Screenshot: the upgrade wizard]

It will ask you to confirm you want to upgrade.

[Screenshot: the upgrade confirmation prompt]

Click “Yes” and follow the wizard.

[Screenshot: the wizard, ready to start the upgrade]

Click “Next” to kick off the upgrade and relax.

[Screenshot: the upgrade running]

The wizard will provide you with plenty of feedback on what it is doing along the way.

[Screenshot: upgrade progress feedback]

The final step after the upgrade is to start the Data Collector service.

[Screenshot: starting the Data Collector service]

Starting the Data Collector service can take quite a while. Be patient. When it’s done the wizard will inform you of this.

[Screenshot: the wizard reporting that the upgrade is complete]
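
If you would rather not stare at the wizard while the service starts, you can poll the service state yourself. Here is a small sketch, assuming Python is available on the Data Collector host; the service name below is a guess on my part, so check services.msc for the actual name.

    import subprocess
    import time

    # Hypothetical service name; verify the real one in services.msc.
    SERVICE_NAME = "Dell Storage Manager Data Collector"

    while True:
        # "sc query" reports the Windows service state; we look for RUNNING.
        result = subprocess.run(["sc", "query", SERVICE_NAME],
                                capture_output=True, text=True)
        if "RUNNING" in result.stdout:
            print(f"{SERVICE_NAME} is running.")
            break
        print("Not running yet, checking again in 15 seconds...")
        time.sleep(15)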

Click “Finish” to close the installer.

On your desktop you’ll notice that you now have an icon called DELL EMC Unisphere Central. This indicates that storage management for the DELL EMC offerings is converging.

[Screenshot: the DELL EMC Unisphere Central desktop icon]

Do note that if you have a remote Storage Manager Data Collector you must now upgrade that one as well. Do NOT forget to keep both deployments at the same software level.

You are now ready to upgrade the Storage Manager Client. Run the installer with elevated permissions.

[Screenshot: the Storage Manager Client installer]

Just follow the wizard; normally this goes really fast and that’s it. You can log into the new client and see that the GUI is very familiar to anyone already using it.

What is new is the look and feel of DELLEMC Unisphere Central for SC Series. It’s not your father’s data collector anymore.

[Screenshot: the new DELLEMC Unisphere Central for SC Series interface]

We’ll talk about DELLEMC Unisphere Central for SC Series later, when we have had a chance to work with it some more in real life.

Nuclear data waste

Introduction

Nowadays everyone seems to be heading to the hills to try and cash in on the new gold rush: data! You have all heard that data is the new gold. Some call it the new oil, if that is their favorite fantasy; that’s all good. But there are drawbacks to this.

The cost of data and data waste

Like any resource you mine, it comes with associated costs. A cost that has to be covered by the value you derive from it. That value has to exceed the money spent to gather, process and consume it. That can be an expensive business.

On top of that you have to deal with “the waste” it creates as a byproduct. Waste can be toxic. Mining data tends to produce nuclear data waste, the bad kind where “safe levels” are hard to determine. In the rush to grab the gold many forget a couple of important lessons from history. We should know by now that we need to act proactively to avoid waste. That is the cheapest option in the long run and mitigates many of the risks. We should also know that not all data is gold; some of it just glitters, but it isn’t valuable. Fool’s data, like fool’s gold, is essentially worthless no matter how much money you have spent. Even worse, it still produces a nuclear data waste asset that can get you into (legal) trouble.

Data storage and backups

How much data do you need to get to the gold, and at what cost? Storage capabilities as well as storage capacity grow fast and costs seem under control for now. But will this last forever? And even if so, what’s the ratio of data gold versus raw data stored? Can we improve this ratio? Because even when things are cheap, why even do it if it is not needed?

Protecting the data and the waste

And then there is the cost of protecting that data, as well as the governance around it. The sad reality with data is that once you have it, the probability that it will get you into trouble is real. Data, sooner or later, will get lost, misplaced, sold, hacked, leaked, … it’s almost guaranteed. Ask any real InfoSec professional (not the standard-issue, policy-quoting security officers, those are just window dressing) and they will open your eyes to the reality of the risks. It’s very sobering.

The gold rush

As with any hype or gold rush, we can avoid costly mistakes by looking at history. Think about who benefited and who lost out. Think about why this happened and how. Can you see any parallels?

  • Many people are drawn to the data gold fields. Very few strike a gold vein.
  • There is a lot of money to be made selling the tools, supplies and gear to mine the data, process it, store and protect it.
  • Gathering raw data and processing it can be highly toxic.
  • Storing and protecting the gold is expensive and hard.

Let’s dive a bit deeper into these issues, what they mean and how they materialize. They all have one thing in common for sure: the fear of missing out is one of the driving factors.

Gold diggers

The reality is that many people who now become “data scientists” are not all highly skilled mathematicians and experts at statistical analysis. It’s a new hype, just like OLAP tools and data mining were before. We now have BI, big data and data science. That’s where the gold can be found, so that’s where gold diggers flock to. Some have the skills, abilities and luck to derive wealth from that. Most will just have a job digging.

Pickaxes

There is a lot more data than there is science in the hype created around data scientists. Data scientists should be great at math and statistics. Those are fields of human endeavor that do not scale well. They are not even popular. Attaching “scientist” to something doesn’t make it a science. Be sure of the quality of your gold and make sure it is not fool’s gold. But the gold rush is on. There’s money to be made. In an era where science is viewed by many as “an opinion”, the urge to derive some credibility from adding “science” to any endeavor is a paradox. It is on the rise. Clearly, this shows the value of real science, even when some only seem to like it when it suits their agenda.

But the field is exploding as companies want people working on all the raw data they collect. As one statistician stated: “I used to be a boring, underpaid geek with glasses, now that I’m a data scientist I’m cool, in demand and paid very well even if the work is less scientific.” That’s the nature of the beast. Her employer got a real statistician, but as the mines require a lot more bodies, many will make do with less.

Merchants

The “sure” money is in the supply chain. Storage, networking, compute … no matter where it is (cloud, fog or on-premises computing). There is money to be made with tools to process the data, protect it (backups) and secure it against unwanted prying eyes and theft. If you’re selling any of those, business is booming.

Everyone seems obsessed with collecting data. Luckily storage costs per GB are down and we can store ever more. We also need to protect more. But who ever deletes data? Who dares push the button? A lot of data is collected “just in case”. We might find gold in there later, and if we don’t have it, we certainly cannot. The fear of missing out in action. That is great if you’re selling stuff. Data lakes, data ponds, storage blobs or tables, Mongo DB or SQL PaaS, storage arrays, data processing technology and data protection. These can be products or services, it doesn’t matter, there is money to be made. And while you’re selling you’re not asking the buyers if they really need it. You don’t question them, you praise their insights and help them protect their investment. Everyone is doing it, so must you. The copy/paste strategy in action.

Nuclear data waste

While the vast growth in data is spectacular, a lot of it is crap. But there is very little effort put into being selective. It’s too cheap right now to collect and store it. No one wants to say “we don’t need it” and be the one to blame if you don’t have it.

But in the age of data leaks, hackers, privacy concerns and ever more legislation around data protection, it’s worth making sure you don’t store data just because you can. Storing data holds inherent risks: the risk of losing it, corrupting it, deriving faulty information from it, leaking it, having it stolen or abused.

In the age of GDPR and many other rightful privacy and data protection concerns, collecting data should be treated like nuclear power. The value it brings is undeniable. But you don’t need vast amounts of nuclear fuel to deliver that value. You do need very good processes, fail-safes, regulation, capable people and technology.

We should start looking at data as nuclear fuel and as such, after use and processing, part of it is left as toxic nuclear data waste. It’s a long-term toxic byproduct of the process of deriving information from data. Minimize the collection and storage of data to achieve your goals at minimum cost and risk. Luckily, we have a very good solution for toxic data waste: you can delete it and wipe it securely.

We have to stop thinking that more is better when much of it is junk. The overhead of caring for that junk is ridiculous. We might have to do so for nuclear waste out of need. But for data there are alternatives. Destroy it if you don’t need it. It’s the safest way to handle the legal and reputation risks related to it. That will take a conscious effort.

Efforts and costs

Critical thinking about collecting data is lacking. That is understandable: a lot of the money to be made in data mining is in providing the tools to collect, process, store and protect the data. Even with many people warning us of the security issues and legal responsibilities around it, it is often about selling services and products. For many, all this might turn out to be a lot like the other gold rushes. There were a lot more suppliers of tools that got rich than actual finders of profitable gold mines. This means there is also a lot of pressure and incentive to feed the “data is the new gold” beast.

Where a SQL database or a data warehouse at least meant you had to put effort into collecting the data, the rise of unstructured data technologies means that way too often we don’t care and will figure it out later. Imagine doing that with nuclear fuel! For now, the technical advances in storage and data technologies have allowed us to act without too much deliberation on the sanity of our choices. That might change, and it might be wise to avoid the cold shower when it does and benefit from minimizing toxic data risks today.

Conclusion

Now, true data gold is very valuable, but make sure you can recognize it. Just going through the motions, buying the tools and copying “in the know” statements from the internet isn’t going to cut it. That is called pretending. Sure, it’s fun. It is also a very dangerous and costly mistake when things get real. At best you look like an idiot with money. Many salespeople will separate you from your money very efficiently.

The smarter organizations already have a data strategy that includes waste avoidance, reduction and management. Many don’t, unfortunately. For those, collecting data is the main goal, driven by the tyranny of action over strategy. You have to be seen acting and being in charge. The buzzwords have to be present and you have to come across as a “can do sir, yes sir” person. Well, that is what will kill you. The late Norman Schwarzkopf knew this all too well.

Take care of your weaknesses; figure them out before they hurt you and before they destroy your ability to exploit your strengths. That, people, is a strategy exercise. I can do that for you and it will cost you a lot of money. But remember, strategies are not products you can buy; they are not commodities, and as such buying them is a paradox. A strategy is what will give you the edge over your competitors. If you have others determine your strategy, your competitors will pay them more to find out. So, roll up your sleeves and put in the effort yourself. In the end, it’s all about common sense, and this is true for data mining, AI and BI as well.