Why The Cloud is Risky

As many people know, we are in the age of the cloud.  The entire point of the cloud is to shorten implementation timelines and reduce costs.  Typically, with those savings come some risks.

Risks of the Cloud

  • Hidden Costs
  • Lack of Accountability
  • Security
  • Insider Threat
  • More Silos

The cloud isn’t all bad, but there’s an old saying: “junk in, junk out.”  If you move all your servers to the cloud, all you did was move your problems.  You still need to optimize your environment.  You still have vulnerabilities and things to patch – you just moved them into a shared environment.  A shared environment that is prone to the same vulnerabilities as your previous one, only now you don’t know what the underlying software and hardware are.

Hidden Costs

A one-time move can cost thousands of dollars!  You have data storage costs and network bandwidth to think of – things you previously handled in-house and took for granted.  You’re paying for long-term storage of data, plus the growth you accumulate every month.  Integrating apps becomes even more complex when you try to connect a cloud-hosted email provider with a different cloud-hosted UC solution.  Remember, the goal of moving to the cloud was to have that instant on-and-off capability.  This complicates things a little bit.  It’s not impossible, it’s just complicated!

Costs such as power and rent are not always reflected in IT budgets.  These are things you had before but took for granted.  Now all of those costs are embedded in the hosting provider’s price, and pulling power and floor space into the IT budget can inflate it.  To be frank, these are IT costs and should be included in every cost analysis!  If you purchase a server that consumes enough floor space to warrant a huge warehouse, that needs to be a factor in the purchase!

Lack of Accountability

I’m the most important customer!  I can easily hold my cloud provider accountable!  Well, I hate to break it to you, but that’s not true.  In September 2015, Amazon had a massive outage that took Netflix, Airbnb, Tinder, and IMDb offline.  It was huge!  Realize that these are high-dollar spenders that absolutely require their services to be online, or it will have a significant impact on their revenues.  There wasn’t an apology or a viable explanation for the outage.  Now think about government users, where an outage can cost lives or a mission!  I’m not trying to bash Amazon; it’s just a scary truth about shared resources that you have little to no control over.  I don’t know about you, but I like to be able to control my destiny.

Security

Data breaches are now everyday occurrences: Dropbox, Ashley Madison, OPM, ADP, IRS, and more.  The point is that it’s a serious threat!  Your virtual machines are no less prone to security vulnerabilities just because they’ve moved to a shared service provider.  You still have to patch and maintain all of your servers.  The real issue is that you inherit all the vulnerabilities of your cloud vendor’s software and hardware.  Who is liable when a security incident occurs?  What do you do?  Read the fine print.  Most cloud providers’ customer agreements say that if your operations are down because the provider has an outage, they are not responsible.

Insider Threat

Insider threat has never been more of an issue than it is today.  Data is available at lightning speed, and everyone keeps track of everything on everyone forever – finances, medical records, phone, text, social media, and more.  Who has access to what information is not only a concern for the government; it’s also a compliance concern under HIPAA and Sarbanes-Oxley.  You don’t want just anyone to have access to your private medical records, your financial records, or worse, the private conversation you were having with a spouse.  That is what’s at stake here.  Insider threat is where the cloud comes off the tracks.  Do you know exactly who has access to your data at the cloud provider?  Should they have access to it?  Do they have the proper training to handle the type of data you have?

More Silos

The biggest complaint I have about the cloud is that it creates yet another silo for your organization.  Or worse, it can become a virtual junk drawer to throw everything in and forget about.  As a small business, we internally have seven cloud applications that each have a username and password.  You have to manage all the provisioning of users, authentication, licensing, and more.  The silos that exist today don’t just go away because you moved to the cloud – they are amplified.  You now need a cloud authentication provider to manage all your user accounts.  You still have a networking team, a storage team, an Active Directory team, and a security team.  These things are all still compartmentalized.  It’s just virtual and not in your hands.

What do I do about the Cloud?

This post is not intended to make you think the cloud is entirely risky.  It’s intended to change your conversations about the cloud and make you think it through.  I personally like to control my own destiny.  If you like to control risks and be the owner of your destiny, then sign up today to see if you qualify for a free data center assessment.

Get Your Data Center Assessment Now

Should my desktop team or my server team manage VDI?

I have had customers take different approaches to determining who manages the virtual desktop infrastructure.  Enough customers have asked that it’s worth sharing the things you should consider when making the decision:

Politics and budget often determine who owns VDI

Well, yes, of course… politics.  I’ve seen this many times.  Let’s be clear: politics is the number one killer of VDI.  If funding from one team drives the need for VDI, that funding pushes you into a project.  Just know that lots of communication can solve problems before they become problems.  Otherwise, you will become Customer A or Customer B.

Customer A: The desktop team owns VDI

This is by far the worst situation.  The reason is that desktop admins have little experience managing servers and the least experience with all the other technologies that VDI needs to function properly – to name a few: DHCP, DNS, Active Directory (specifically AD Sites and Services and GPOs), network routing, and switching.  Before I get all the hate mail: this isn’t necessarily true for every organization; this post is about real-world customers.

Pros:

No one knows how to provision and manage a desktop like the desktop team, period. Seriously, no one!  This includes a virtual desktop image.

Cons:

The desktop team lacks experience managing enterprise services.  There are way too many things that go into VDI outside of the desktop and user devices to leave the entire VDI project to a desktop team.  Not to mention, I’ve seen so many instances where a server team or enterprise team is reluctant to help out.  This leaves you in a bad situation if, say, the problem is networking related.

Customer B: The enterprise team owns VDI

As a former enterprise architect, I am a little more at ease here, but we aren’t completely out of the woods yet.  A server or enterprise team has the least interest in, and experience with, managing a desktop.  Even a server admin who spent four or more years as a desktop admin before being promoted is now a few years removed from hands-on desktop experience.  We haven’t even mentioned the apps yet.  There are so many customizations required to make applications work – for compatibility reasons, patch reasons, and so on – and only the desktop team knows those fixes.

Pros:

Enterprise server teams typically have more years of experience and will be able to troubleshoot and fix issues as they arise that are beyond the desktop.

Cons:

With more years of experience comes a higher price tag.  You now have more expensive resources managing desktops.  Any hope of ROI goes out the window.  You’re not completely better off either.  You still need that desktop admin!  They know where all the bodies are buried when it comes to application provisioning and fixing those troublesome apps that your users have to have.

I heard lots of stuff, but what do I do?

You are already doing what works best in your environment today: desktop admins manage desktops, server admins manage servers, network admins manage network issues.  How does this translate to VDI?  You can’t do business as usual, so put that out of your mind.  You need a few members from every area of the environment to form your VDI team.  Having a dedicated team to manage VDI is a good idea, but it’s not always practical depending on the size of the environment.  If you have a large deployment, a dedicated team covering all areas of expertise is needed.  Either way, you need a lifeline into your network, desktop, and server teams.  It can’t be done in a silo; otherwise, all the problems you’re trying to solve with VDI just create other problems.

Conclusion

You need someone with experience as a desktop admin, server admin, and network admin to be successful at VDI.  Don’t try to be everything to everyone.  You will lack one skill set or another to do this in a vacuum.  Get everyone involved.  All the best and brightest will have plenty of opinions.  Listen to the opinions.  Get a VDI SME or a consultant to assist you in building your environment and you will be much better off.

Strategic Rowdiness: A business mantra that allows for creativity and growth

It took me a while to coin this phrase that we use internally at DH Technologies.

I like this term because it describes our company well.

We try to think big picture and always look to the future.  This allows us to focus on emerging technologies and evangelize the ones we believe in.  We are often called the loudest group in a crowd.

I don’t mind being with the loud group; in fact, I encourage it.

It’s ok to stand out!

I think in this day and age the outcasts are the ones who become the most popular and successful.

What does it mean to be strategically rowdy?

You have to think about things differently.  You have to look at things in their most basic form and try to get a better understanding. Question why it is the way it is.

I was taught to not blindly follow rules but to get a better understanding of the purpose of the rule.  Unfortunately, we all get hung up on hearsay and bad information, and it creates a bias that is difficult to overcome.

You most certainly CAN NOT do business as usual.

You have to anticipate the future!  Think critically!

Rebuke biases and current beliefs, both your own and your customers’.

Interpret the information at your disposal and look for opposing views.

Get someone to challenge your way of thinking so you can see different perspectives.

Educate yourself so you can be prepared.  Then go back and educate yourself again and again!  All of these things combined are an unorthodox way of thinking.  It’s ok to be different.

Finally, you have to take a stance! 

This is where the rowdiness comes in.  

You can’t stand short in a tall crowd.  You can’t be silent in a group of people.  You can’t watch as the pitch goes sailing past you.  Be noticed!  Be passionate!  Be a disruptor!  Get the word out! Be rowdy! Just don’t be rude!

We carefully orchestrated our moves in the first couple of years of business at DH Technologies.  We started with the right technologies.  We then focused on ensuring we have the right government contract vehicles for our customers.  We keep looking toward the future.  Maybe you can challenge our way of thinking or educate us on the next great thing.  Either way, we are proud of being rowdy and of encouraging creativity.  All of these attributes have contributed to our growth and success.

Thinking like this has earned us a few accolades, such as the #1 spot on CRN’s Fast Growth 150, CRN’s 2016 Solution Provider 500, and Washington Business Journal’s 2016 Top Small Technology Companies list.  Obviously, that’s just the beginning of great things to come.  We are hiring!  If this sounds like something you want to be a part of, apply for a position.  You can be part of our creativity!   Contact Us

The shortcomings of virtual desktops

I have never been one to shy away from controversy.  There are a number of things that make virtual desktops great!  Personally, we use virtual desktops internally to keep all our internal documents private and secure while providing a great desktop experience to our company.  It works for us.  We eat our own dog food to make sure that any issues we run into are solved before our customers run into them.  With that in mind, here are the biggest issues with VDI:

1. Users

This couldn’t be an honest document if I didn’t first address the elephant in the room.  Users cause a majority of the problems, right?  Well, that’s sort of true and also far from the truth.  Users, when not trained on how to use a technology, will get creative.  Creativity, when not directed, will lead to problems.

Example: a user says, “I couldn’t log in all day, so I didn’t get anything done.”

Thank God for logs!  Otherwise, this would have gotten me in trouble a long time ago.  This really happened!  A user decided to blame VDI for not being productive.  The worst of it was that the supervisor believed him/her, and it almost led to the end of a VDI pilot that was otherwise very successful.  You need to train the users.  Additionally, having a good tool to evaluate user login times and application launch times will help you identify a performance issue before a single help desk ticket is opened.
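
To make that last point concrete, here is a minimal sketch of the kind of check such a tool performs.  It assumes per-session login durations (in seconds) are already being collected somewhere; the sample data, window size, and threshold are purely illustrative and not from any real deployment.

    # Minimal sketch: flag logins that are much slower than the recent baseline.
    # Assumes per-session login durations (in seconds) are already being collected;
    # the sample data, window size, and threshold are illustrative only.
    from statistics import mean

    def flag_slow_logins(durations, window=50, factor=2.0):
        """Return (index, duration, baseline) for logins slower than factor * rolling mean."""
        flagged = []
        for i, duration in enumerate(durations):
            recent = durations[max(0, i - window):i] or [duration]
            baseline = mean(recent)
            if duration > factor * baseline:
                flagged.append((i, duration, baseline))
        return flagged

    # Example: a run of ~30-second logins with one 95-second outlier.
    samples = [28, 31, 29, 33, 30, 95, 32]
    for idx, dur, base in flag_slow_logins(samples):
        print(f"Login #{idx}: {dur}s vs ~{base:.0f}s baseline -- investigate before the tickets arrive")

A real monitoring product does far more than this, but even a crude baseline check like the one above surfaces the outlier login before the user walks over to their supervisor.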

2. Data Forensics

You’re wondering why I even brought this up, aren’t you?  Well, here’s the problem: data forensics for a non-persistent virtual desktop is a huge problem.  If a network is breached, whether through malicious intent or a user opening some bad link in an email, we need to track it and figure out what happened.  The problem is that these issues are often discovered hours or days later.  In a physical desktop environment, this isn’t a big deal.  You can remotely connect to the user’s computer and pull down the logs, or you can image their PC with something like FTK, including a binary dump of the hard drive and RAM contents, and do the analysis.  You can’t do that on non-persistent VDI.  Or can you???  It took a while to really solve this issue, and I’m not giving it away in a blog post, but I will be more than happy to have a discussion with any customers that have this concern.

To piggyback on the forensics issue, we had a customer whose user downloaded some terrible illegal pornography.  Yes, it happened at a government site!  NCIS showed up and asked to take the computer.  I am all for complying with military policy, but after explaining to a military police officer that a zero client is literally zero and would provide none of the things they were looking for, what do you do?  See the cliffhanger…  You have to contact me for the answer.  And no, I won’t tell you who the customer is.

3. The Network

This one is too easy!  If you’re the server guy/gal, it’s always the network.  If you’re the network guy, it’s always the server.  The truth of the matter is that if you don’t have a solid network, you won’t have a solid VDI.  Customer environment: the virtual desktops are down for everyone on the West Coast.  The server guy talks to the network guy: “Are there any network changes?  No?”  *hears keyboard typing*  He walks back to his desk…  Everything works again!  Let’s be clear: it was the network, the network team changed something in the middle of the day, and now it’s working.  This happens all the time.  Physical desktops can handle networking changes a little better.  They generally just need connectivity, and while things can go slow for a brief period of the day, it’s unlikely a user will open a ticket because their computer is sluggish.  Those reboots the help desk tells you to do also just buy some more time for the problem to fix itself.  Now fast forward to VDI: slow network equals poor user experience.  The best part is, the VDI team will get blamed and not the network team.  It’s all VDI that causes the problem, after all!

4. HBSS / Antivirus

It’s common knowledge that HBSS (Host Based Security System) will kill any desktop experience, physical or virtual, if not implemented correctly.  I have had my fair share of knock-down, drag-out fights with the HBSS team over a change made in the middle of the day that was thought to be benign and harmless.  How does this have anything to do with VDI?  Well, HBSS is a centrally managed suite of applications, such as a host-based firewall and antivirus.  The first concern is that if a policy gets pushed to a virtual desktop and it kills the ports and protocols needed to connect, everyone gets disconnected immediately.  Yes, it’s happened.  Additionally, antivirus policies that are typically deployed to physical environments want to scan everything opened, read, modified, and closed, and then run a full scan daily at a specific time.  In a physical desktop world with a thousand PCs, you have a thousand hard disks.  In a virtual desktop environment, you could have 100.  You have to treat those shared resources carefully, or you can inadvertently cause a denial of service on your network by doing something like running an antivirus scan in the middle of the day.  True story: a government customer I worked for once thought they were being hacked on an anniversary of 9/11 (not saying who) because the previous day they had implemented significant, untested HBSS changes that would check everything.  I was one of fifty people evaluating the “hack” and the only one who accurately identified the HBSS settings as the cause.  I should point out that every VDI deployment DH Technologies does comes with ports/protocols and network diagrams BEFORE an engineer comes onsite to eliminate these issues.  We also have an awesome document that explains how to deploy the HBSS agents for VDI while still provisioning the necessary framework.  All you have to do is contact us.
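
Here is a back-of-the-envelope sketch of why those simultaneous scans hurt so much on shared storage.  The per-desktop scan IOPS, desktop count, and datastore capacity are illustrative assumptions, not measurements from any customer environment.

    # Back-of-the-envelope sketch of why simultaneous antivirus scans hurt shared storage.
    # The per-desktop scan IOPS, desktop count, and datastore capacity are illustrative assumptions.
    def scan_load(desktops, iops_per_scan, concurrent_fraction):
        """Aggregate IOPS hitting shared storage while scans run."""
        return desktops * iops_per_scan * concurrent_fraction

    DATASTORE_IOPS = 40_000   # assumed usable IOPS of the shared array
    DESKTOPS = 1_000
    IOPS_PER_SCAN = 80        # assumed read IOPS generated by a full scan on one desktop

    all_at_once = scan_load(DESKTOPS, IOPS_PER_SCAN, concurrent_fraction=1.0)
    staggered = scan_load(DESKTOPS, IOPS_PER_SCAN, concurrent_fraction=0.05)

    print(f"Every desktop scanning at noon: {all_at_once:,.0f} IOPS "
          f"({all_at_once / DATASTORE_IOPS:.0%} of the array)")
    print(f"Scans staggered across the night: {staggered:,.0f} IOPS "
          f"({staggered / DATASTORE_IOPS:.0%} of the array)")

With the assumed numbers, a noon scan of every desktop demands roughly twice what the shared array can deliver, while a staggered schedule barely registers.  Swap in your own figures, but the shape of the problem stays the same.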

5. User Persona

User persona is unique to virtual desktops.  User persona is essentially anything that you changed or created on a desktop – basically your profile, but also registry keys, Outlook email signatures, printers, and so on.  When a persona isn’t set up correctly, or a small blip in Active Directory keeps the persona from processing, users get logged in with none of their data or settings.  This generally causes panic, and users think all their data is gone.  After all, that’s exactly what would have had to happen on a physical desktop for them to see that kind of scenario.  This is probably one of the most common issues I see regularly.  It happens in our corporate environment from time to time.  It’s usually caused by not properly checking a patched master image to confirm it still processes the user persona policy, or by an Active Directory GPO conflict.  Either is easily fixed.

6. Printers

Printers, in my opinion, are what’s wrong with the world.  They’re kryptonite.  What happens when your virtual desktop is in a data center 400 miles away and you want to print to a printer sitting in the same room as you?  The print job has to spool to a print server that’s hopefully in the data center and then travel all the way back to the printer right next to you.  This can make printing slower, and it definitely creates additional network traffic that you would never see with physical desktops.  Let’s not get too freaked out, though – there are lots of different ways to solve this, and to be honest it was more of an issue three years ago.  Solutions: ThinPrint, UniPrint, direct USB printing, location-based printing, and more…

7. Slow Login Times

This is by far my favorite complaint about VDI, because I can already tell you what caused it while knowing literally nothing about your environment.  First off, remember these virtual desktops are not physical desktops, so stop treating them like they are.  Every time a user logs into a non-persistent virtual desktop, it’s like their first login all over again: they get to walk through the out-of-box experience, profile setup, and lots of other first-time steps, and GPOs have to process (almost always set up incorrectly).  The VDI industry has solved this issue slowly but steadily.  Additionally, Liquidware Labs provides an awesome tool that will break down the login process and tell you exactly how much time it takes to find a domain controller, process GPOs, and so on, to determine the exact cause of slow login times.  Want to know the most common login-time killers?  GPOs, CA certificates (for smart card logins only), and printers.  Yes, printers kill login times too.
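
If you’ve never seen that kind of breakdown, here is an illustrative sketch of what one looks like.  The phase names and timings are made up for the example; they are not output from Liquidware Labs or from any real login.

    # Illustrative breakdown of a slow non-persistent VDI login.
    # Phase names and timings are made up for the example, not measured data.
    phases = {
        "Locate domain controller": 2.1,
        "Process GPOs": 38.4,
        "Load user profile / persona": 9.7,
        "Map printers": 21.3,
        "Check CA certificates (smart card)": 6.5,
    }

    total = sum(phases.values())
    print(f"Total login time: {total:.1f}s")
    for name, seconds in sorted(phases.items(), key=lambda item: item[1], reverse=True):
        print(f"  {name:<36} {seconds:>5.1f}s ({seconds / total:.0%})")

Laid out this way, the dominant culprits (GPOs and printers in this made-up example) are obvious, which is exactly why a per-phase view beats a single login-time number.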

Top 7 VDI Shortcomings

  1. Users
  2. Data Forensics
  3. The Network
  4. HBSS / Antivirus
  5. User Persona
  6. Printers
  7. Slow Login Times

 

How do you optimize your virtual desktop image, save up to 40% of your resources, and increase performance?

A non-optimized virtual desktop image consumes extra CPU, memory, network bandwidth, and IOPS.  Why?  Because Windows 7 and Windows 10 weren’t designed to be virtual desktop images.  Additionally, the applications themselves are greedy, selfish little apps that want to consume as many resources as they can with little regard for the other applications.  Not all apps are created equal – some are more important than others – but in the end, they all need to be evaluated to determine how many resources they consume and whether that’s acceptable.

First off, how do I even begin to evaluate resource usage?

While I try to make these posts agnostic to vendors, there is one particular vendor that does some amazing things when it comes to doing an assessment:  Liquidware Labs. We use this vendor exclusively to perform virtual desktop assessments of customer environments.  Additionally, we have built some custom tools to read the raw data and build a design document out of it.

Little Known Fact: Apps can be optimized too, and they should be.

Yep, that’s right, applications can be optimized too.  As a matter of fact, optimizing the applications can reclaim more resources than optimizing the Windows operating system.  Imagine this: a customer deploys 1,000 virtual desktops and doesn’t optimize them.  This happens all the time.  What would happen if we optimized the OS AND the apps and saved 40% of the resources?  You could get 400 virtual desktops for free!  Now imagine a customer environment with 50,000 virtual desktops.  Optimizing that environment would save enough resources to support an additional 20,000 desktops.
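
The arithmetic is simple enough to write down.  The 40% savings figure is the one quoted above, the deployment sizes are the same examples, and the assumption is that reclaimed resources can host additional desktops at the same per-desktop cost.

    # The arithmetic behind "optimize and get desktops for free".
    # The 40% savings figure is the one quoted above; deployment sizes are the same examples.
    def free_desktops(deployed, savings=0.40):
        """Extra desktops the reclaimed resources could host on the same hardware."""
        return int(deployed * savings)

    for deployed in (1_000, 50_000):
        print(f"{deployed:>6,} desktops optimized at 40% savings -> "
              f"roughly {free_desktops(deployed):,} additional desktops for free")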

Get your environment optimized today!

Contact us to have a consultant come on site and evaluate your environment.  Here’s what we look at:

  • Hypervisor hosts
  • Virtual desktop image
  • Applications

DH Tech’s Windows Optimization Guide – don’t sacrifice the look and feel of Windows to get a high-performing virtual desktop image.  Citrix and VMware tell you to turn off the awesome look of Windows 7/10 to get the performance you need.  We found a way to get the performance without the sacrifice.  We will help you optimize your desktop image and give you a copy of our optimization guide.

References:

Citrix Windows 10 Optimization guide for XenDesktop

VMware Windows 10 Optimization guide for VMware View

5 ways your VDI project can be more successful

Many organizations are moving to virtual desktops for a variety of reasons.  I have had the luxury to observe both successful and failed VDI projects.  For the first couple of years when we started our company, we made a majority of our money by saving failing VDI projects.  Over the years, I began to think: “What do all the successful VDI projects have in common?”  Well, here’s the list of things I came up with:

1. Buy In From the Top

You can’t force a new technology on users without buy-in from upper leadership.  Ideally, you have already aligned organizational goals with the capabilities and features of VDI.  I recommend a Requirements Traceability Matrix (RTM) to ensure all the requirements are met, but that’s for another post.  Upper management and leadership need to be on board with the changes that VDI introduces to any organization.  If upper leadership doesn’t believe in your mission and project goals, then what makes you think the users will?  If you are wondering why you need to care about what your users think, skip to point #2.

2. Communication, Communication, Communication

You have to be truly transparent with your users and leadership about what your plan is and how it will benefit your organization.  There are countless benefits that virtual desktops provide your users, but if you can’t easily articulate them to your users, you will have a rocky project.

Provide pamphlets, computer-based training, and user outreach for the end users.  If you show users the benefits of something simple such as session persistence, which lets them move from device to device without needing to log in and out, you will immediately win over a vast majority of them.  One of the most successful outreach events we led was in a government cafeteria.  I’ve always been a fan of lunch and learns.  We had a line of twenty or more people, and it quickly grew longer and longer.  Not only did this educate the end users, it also got our government customer extra funding for his project.

What doesn’t VDI touch?  No seriously, what does it not impact?  VDI changes the user devices, network, data center footprint, energy usage (reduces), applications, licensing, management, troubleshooting, provisioning, and more.  This is just another case of why you need to communicate with your users and all the other departments.

3. Pick the best, simplest technology that’s highly scalable

All successful VDI technologies share two attributes: they are scalable and simple.  Don’t use ten different technologies when five will do the job.  Don’t use five technologies when three will do just fine.  You really have to keep it simple.  Why, you might ask?  Well, if I have ten technologies, I have users and administrators trained in ten different things.  I also have ten different things that could fail at some point, which increases trouble tickets for the help desk.  Not to mention the help desk’s troubleshooting decision tree becomes long and complex, which increases the time it takes to close a ticket.  I’m not saying this is always true, but generally speaking it is.  We have been leading our VDI deployments with hyper-converged solutions, which take the complexity out of deploying VDI.  How?  We eliminate most of the installation time because the hyper-converged solutions we deploy are installed in an automated way, which cuts install times down to hours instead of days or weeks.  Additionally, I don’t need a SAN admin or someone to do zoning or masking.  Not to mention it’s highly scalable and predictable, which makes it easy to size for small, medium, and large enterprise deployments.  Ask for customer references before you choose a technology!  You can thank me later.

4. Choose the right integrator

This is where things get tricky.  There are two different perspectives on this, and I have a bias that needs to be addressed: I am an integrator.  Let’s get that out of the way.  Obviously, I would prefer you to use our services.  You may prefer to do the work yourself for financial or political reasons.  Let me explain why you should consider using an integrator for at least some of the work:

There’s a trade-off between user experience and technology, and balancing it is a work of art.  You need to always think about the user experience in order to have a successful implementation.  It’s not something you typically think about when deploying a new server; it’s something you need to constantly think about when moving to VDI.  How will this impact the users’ experience?  You need to communicate changes to users regularly, and always before they happen.

User experience is derived from look and feel AND performance.

Examples:

  • user experience is different from Windows 7 to Windows 10
  • user experience changes between Office 2013 and Office 365
  • user experience is different from 1 CPU at 1 GHz and 2 CPUs at 2 GHz
  • user experience is different from 2GB RAM and 4GB RAM
  • user experience is different from a software GPU vs a virtual GPU
  • user experience is impacted by login times, printing times

I can install this myself; I have done virtualization before

On a recent VDI project, I was able to determine within five minutes that a WAN link was insufficiently sized and would cause a problem.  There are things that someone with experience can quickly pick up – the telltale indicators.  Let’s be clear: I’m certain you can install the hypervisor stack if you or your team have done it before.  After all, I just said we choose simple, automated, hyper-converged solutions.  But as someone who has overseen more than a hundred virtual desktop solutions, I can say with confidence there are many differences between a server virtualization project and a desktop virtualization project.  Users will see everything you do, and if things run a little slow for even a short period during the day, you will get several help desk calls and tickets.  You typically don’t have user profiles on servers, and user profiles account for a majority of those calls and tickets.  You don’t want a good solution for user persona management, you want the best!
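
For what it’s worth, that five-minute WAN check is mostly arithmetic.  Here is a rough sketch of it; the per-session bandwidth, user count, link size, and headroom factor are illustrative assumptions, not a sizing standard.

    # Rough check for an undersized WAN link serving remote VDI users.
    # The per-session bandwidth, user count, link size, and headroom factor are assumptions,
    # not a sizing standard -- replace them with numbers from an assessment.
    def required_mbps(users, kbps_per_session, headroom=1.3):
        """Estimated WAN bandwidth (Mbps), with headroom for bursts like printing or patching."""
        return users * kbps_per_session / 1000 * headroom

    LINK_MBPS = 100          # assumed WAN link at the remote site
    USERS = 400              # assumed concurrent users behind that link
    KBPS_PER_SESSION = 350   # assumed average display-protocol bandwidth for office work

    need = required_mbps(USERS, KBPS_PER_SESSION)
    print(f"Estimated need: {need:.0f} Mbps vs link capacity: {LINK_MBPS} Mbps")
    if need > LINK_MBPS:
        print("The link is undersized -- expect a poor user experience and help desk tickets.")

Experience is what tells you which numbers to plug in and which bursts to plan for; the math itself is the easy part.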

So what’s the solution then?

Any good consultant can help build a plan and leverage your team’s abilities without a crazy bill.  There are several good approaches to building out the environment: leverage an integrator for a full turn-key deployment, or take a hybrid approach and pair your team with a consultant to build the environment out.  Either way, do not try to do VDI without someone who has done it before.  You will make mistakes – it’s inevitable – and a failed VDI pilot is the quickest way to kill any hope of deploying VDI in your organization.  Besides, a subject matter expert not only helps the project; your team also benefits from the on-the-job training they receive.  Think of it as a safety net for your architecture and deployment plan.

The design difference

A server virtualization project is designed from the data center out to the edge.  A properly designed desktop virtualization project is designed from the user to the data center.  You start with use cases and a desktop virtualization assessment, then work toward the data center.  This helps you size the environment, determine whether network segments are sized appropriately, determine application requirements, and so on.  Not one successful VDI project of more than 500 users in the past five years has been done without performing an assessment.
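
As a taste of what “from the user to the data center” looks like in practice, here is a minimal sizing sketch.  The use cases, per-desktop resources, host specs, and overcommit ratio are all illustrative assumptions; a real assessment replaces every one of these numbers with measured data.

    # Minimal sizing sketch working from use cases toward the data center.
    # Every number here is an illustrative assumption that a real assessment would replace.
    import math

    use_cases = {
        # name: (user count, vCPUs per desktop, GB of RAM per desktop)
        "Task worker": (600, 2, 4),
        "Knowledge worker": (300, 2, 8),
        "Power user": (100, 4, 16),
    }

    HOST_CORES = 32       # assumed physical cores per host
    HOST_RAM_GB = 512     # assumed RAM per host
    VCPU_PER_CORE = 5     # assumed vCPU-to-core overcommit for desktop workloads

    total_vcpus = sum(users * vcpus for users, vcpus, _ in use_cases.values())
    total_ram_gb = sum(users * ram for users, _, ram in use_cases.values())

    hosts_by_cpu = math.ceil(total_vcpus / (HOST_CORES * VCPU_PER_CORE))
    hosts_by_ram = math.ceil(total_ram_gb / HOST_RAM_GB)
    hosts_needed = max(hosts_by_cpu, hosts_by_ram) + 1   # one extra host for failover headroom

    print(f"{total_vcpus} vCPUs and {total_ram_gb} GB of RAM across all use cases")
    print(f"Hosts needed: {hosts_needed} (CPU-bound: {hosts_by_cpu}, RAM-bound: {hosts_by_ram}, +1 failover)")

Notice the direction of the calculation: user populations drive desktop resources, which drive host counts, which in turn drive the data center footprint – not the other way around.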

5. Change Management

Yikes!!!  Seriously, this is more important than you think.  I have been to countless customer environments where I was told, “The system runs slow, fix it.”  You can’t make a little change in VDI without it having a huge impact.  For example:

  • Windows patch – no one can log in anymore (this happened on a Patch Tuesday)
  • Application patch – print to PDF stops working for every user who relies on it (not a VDI problem)
  • Network update – a simple change moving users on a segment to an MPLS network drops the MTU size just a little bit, and now everyone on that segment can’t connect (this happened to a customer!)
  • Recomposed the desktop = BOOT STORM (let’s be honest, that was a problem five years ago)

Don’t fret – VDI can ease a lot of these issues.  If you mess up a desktop image, you can simply revert to a previous snapshot and point new logins back to the previous image.  Same for applications: you can push the previous version back to users.  You’re kind of hosed on the network update, though, if you don’t have an easy way to undo that change.

You saw the part where VDI touches everything, so make sure VDI is a priority during other enterprise changes so that something that appears to be simple doesn’t have an unintended impact to your VDI deployment.

Get a test system

You should definitely get a test system to perform updates and test patches on.  If you choose the right technology, you can get a scaled down version of what you deployed to do all your tests on.  The goal is to reduce outages and trouble tickets without increasing the bill.

Conclusion

VDI can make your end users and organization more agile, able to meet new demands and keep up with the ever-changing world we live in.  It’s a lot harder to implement these new capabilities and features without upper management buying in on the project.  Constant communication keeps both users and administrators happy.  Taking the time to evaluate and choose the right technology will not only make the difference in success, it will also have an impact on training and administration.  Choosing an integrator with hands-on experience brings the subject matter expertise that is vital to speeding up a deployment and making it a success.  Change management isn’t sexy, but it’s what keeps a great technology running long-term with little to no service interruption.

 

Back to the Future

Today is October 21st, 2015 – the day Doc brought Marty McFly and Jennifer Parker out of 1985 to save their future children.  In the movie’s span of 30 years from 1985 to 2015, a lot changed: hoverboards became a reality, everyone owns a flying car, and fashion dictates sharp, metallic apparel (as is true of all movies set in the future).  In real life, you would certainly have expected at least one of these to come to fruition.

Lexus claims they’ve created a hoverboard; however, after this hoax in 2014, I’m not sure I can open my heart up again to the possibility of a real hoverboard.  Flying cars, unfortunately, aren’t as accessible as they are in the film, and I don’t know if mini-Cessnas, however cool they look, qualify as practical commuter vehicles.  As for the metallic clothing of the future, it’s reserved for celebrities, and I don’t think many will hop on board with the style.  But in the world of technology, specifically virtualization, leaps and bounds have been made over the last 30 years that have changed the way we see our servers, our storage, and our SAN (or lack thereof).

In fact, 30 years ago, the setup for virtualization involved closing your eyes and pretending you were somewhere else – a setup that certainly still works in practice, but is becoming less and less necessary thanks to the incredible advances that companies like Dell, VMware, Nutanix, and Arista are making.  The data center isn’t what it used to be – even 15 years ago, a giant warehouse full of servers built from floppy disks and spare car parts would have been considered a technological blessing.  New technologies are coming out all the time that seamlessly integrate with our everyday business.

What are our predictions for the future?  Over the next ten years, we believe complex infrastructure will be a thing of the past.  There will be no need for multiple hops between machines, which will reduce bottlenecks within the data center.  Enterprise-class solutions will be readily available to anyone who needs a virtualized application that saves rack space and can easily scale to fit any project.  An invisible infrastructure will create an environment where the end user doesn’t need to think about any of the behind-the-scenes work.

You won’t need a nuclear reactor to generate the proverbial 1.21 gigawatts of energy to jump into the future, because the future of virtualization technology will come quicker than you think – no flux capacitor required.

DH Tech at Nutanix .NEXT

This past week Nutanix held its first annual .NEXT user conference.  Over 1,000 customers, partners, and employees were in attendance.  The biggest announcement from the hyperconvergence company was Nutanix Acropolis, their KVM-based hypervisor, and the Nutanix Xtreme Computing Platform (XCP), which is powered by Acropolis and the Nutanix Prism UI.  Many see this as a slap in the face to VMware (who was noticeably not in attendance); others see it as the next step in the software-defined evolution of enterprise-level technology.  It’s hard to tell where announcements like this will take us as an industry.

Nutanix Federal Partner of the Year – DH Technologies

I used to call myself a server guy, a VMware guy, or more recently a virtualization guy.  As technologies converge (see what I did there), I don’t think I can use those labels anymore.  I like the term technologist.  It’s fitting for the “Swiss Army knives” we’ve become.  The traditional silos are breaking down in IT, and as our customers are required to wear multiple hats, so are we.  During the conference, Nutanix CEO Dheeraj Pandey said it best: “While our competitors focus on us, we are focusing on our customers.”

This year we were awarded the Nutanix Federal Partner of the Year award for the second year in a row. Customer focus remains number one here at DH Tech.