The shortcomings of virtual desktops

I have never been one to shy away from controversy.  There are a number of things that make virtual desktops great!  Personally, we use virtual desktops internally to keep our documents private and secure while providing a great desktop experience to our company.  It works for us.  We eat our own dog food so that any issues we run into are solved before our customers run into them.  With that in mind, here are the biggest issues with VDI:

1. Users

This couldn’t be an honest document if I didn’t first address the elephant in the room.  Users cause a majority of the problems, right?  Well, that’s sort of true, and also far from the truth.  Users who aren’t trained on a technology will get creative, and undirected creativity leads to problems.

Example: a user says, “I couldn’t log in all day, so I didn’t get anything done.”

Thank God for logs!  Otherwise this would have gotten me in trouble a long time ago.  This really happened: a user decided to blame VDI for not being productive.  The worst of it was that the supervisor believed the user, and it almost led to the end of a VDI pilot that was otherwise very successful.  You need to train the users.  Additionally, a good tool for evaluating user login times and application launch times will help you identify a performance issue before a single help desk ticket is opened.
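As a sketch of what that kind of tooling does under the hood, here's a minimal example.  The record format and the 60-second threshold are illustrative assumptions, not any vendor's actual API; a real monitoring tool reads this from event logs or its own agent:

```python
# Minimal sketch: flag slow VDI logins from exported event data.
# The record format here is hypothetical; adapt it to whatever your
# monitoring tool (or the Windows event log) actually exports.

SLOW_LOGIN_THRESHOLD_S = 60  # assumed baseline; tune to your environment

def flag_slow_logins(records, threshold=SLOW_LOGIN_THRESHOLD_S):
    """records: iterable of (user, login_seconds) tuples."""
    return [(user, secs) for user, secs in records if secs >= threshold]

logins = [("alice", 22), ("bob", 95), ("carol", 31), ("dave", 140)]
print(flag_slow_logins(logins))  # -> [('bob', 95), ('dave', 140)]
```

Run something like this on a schedule and you can reach out to bob and dave before either of them writes "I couldn't log in all day" in an email to their supervisor.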

2. Data Forensics

You’re wondering why I even brought this up, aren’t you?  Well, here’s the problem: data forensics on a non-persistent virtual desktop is a huge problem.  If a network was breached, whether by a malicious actor or by a user opening a bad link in an email, we need to track it and figure out what happened.  The problem is that these issues are often discovered hours or days later.  In a physical desktop environment, this isn’t a big deal.  You can remotely connect to the computer and pull down the logs, or you can image the PC with something like FTK, including a binary dump of the hard drive and RAM contents, and do the analysis.  You can’t do that on non-persistent VDI.  Or can you?  It took a while to really solve this issue.  I’m not giving it away in a blog post, but I will be more than happy to have a discussion with any customers that have a concern.

To piggyback on the forensics issue, we had a customer whose user downloaded some terrible illegal pornography.  Yes, it happened at a government site!  NCIS showed up and asked to take the computer.  I am all for complying with military policy, but after explaining to a military police officer that a zero client is literally zero and would provide none of the things they were looking for, what do you do?  See the cliffhanger…  You have to contact me for the answer.  And no, I won’t tell you who the customer is.

3. The Network

This one is too easy!  If you’re the server guy/gal, it’s always the network.  If you’re the network guy, it’s always the server.  The truth of the matter is that if you don’t have a solid network, you won’t have a solid VDI.  Customer environment: the virtual desktops are all down for everyone on the west coast.  The server guy talks to the network guy: “Are there any network changes? No?” *hears keyboard typing*  He walks back to his desk…  Everything works again!  Let’s be clear: it was the network, and the network team changed something in the middle of the day.  This happens all the time.  You have to realize that physical desktops handle networking changes a little better.  You generally just need connectivity, and while things can go slow for a brief period of the day, it’s unlikely a user will open a ticket just because their computer is slow.  Those reboots the help desk tells you to do also just buy some more time for the problem to fix itself.  Now fast forward to VDI: a slow network equals a poor user experience.  The best part is that the VDI team will get blamed, not the network team.  It’s all VDI that causes the problem, after all!

4. HBSS / Antivirus

It’s common knowledge that HBSS (Host Based Security System) will kill any desktop experience, physical or virtual, if not implemented correctly.  I have had my fair share of knock-down drag-outs with the HBSS team for making a change in the middle of the day that was thought to be benign.  How does this have anything to do with VDI?  Well, HBSS is a centrally managed suite of applications, such as a host-based firewall and antivirus.  The first concern is that if a policy gets pushed to a virtual desktop and kills the ports and protocols needed to connect, everyone gets disconnected immediately.  Yes, it’s happened…  No customer example necessary.  Additionally, antivirus policies that are typically deployed to physical environments want to scan everything opened, read, modified, and closed, and do the same thing daily at a specific time.  In a physical desktop world with a thousand PCs, you have a thousand hard disks.  In a virtual desktop environment, you could have 100 hard disks backing those same thousand desktops.  You have to treat those shared resources carefully, or you can inadvertently cause a denial of service on your network by doing something like running an antivirus scan in the middle of the day.  True story: a government customer I worked for once thought they were being hacked on an anniversary of 9/11 (not saying who) because the previous day they had implemented significant and untested HBSS changes that would check everything.  I was one of fifty people evaluating the “hack” and the only one who accurately identified the cause as the HBSS settings.  I should point out that every VDI deployment DH Technologies does comes with ports/protocols and network diagrams BEFORE an engineer comes onsite, to eliminate these issues.  We also have a document that explains how to deploy the HBSS agents for VDI while still provisioning the necessary framework.  All you have to do is contact us.
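The shared-disk math is easy to sketch.  All the numbers below (per-scan IOPS, array budget) are illustrative assumptions, but the shape of the problem is real: simultaneous scans multiply, staggered scans don't:

```python
# Back-of-envelope: why a midday full AV scan on every desktop at once
# is a denial of service on shared storage. Numbers are illustrative
# assumptions, not measurements from any particular array.

DESKTOPS = 1000
IOPS_PER_SCAN = 50          # assumed read IOPS a full scan drives per desktop
ARRAY_IOPS_BUDGET = 20000   # assumed usable IOPS of the shared array

def scan_storm_load(concurrent_desktops, iops_per_scan=IOPS_PER_SCAN):
    return concurrent_desktops * iops_per_scan

demand = scan_storm_load(DESKTOPS)
print(demand)                          # 50000 IOPS demanded at once
print(demand > ARRAY_IOPS_BUDGET)      # True -> every desktop crawls

# Staggering the same scans across 10 randomized windows keeps demand sane:
print(scan_storm_load(DESKTOPS // 10) <= ARRAY_IOPS_BUDGET)  # True
```

On a thousand physical PCs the same scan spreads across a thousand independent spindles, which is exactly why a policy that was fine physically melts a VDI environment.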

5. User Persona

User persona is unique to virtual desktops.  User persona is essentially anything that you changed or created on a desktop.  It’s basically your profile, but it also includes registry keys, Outlook email signatures, printers, etc.  When a user persona isn’t set up correctly, or a small blip in Active Directory causes the persona not to process correctly, users get logged in with none of their data or settings.  This generally causes panic, and users think all their data is gone.  After all, on a physical desktop, that’s exactly what would have had to happen for them to see that scenario.  This is probably one of the most common issues I see.  It happens in our corporate environment from time to time.  It’s usually caused by not checking a patched master image to see if it still processes the user persona policy properly, or by an Active Directory GPO conflict.  Easily fixed.

6. Printers

Printers, in my opinion, are what’s wrong with the world.  They’re kryptonite.  What happens when your virtual desktop is in a data center 400 miles away and you want to print to a printer sitting in the same room as you?  The print job has to spool to a print server that’s hopefully in the data center and then travel all the way back to the printer sitting right next to you.  This can make printing slower, and it will definitely create additional network bandwidth that you would never see with physical desktops.  Let’s not get too freaked out, though.  There are lots of different ways to solve this issue, and to be honest, it was more of an issue three years ago.  Solutions: ThinPrint, UniPrint, direct USB printing, location-based printing, and more…

7. Slow Login Times

This is by far my favorite complaint about VDI, because I can already tell you what caused it while knowing literally nothing about your environment.  First off, remember that these virtual desktops are not physical desktops, so stop treating them like they are.  Every time a user logs into a non-persistent virtual desktop, it’s like the first time they’ve ever logged on: they get to walk through the out-of-box experience, profile setup, etc.  GPOs have to process (almost always set up incorrectly), along with lots of other first-time things.  The VDI industry has solved this issue slowly but steadily.  Additionally, Liquidware Labs provides an awesome tool that will break down the login process to tell you exactly how much time it takes to find a domain controller, process GPOs, etc., to determine the exact cause of slow login times.  Do you know the most common login time killers?  GPOs, CA certificates (for smart card logins only), and printers.  Yes, printers also kill login times.
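Once you have per-phase timings, whether from a tool like that or from event logs, finding the killer is just a sort.  The phase names and durations below are made up for illustration:

```python
# Sketch: rank login phases by elapsed time to find the biggest
# contributor. Phase names and durations are illustrative; a login
# analysis tool breaks the real login into phases like these for you.

def rank_login_phases(phases):
    """phases: dict of phase name -> seconds; returns worst-first list."""
    return sorted(phases.items(), key=lambda kv: kv[1], reverse=True)

login = {
    "find domain controller": 2.1,
    "process GPOs": 38.5,
    "load user profile": 9.7,
    "map printers": 21.3,
    "check CA certificates": 6.0,
}
worst, seconds = rank_login_phases(login)[0]
print(worst, seconds)  # process GPOs 38.5
```

In this made-up breakdown, GPO processing and printer mapping together dwarf everything else, which matches the pattern I see in the field.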

Top 7 VDI Shortcomings

  1. Users
  2. Data Forensics
  3. The Network
  4. HBSS / Antivirus
  5. User Persona
  6. Printers
  7. Slow Login Times


How do you optimize your virtual desktop image to save up to 40% of resources and increase performance?

A non-optimized virtual desktop image consumes additional CPU, memory, network bandwidth, and IOPS.  Why?  Because Windows 7 and Windows 10 weren’t created to be virtual desktop images.  Additionally, the applications themselves are all greedy, selfish little apps that want to consume as many resources as they can with little regard for other applications.  Now, not all apps are created equal.  Some are more important than others.  But in the end, they all need to be evaluated to determine how many resources they consume and whether that’s acceptable.

First off, how do I even begin to evaluate resource usage?

While I try to make these posts agnostic to vendors, there is one particular vendor that does some amazing things when it comes to doing an assessment:  Liquidware Labs. We use this vendor exclusively to perform virtual desktop assessments of customer environments.  Additionally, we have built some custom tools to read the raw data and build a design document out of it.

Little Known Fact: Apps can be optimized too, and they should be.

Yep, that’s right, applications can be optimized too.  As a matter of fact, optimizing the applications can free up more resources than optimizing the Windows operating system.  Imagine this: a customer deploys 1000 virtual desktops and doesn’t optimize them.  This happens all the time.  What would happen if we optimized the OS AND the apps and saved 40% of the resources?  You could get 400 virtual desktops’ worth of capacity for free!  Now imagine a customer environment with 50,000 virtual desktops.  Optimizing that environment would save enough resources to support an additional 20,000 desktops.
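The arithmetic behind those numbers is simple; here's a sketch that assumes, as in the example above, that the freed resources translate one-for-one into additional desktops' worth of capacity:

```python
# The "free desktops" arithmetic from the example above: optimization
# that frees a fraction of total resources frees that many desktops'
# worth of capacity. The 40% figure is the assumed savings rate.

def freed_capacity(deployed, fraction_saved):
    """Desktops' worth of resources freed by optimization."""
    return int(deployed * fraction_saved)

print(freed_capacity(1000, 0.40))   # 400
print(freed_capacity(50000, 0.40))  # 20000
```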

Get your environment optimized today!

Contact us to have a consultant come on site and evaluate your environment.  Here’s what we look at:

  • Hypervisor hosts
  • Virtual desktop image
  • Applications

DH Tech’s Windows Optimization Guide – Don’t sacrifice the look and feel of Windows to get a high-performing virtual desktop image.  Citrix and VMware tell you to turn off the awesome look of Windows 7/10 to get the performance you need.  We found a way to get that performance without the sacrifice.  We will help you optimize your desktop image and give you a copy of our optimization guide.


Citrix Windows 10 Optimization guide for XenDesktop

VMware Windows 10 Optimization guide for VMware View

5 ways your VDI project can be more successful

Many organizations are moving to virtual desktops for a variety of reasons.  I have had the luxury to observe both successful and failed VDI projects.  For the first couple of years when we started our company, we made a majority of our money by saving failing VDI projects.  Over the years, I began to think: “What do all the successful VDI projects have in common?”  Well, here’s the list of things I came up with:

1. Buy In From the Top

You can’t force a new technology on users without buy-in from upper leadership.  Ideally, you have already aligned organizational goals with the capabilities and features of VDI.  I recommend a Requirements Traceability Matrix (RTM) to ensure all the requirements are met, but that’s for another post.  Upper management and leadership need to be on board with the changes that VDI introduces to any organization.  If upper leadership doesn’t believe in your mission and project goals, what makes you think the users will?  If you are wondering why you need to care about what your users think, skip to point #2.

2. Communication, Communication, Communication

You have to be truly transparent with your users and leadership about what your plan is and how it will benefit your organization.  There are countless benefits that virtual desktops provide your users, but if you can’t easily articulate them, you will have a rocky project.

Provide pamphlets, computer-based training, and user outreach for the end users.  If you show users the benefits of something simple such as session persistence, which lets you move from device to device without needing to log in and out, you will immediately win over a vast majority of the users.  One of the most successful outreach events we led was in a government cafeteria.  I’ve always been a fan of lunch and learns.  We had a line of twenty or more people, and it was quickly getting longer and longer.  Not only did this educate the end users, it also got our government customer extra funding for his project.

What doesn’t VDI touch?  No seriously, what does it not impact?  VDI changes the user devices, network, data center footprint, energy usage (reduces), applications, licensing, management, troubleshooting, provisioning, and more.  This is just another case of why you need to communicate with your users and all the other departments.

3. Pick the best and most simple technology that’s highly scalable

All successful VDI technologies share two attributes: they are scalable and simple.  Don’t use ten different technologies when five will do the job.  Don’t use five technologies when three will do just fine.  You really have to keep it simple.  Why, you might ask?  Well, if I have ten technologies, I have users and administrators trained in ten different things.  I also have ten different things that can fail at some point, which increases trouble tickets for the help desk.  Not to mention the help desk’s troubleshooting decision tree becomes complex and long, which increases the time it takes to close a ticket.  I’m not saying this is always true, but generally speaking it is.  We have been leading our VDI deployments with hyper-converged solutions, which take the complexity out of deploying VDI.  How?  We eliminate most of the installation time because the hyper-converged solutions we deploy install in an automated way, cutting install times down to hours instead of days or weeks.  Additionally, I don’t need a SAN admin or someone to do zoning or masking.  Not to mention it’s highly scalable and predictable, which makes it easy to size for small, medium, and large enterprise deployments.  Ask for customer references before you choose a technology!  You can thank me later.

4. Choose the right integrator

This is where things get tricky.  There are two different perspectives on this, and I have a bias that needs to be addressed: I am an integrator.  Let’s get that out of the way.  Obviously I would prefer you to use our services.  You may prefer to do the work yourself for financial or political reasons.  Let me explain why you should consider using an integrator for at least some of the work:

Balancing user experience and technology is a work of art.  You need to always think about the user experience in order to have a successful implementation.  It’s not something you typically think about when deploying a new server, but it’s something you need to constantly think about when moving to VDI.  How will this impact the users’ experience?  You need to communicate changes to users regularly, and always before they happen.

User experience is derived from look and feel AND performance.


  • User experience is different between Windows 7 and Windows 10
  • User experience changes between Office 2013 and Office 365
  • User experience is different between 1 CPU at 1 GHz and 2 CPUs at 2 GHz
  • User experience is different between 2 GB RAM and 4 GB RAM
  • User experience is different between a software GPU and a virtual GPU
  • User experience is impacted by login times and printing times

I can install this myself, I have done virtualization before

On a recent VDI project I was able to determine within five minutes that a WAN link was insufficiently sized and would cause a problem.  There are things that someone with experience can pick up quickly; it’s the telltale indicators.  Let’s be clear: I’m certain you can install the hypervisor stack if you or your team have done it before.  After all, I just said we chose a simple and automated solution by leveraging hyper-convergence.  But as someone who has overseen more than a hundred virtual desktop solutions, I can say with confidence there are many differences between a server virtualization project and a desktop virtualization project.  Users will see everything you do, and if things run a little slow for even a short period of the day, you will get several help desk calls/tickets.  You typically don’t have user profiles on servers, and user profiles account for a majority of the calls/tickets.  You don’t want a good solution for user persona management, you want the best!

So what’s the solution then?

Any good consultant can help build a plan and leverage your team’s abilities without a crazy bill.  There are several good approaches to building out the environment: leverage an integrator for a full turn-key deployment, or take a hybrid approach and pair your team with a consultant to build the environment out.  Either way, do not try to do VDI without someone who has done it before.  You will make mistakes.  It’s inevitable, and a failed VDI pilot is the quickest way to kill any hope of deploying VDI in your organization.  Besides, a subject matter expert not only helps the project; your team also benefits from the on-the-job training they will receive.  Think of it as a safety net for your architecture and deployment plan.

The design difference

A server virtualization project is designed from the data center out to the edge.  A properly designed desktop virtualization project is designed from the user back to the data center.  You start with use cases and a desktop virtualization assessment, working toward the data center.  This will help you size the environment, determine whether network segments are sized appropriately, determine application requirements, etc.  In my experience, not one successful VDI project for more than 500 users in the past five years has been done without performing an assessment.
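Here's a tiny sketch of one such user-to-data-center check: does a site's WAN link carry its concurrent display-protocol traffic?  The per-session bandwidth, concurrency, and headroom figures below are assumptions for illustration; the real values come out of the assessment:

```python
# Sketch of a WAN sizing check, designed from the user back to the
# data center. All inputs are assumed assessment figures, not vendor
# guarantees: sessions-per-site, concurrency, kbps per display session.

def link_ok(users, concurrency, kbps_per_session, link_mbps, headroom=0.8):
    """True if peak session traffic fits within `headroom` of the link."""
    peak_kbps = users * concurrency * kbps_per_session
    return peak_kbps <= link_mbps * 1000 * headroom

# 300 branch users, 90% concurrent, ~250 kbps per session
print(link_ok(300, 0.9, 250, 100))  # True  -> a 100 Mbps link fits
print(link_ok(300, 0.9, 250, 50))   # False -> undersized; fix before go-live
```

A five-minute calculation like this is exactly how an undersized WAN link gets spotted before it becomes a pilot-killing outage.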

5. Change Management

Yikes!!!  Seriously, this is more important than you think.  I have been to countless customer environments where I was told, “The system runs slow, fix it.”  You can’t make a little change in VDI without it having a huge impact.  For example:

  • Windows patch – no one can log in anymore (this happened during Patch Tuesday)
  • Application patch – all users that rely on print-to-PDF are broken (not a VDI problem)
  • Network update – a simple update moving users on a segment to an MPLS network drops the MTU size just a little bit, and now everyone on that network segment can’t connect (this happened to a customer!)
  • Recomposed the desktop = BOOT STORM (let’s be honest, that was a problem five years ago)
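That MTU bullet is worth a sketch, because the failure is pure arithmetic: tunnel encapsulation eats into the 1500-byte Ethernet MTU, and packets sized for the old payload limit suddenly fragment or drop.  The 42-byte overhead below is an illustrative assumption, not the customer's actual encapsulation:

```python
# Why a "small" MTU change disconnects everyone: encapsulation overhead
# shrinks the usable payload, and display-protocol packets built for the
# old limit no longer fit. Overhead value is illustrative.

ETHERNET_MTU = 1500
IP_HEADER = 20
UDP_HEADER = 8

def max_payload(mtu, tunnel_overhead=0):
    """Largest UDP payload that fits in one frame without fragmenting."""
    return mtu - tunnel_overhead - IP_HEADER - UDP_HEADER

print(max_payload(ETHERNET_MTU))      # 1472 -> fine before the change
print(max_payload(ETHERNET_MTU, 42))  # 1430 -> after encapsulation is added

# A protocol still sending 1460-byte payloads now breaks:
print(1460 <= max_payload(ETHERNET_MTU, 42))  # False
```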

Don’t fret – VDI can ease a lot of these issues.  If you mess up a desktop image, you can simply revert to a previous snapshot and point all new logins at the previous image.  Same for applications: you can push the previous version back to users.  You’re kind of hosed on the network update, though, if you don’t have an easy way to undo that change.

You saw the part where VDI touches everything, so make sure VDI is a priority during other enterprise changes so that something that appears to be simple doesn’t have an unintended impact to your VDI deployment.

Get a test system

You should definitely get a test system to perform updates and test patches on.  If you choose the right technology, you can get a scaled down version of what you deployed to do all your tests on.  The goal is to reduce outages and trouble tickets without increasing the bill.


VDI can make your end users and organization more agile, able to meet new demands and keep up with our ever-changing world.  It’s a lot harder to implement these new capabilities and features without upper management buying into the project.  Constant communication keeps both users and administrators happy.  Taking the time to evaluate and choose the right technology will not only make the difference in success, it will also have an impact on training and administration.  Choosing an integrator with hands-on experience brings subject matter expertise that is vital to speeding up a deployment and making it a success.  Change management isn’t sexy, but it’s what keeps a great technology running long-term with little to no service interruption.


DH Technologies – #12 on Washington Business Journal's Small Technology Companies List

Leesburg VA, August 15, 2016 –

DH Technologies announced that it has been named to Washington Business Journal’s 2016 Top Small Technology Companies list. This list ranks technology companies with fewer than 150 employees that are based out of the Washington D.C. metropolitan area. Out of the 30 companies on the list, DH Technologies secured the #12 spot.

“Maintaining partnerships with cutting-edge technologies has been a major key to our growth as a small company,” said Devin Henderson, CEO of DH Technologies. “We’ve been fortunate enough to keep strategic partners that help us grow as much as we help them grow.”

Washington Business Journal’s Top Small Technology Companies list can be found on their website here.

DH Technologies is #1 on CRN’s Fast Growth 150 List

Leesburg VA, August 8, 2016 –

DH Technologies announced that it has been named to The Channel Company’s 2016 CRN® Fast Growth 150 list. The list is CRN’s annual ranking of North America-based technology integrators, solution providers and IT consultants with gross sales of at least $1 million that have experienced significant economic growth over the past two years. The 2016 list is based on gains in gross revenue between 2013 and 2015, and the companies recognized represent a total combined revenue of more than $25,637,241,944.

“The companies on our 2016 Fast Growth 150 list are growing at an incredible rate, establishing themselves as clear leaders in today’s IT channel,” said Robert Faletra, CEO of The Channel Company. “Their rapid expansion in a climate of economic uncertainty and unprecedented technological advancement is especially impressive. We congratulate each of the Fast Growth 150 honorees and look forward to their continued success.”

“We aim to be more effective by not only working harder but also by working smarter. We are also focused on emerging technologies that enable us to be on the leading edge of our business.”

– Devin Henderson, CEO of DH Technologies

The Fast Growth 150 list is highlighted in the August issue of CRN and can be viewed online.

DH Technologies Novated NASA SEWP V Contract

Leesburg, VA – June 27, 2016 – DH Technologies is pleased to announce that SEWP V Contract NNG15SC70B has been novated from FASTech, Inc. to DH Technologies effective June 23, 2016. The awarded SEWP V is a five-year Indefinite Delivery Indefinite Quantity (IDIQ) Government-Wide Acquisition Contract (GWAC) with a five-year option period, which gives the contract an effective ordering period of May 1, 2015 through April 30, 2025. The SEWP V contract has a maximum value of $20 billion per contract. There are 145 prime SEWP V contract holders, and DH Technologies is proud to be one of the 119 small businesses on that list. Within SEWP V, DH Technologies holds an award under Group C.


“I am extremely honored to be part of the government’s most popular Government Wide Acquisition Contract.  I think not only is it good for us as a company but it brings better options to our customers for acquisition capabilities. To my knowledge, this is the first novation of a SEWP V contract, and we have worked hard to make this novation smooth and successful.  I really owe our finance team and contracts manager for making this novation happen in such a short time period.”

Devin Henderson, CEO



Prime Contractor: DH Technologies

Contract No. NNG15SC70B

Contract Type: Indefinite Delivery Indefinite Quantity (IDIQ)

Performance Period: May 1, 2015 through April 30, 2025


Press Contacts

Brent Seth

DH Technologies


DH Technologies Named to CRN’s 2016 Solution Provider 500 List

Leesburg, VA, June 6, 2016 –

We are proud to announce that CRN®, a brand of The Channel Company, has named DH Technologies to its 2016 Solution Provider 500 list. The SP500 list is CRN’s annual ranking of the largest technology integrators, solution providers and IT consultants in North America by revenue.

The SP500 is CRN’s predominant channel partner award list, serving as the industry standard for recognition of the most successful solution provider companies in the channel since 1995.

“The 2016 Solution Provider 500 represents a total, combined revenue of over $334 billion—a testament to their success in keeping pace with the rapidly changing demands of today’s IT market,” said Robert Faletra, CEO, The Channel Company. “This prestigious list recognizes those companies with the highest revenue and serves as a valuable industry resource for vendors seeking out top solution providers to partner with. We congratulate each of the Solution Provider 500 companies and look forward to their continued success.”

We are proud to have been part of this prestigious list and hope to be even higher on it next year.

A sampling from the 2016 Solution Provider 500 list will be featured in the June issue of CRN Magazine and online.

Back to the Future

Today is October 21st, 2015. The day that Doc brought Marty McFly and Jennifer Parker out of 1985 to save their future children. In the span of 30 years from 1985 to 2015, a lot has changed in the movie: hoverboards became a reality, everyone owns a flying car, and fashion dictates sharp, metallic apparel (as is true with all movies based in the future). In real life, certainly, we would have expected at least one of these to come to fruition.

Lexus claims that they’ve created a hoverboard; however, after that hoax in 2014, I’m not sure I can open my heart up again to the possibility of a real hoverboard. Flying cars, unfortunately, aren’t as accessible as they are in the film, and I don’t know if mini-Cessnas, however cool they look, would qualify as a practical commuter vehicle. And as for metallic clothing of the future, it’s reserved for celebrities, and I don’t think many will hop on board with the style. But in the world of technology, and specifically virtualization, leaps and bounds have been made over the last 30 years that have changed the way we see our servers, our storage, and our SAN (or lack thereof).

In fact, 30 years ago, the setup for virtualization involved closing your eyes and pretending you were somewhere else – a setup that certainly still works in practice, but is becoming less and less practical thanks to the incredible advances that companies like Dell, VMware, Nutanix, and Arista are making. The data center isn’t what it used to be – even 15 years ago, a giant warehouse full of servers made out of floppy disks and spare car parts would have been considered a technological blessing. New technologies that seamlessly integrate with our businesses are coming out all the time.

What are our predictions for the future? Over the next ten years, we believe that complex infrastructure will be a thing of the past. There will be no need for multiple hops between machines, which will reduce bottlenecks within the data center. Enterprise-class solutions will be readily available to anyone who needs a virtualized application that saves rack space and scales easily to fit any project. Invisible infrastructure will create an environment where the end user doesn’t need to focus on any behind-the-scenes work.

You won’t need a nuclear reactor to generate the proverbial 1.21 gigawatts of energy to jump into the future, because the future of virtualization technology will come quicker than you think – no flux capacitor required.

DH Tech at Nutanix .NEXT

This past week Nutanix held its first annual .NEXT user conference. Over 1,000 customers, partners, and employees were in attendance. The biggest announcement from the hyper-convergence company was Nutanix Acropolis – their KVM-based hypervisor – and the Nutanix Xtreme Computing Platform (XCP), which is powered by Acropolis and the Nutanix Prism UI. Many see this as a slap in the face to VMware (who noticeably was not in attendance); others see it as the next step in the software-defined evolution of enterprise technology. It’s hard to determine where announcements like this will take us as an industry.

Nutanix Federal Partner of the Year
Nutanix Federal Partner of the Year – DH Technologies

I used to call myself a server guy, a VMware guy, or more recently a virtualization guy. As technologies converge (see what I did there), I don’t think I can use those labels anymore. I like the term Technologist. It’s fitting for the “Swiss army knives” we’ve become. The traditional silos in IT are breaking down, and as our customers are required to wear multiple hats, so are we. During the conference, Nutanix CEO Dheeraj Pandey said it best: “While our competitors focus on us, we are focusing on our customers.”

This year we were awarded the Nutanix Federal Partner of the Year award for the second year in a row. Customer focus remains number one here at DH Tech.

Always-on Virtual Desktop Solution for U.S. Department of Defense

DH Technologies Designs Always-on Virtual Desktop Solution for U.S. Department of Defense

In early 2009, DH Technologies was solicited to design and implement a virtual infrastructure that would address the following urgent and immediate challenges for an agency within the U.S. Department of Defense (DOD):

  • Replace emergency operations center desktops that were globally dispersed and lacking management, centralized authentication, and configuration management.
  • Enable efficient, centralized provisioning of an enterprise suite of applications across all associated user communities.

Read More…