When I started application packaging twenty years ago, Microsoft had only just released its installer, MSIEXEC.EXE. The primary goal of Microsoft Installer was to create a generic way to install and uninstall applications on Windows while allowing you to manage and version-control those applications in a structured way.

Back then, Microsoft needed this for its new Office 2000 setup routine on Windows NT and Windows 95, to ensure that every installation of this complicated set of applications behaved and functioned consistently after installation.

In parallel to this new method of installing applications, many companies used Novell ZENworks and other more manual approaches to distribute files and registry settings, along with custom-built batch files to manipulate user configurations and settings.

Most companies had complicated environments to manage, and the hope was that Microsoft Installer technology would help address this by standardizing the way enterprises and software vendors packaged applications into the .MSI file format.

DLL Hell And Other Application Performance Quality Control Issues

At that time, most packaging teams had a Quality Assurance/Checking team to validate that .MSI files met the required standards and could install and uninstall cleanly. I recall creating huge test documents based on how the application being packaged would interact with the desktop operating system and its hardware.

Typically, only a few performance metrics were tracked back then. We tracked memory usage because memory was expensive, and understanding it was key to purchasing additional memory only for the business units that needed memory-heavy apps. We also tracked CPU usage for applications that required graphics cards. In addition, we monitored software compatibility.

If you worked in that field twenty years ago, you will remember “DLL Hell”: tracking DLLs in case older versions were included in an .MSI installer routine, because they could adversely affect, and potentially cause performance issues in, other applications that shared those files.

Application Performance Testing Decreases In Importance — With Dire Consequences

With the introduction of application virtualization, layering, and containerization over the past 10 years, many application packaging teams adopted these newer technologies under increasing pressure to reduce costs and improve efficiency in packaging and testing.

End users demanded that applications be made available faster and in a more repeatable fashion. While hardware was getting faster and memory and hard disk space were getting cheaper, applications remained the heavy lifting of desktop migrations and the costly part of the technology budget.

With hardware budgets no longer a constraint, and with application packaging and testing remaining an expensive and lengthy process, checking application packaging quality slowly decreased in importance.

In addition, with the introduction and wide adoption of virtualization technologies such as SoftGrid (now Microsoft App-V), other virtualization and layering technologies such as VMware ThinApp, VMware App Volumes, and Citrix AppDisk, and more recent containerization technologies like Numecent Cloudpaging, CloudHouse, and Droplet Computing, it became increasingly easy to create and distribute packages in their own isolated environments.

Thankfully, DLL Hell became a thing of the past!

But when things get too easy, bad decisions often follow. It was all too easy to drop the quality-checking aspects of virtualized application packaging and testing. Consequently, many organizations scaled back the performance aspect of their application quality control checks, even for native MSI installer routines!

Why Application Performance Is The #1 Issue App Testers Care About Today

While MSI technology is still going strong today, it is set to be replaced within the next five years by MSIX, Microsoft’s new attempt to create a modern installer for all devices and platforms. MSIX lets us build application packages using both established and newer technologies, e.g., .EXE, MSI, App-V, UWP, and AppX.

Of course, organizations moving to MSIX need to plan on giving it time to mature and to be adopted by software vendors as the preferred method of delivery. It took App-V ten years to mature, but MSIX looks likely to take a more fast-tracked route to adoption.

However, these major developments create a new challenge for application packaging and testing teams as they embrace the “Modern Desktop”. This problem can be broken down into two main areas:

1) Evergreen IT

Just as hardware life cycles are rapidly shortening, Microsoft and other major software vendors are releasing major updates faster than ever before. While there are countless benefits to this, there is also a drawback: frequent releases of Windows 10, Office 365, and SCCM will cause issues with application performance. Whether the changes are to the operating system’s display components, its networking components, or the Office installation files, software changes will have performance impacts:

    1. Most organizations will want their end users to complete a satisfaction survey after a successful migration, or to raise a ticket, so that the migration and the causes of any problems can be measured.
    2. To measure a machine’s memory, CPU, and graphics performance for each packaged application, it is of the utmost importance that we store performance data centrally in a database (Access CAPTURE does this); a minimal sketch of this idea follows this list.
    3. Tools such as Splunk, Lakeside SysTrack, and Aternity can provide .EXE details on the desktop or virtual machine. Once we have the data, we can cross-check the performance results of the packaged and tested application against an individual machine’s performance, as there could be underlying factors causing a machine to behave differently from the signed-off packaged app.
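To make point 2 above concrete, here is a minimal sketch of capturing per-application performance data into a central store. It is not Access CAPTURE or any vendor tool: it simply samples CPU and resident memory for a hypothetical packaged executable (CONTOSOAPP.EXE) using Python’s psutil and writes the readings to a SQLite table standing in for a central database.

```python
# Minimal sketch: sample per-process CPU and memory for a packaged app
# and store the readings centrally. psutil and SQLite are stand-ins for
# a commercial capture tool and its back-end database.
import socket
import sqlite3
import time
from datetime import datetime, timezone

import psutil

DB_PATH = "app_perf.db"        # hypothetical central store
TARGET_EXE = "CONTOSOAPP.EXE"  # hypothetical packaged application

def init_db(path: str) -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS perf_samples (
               captured_at TEXT, machine TEXT, exe TEXT,
               cpu_percent REAL, rss_mb REAL)"""
    )
    return conn

def sample(conn: sqlite3.Connection, exe: str, machine: str) -> None:
    # Record one reading for every running process matching the target exe.
    for proc in psutil.process_iter(["name", "cpu_percent", "memory_info"]):
        if (proc.info["name"] or "").upper() == exe:
            conn.execute(
                "INSERT INTO perf_samples VALUES (?, ?, ?, ?, ?)",
                (
                    datetime.now(timezone.utc).isoformat(),
                    machine,
                    exe,
                    proc.info["cpu_percent"],
                    proc.info["memory_info"].rss / (1024 * 1024),
                ),
            )
    conn.commit()

if __name__ == "__main__":
    conn = init_db(DB_PATH)
    while True:  # sample once a minute
        sample(conn, TARGET_EXE, socket.gethostname())
        time.sleep(60)
```

In practice the readings would flow into whichever central database or monitoring platform the organization already uses; the point is simply that each packaged application gets comparable, timestamped metrics per machine.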

2) Hardware Choices

Additionally, we will have to make our hardware choices accordingly:

    1. For desktops and laptops: Applications will perform differently on different hardware models. This needs to be taken into consideration when making applications available to parts of your organization that may have lower-specification machines.
    2. For virtual machines: Applications clearly perform differently in a virtual environment, mainly because of graphics display rendering with technologies such as Citrix ICA, Microsoft RDP, and VMware Tools. This needs to be taken into account with regard to app performance.
    3. For stateless VDI: Memory leaks generally surface after a few days of application uptime. A product like Access CAPTURE can provide these KPIs, giving you peace of mind as you move not only your operating system forward but also your users onto a stateless VDI experience.
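Following on from the capture sketch above, the snippet below illustrates one way the memory-leak scenario in point 3 might be flagged: it reads the stored samples for an application and reports whether resident memory has grown significantly over the sampled period. The schema, threshold, and machine name are assumptions carried over for illustration, not features of any particular product.

```python
# Minimal sketch: flag a possible memory leak by checking whether an
# application's resident memory trends upward across the samples stored
# by the earlier capture sketch (assumed perf_samples schema).
import sqlite3

DB_PATH = "app_perf.db"
LEAK_GROWTH_MB = 200.0  # hypothetical threshold: flag >200 MB net growth

def memory_trend(db_path: str, exe: str, machine: str) -> float:
    """Return net RSS growth (MB) between the earliest and latest quarter of samples."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT rss_mb FROM perf_samples "
        "WHERE exe = ? AND machine = ? ORDER BY captured_at",
        (exe, machine),
    ).fetchall()
    conn.close()
    if len(rows) < 8:
        return 0.0  # not enough history to judge
    values = [r[0] for r in rows]
    quarter = len(values) // 4
    early = sum(values[:quarter]) / quarter
    late = sum(values[-quarter:]) / quarter
    return late - early

if __name__ == "__main__":
    growth = memory_trend(DB_PATH, "CONTOSOAPP.EXE", "VDI-POOL-01")
    if growth > LEAK_GROWTH_MB:
        print(f"Possible leak: RSS grew by {growth:.0f} MB over the sampled period")
    else:
        print(f"No leak flagged (net growth {growth:.0f} MB)")
```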

Conclusion

With hardware and software delivery changing to accommodate end users’ demands, application performance is a key and vital part of users being happy with new operating systems or new hardware devices.

How an application performs is the first thing a user will raise a ticket about and the number one thing they will be unhappy about if they have a degraded experience. The solution is to measure the performance of each application, maintain this data in a centralized database, and always test against hardware, platform, and operating system differences.
