Dynamic Testing

Dynamic Analysis enables far more in-depth data gathering and analysis than methods based on Static Analysis. Not only can an installation be scanned in much greater detail, covering not just what might happen during the installation process but monitoring what actually does happen; Dynamic Analysis has also proven to be a very flexible methodology with a virtually endless number of use cases.

The origin of our Dynamic Analysis approach is Quality Assurance in the Application Rollout Management process. It is often claimed that Static Analysis is sufficient to scan the installation of an application prepared for large-scale rollout. However, there are some flaws in this claim:

For one thing, the maximum detail in which Static Analysis can scan an installation file is rather limited. This is by design. It is possible to scan an MSI package, for example, to find out where a certain file is going to be installed or where a registry key is written. However, many installations perform different tasks depending on which kind of system they run on (e.g. Windows 10 or Windows 7).

While this can still be worked around to some extent, many installations also include executable files that perform certain tasks. These cannot be read like an MSI file (which is basically a database); they contain binary executable code. The only way to find out what these executable parts do is to execute them, which makes Dynamic Analysis necessary.

Dynamic Analysis, however, is not limited to what the installation file does. Many applications perform post-installation tasks when first started by a user. Static Analysis cannot possibly find out what the application does after the installation is completed - Dynamic Analysis can!

Also, Dynamic Analysis is not limited to application installations. Imagine, for example, a server cluster consisting of hundreds of machines. Each of these machines must be identical to all the others to make sure they work together smoothly. But how can you make sure all machines are the same? Dynamic Analysis can help you with that, too.
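The cluster use case boils down to comparing the state snapshot of each machine against a reference. A minimal sketch of such a comparison is shown below; the snapshot structure and function name are purely illustrative, not the actual QtestBASE data model.

```python
def diff_snapshots(reference, candidate):
    """Return every key whose value differs between two snapshot dicts."""
    keys = set(reference) | set(candidate)
    return {
        key: (reference.get(key), candidate.get(key))
        for key in keys
        if reference.get(key) != candidate.get(key)
    }

# Hypothetical snapshots of two cluster nodes.
node_a = {"os_build": "19045", "service_pack": "SP1", "java": "17.0.2"}
node_b = {"os_build": "19045", "service_pack": "SP2", "java": "17.0.2"}

print(diff_snapshots(node_a, node_b))
# {'service_pack': ('SP1', 'SP2')}
```

Running this against every node surfaces exactly the machines that drift from the reference, and in which attribute.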


Rule-Based Analysis

The Rule Engine is the heart of fingerprint analysis in QtestBASE. It consists of a number of extensively configurable filters (scenario, HCT) and a fully scalable rule system. From simple data queries, to the interconnection of multiple data scopes within one Fingerprint, to complex relations between multiple Fingerprints, the Rule Engine supports automatic analysis of the large amounts of data made accessible by our Fingerprint Technology.

QtestBASE can be configured in a multitude of ways. Before snapshots are created, the more than 20 different scan modules of our agent can be freely enabled and disabled. Using includes and excludes, the results can be adjusted in great detail. This configuration is what we call a "scenario".
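Conceptually, a scenario pairs a module on/off map with include and exclude patterns. The sketch below illustrates the idea with invented module names and glob-style path filters; it is not the actual scenario format.

```python
import fnmatch

# Hypothetical scenario: which scan modules run, and which paths are in scope.
scenario = {
    "modules": {"filesystem": True, "registry": True, "services": False},
    "includes": ["C:/Program Files/*"],
    "excludes": ["C:/Program Files/Common Files/*"],
}

def path_in_scope(path, scenario):
    """A path is in scope if it matches an include and no exclude."""
    included = any(fnmatch.fnmatch(path, pat) for pat in scenario["includes"])
    excluded = any(fnmatch.fnmatch(path, pat) for pat in scenario["excludes"])
    return included and not excluded

print(path_in_scope("C:/Program Files/App/app.exe", scenario))              # True
print(path_in_scope("C:/Program Files/Common Files/shared.dll", scenario))  # False
```

Excludes winning over includes keeps the filter predictable: narrowing a scan never accidentally re-admits data.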

After a Fingerprint is created, it is imported into the database for further processing. During the automatic evaluation by the Rule Engine, the Fingerprint first passes through the HCT (High Compression Technology), where the large number of data points is reduced non-destructively. This means that filtered data will not be used in the following analysis steps, but will remain in the database, accessible if necessary. The filtered data can thus be included in the analysis again at any later point.
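The key property of this compression step is that it flags data points rather than deleting them. A minimal sketch of that idea, with invented noise patterns and field names:

```python
# Hypothetical noise locations that a compression step might filter out.
NOISE_PREFIXES = ("C:/Windows/Temp/", "C:/Users/admin/AppData/Local/Temp/")

def compress(data_points):
    """Flag known noise non-destructively instead of removing it."""
    for point in data_points:
        point["filtered"] = point["path"].startswith(NOISE_PREFIXES)
    return data_points

points = compress([
    {"path": "C:/Program Files/App/app.exe"},
    {"path": "C:/Windows/Temp/setup.log"},
])

# Later analysis steps work on the unfiltered subset only...
active = [p for p in points if not p["filtered"]]
# ...but every point is still there if it needs to be re-included.
print(len(points), len(active))  # 2 1
```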

In the subsequent processing, the data is analysed and rated according to the configured rules. In contrast to the previous processing steps, which mainly focus on data reduction and preparation, the rules are individual rating criteria. They do not merely filter the data, but rate it using complex analysis algorithms. The result is presented in the form of a traffic light.
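The traffic-light principle can be sketched as follows: each rule returns its own rating, and the worst individual rating determines the overall result. The rules and thresholds below are invented for illustration; the real rule algorithms are far more complex.

```python
def rate_fingerprint(fingerprint, rules):
    """Apply every rule; the worst rating wins (red > yellow > green)."""
    severity = {"green": 0, "yellow": 1, "red": 2}
    worst = "green"
    for rule in rules:
        rating = rule(fingerprint)
        if severity[rating] > severity[worst]:
            worst = rating
    return worst

# Two hypothetical rating criteria.
def writes_to_system32(fp):
    paths = fp["files"]
    return "red" if any(p.startswith("C:/Windows/System32/") for p in paths) else "green"

def installs_service(fp):
    return "yellow" if fp["services"] else "green"

fp = {"files": ["C:/Program Files/App/app.exe"], "services": ["AppUpdater"]}
print(rate_fingerprint(fp, [writes_to_system32, installs_service]))  # yellow
```

One glance at the aggregated light tells the analyst whether a Fingerprint needs attention at all; the individual rule results then explain why.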


Reporting & Statistics

Report              Detail            Target Group
Project Statistics  Overview          Project Management
Rating Summary      Overview          Project Management/Customer
Conflict Matrix     Overview          Internal/Customer
Rating Report       Single Result     Customer
Rating Detail       Single Result     Internal
Rule Detail         Detailed Result   Internal