TESTING FIWARE GES

FIWARE is market-ready software, able to respond to the demands of production environments in which the platform must scale in a reliable manner under real workload conditions. This implies that each FIWARE component (referred to as a FIWARE Generic Enabler, or GE) must provide a level of quality, reliability and performance appropriate for commercial-grade systems. For this reason, a dedicated task is defined under the umbrella of the FIWARE initiative to analyse and assess the quality level of each GE, providing a diverse set of methods, scripts, reports and labels for GEs, as well as an assessment dashboard.

KINDS OF TESTS

The quality of each FIWARE GE is evaluated from three different angles:

  • Verification of the GE specification (functional testing), developing the appropriate test cases to assess whether the implementation of the FIWARE GE matches what is defined in its specification.
  • Assessment of performance, stability and scalability in operational environments, for example under excessive workload (stress testing). Test scenarios are defined and executed so that the limits of the GE under test are identified and can be compared with defined reference levels.
  • Integrity of the GE documentation (documentation testing), inspecting both the code and the accompanying documentation (installation manuals, user guidelines, academy courses and similar). The goal of this assessment is to guarantee that FIWARE users can rely on high-quality documentation for the installation, configuration and operation of FIWARE technology.

FUNCTIONAL TESTING

These tests aim to verify the FIWARE GE functionalities described by the related API specification and to report any mismatch between the expected and the actual results. The main output produced is therefore the ratio between tests failed and tests executed (TF/TE) for each tested GE.
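
As a simple illustration, the following Python sketch runs two such checks against a GE API and reports the TF/TE ratio. It assumes a locally running Orion Context Broker exposing the NGSIv2 API on its default port 1026; the entity and the checks are examples, not the actual FIWARE test cases.

    import requests

    BASE_URL = "http://localhost:1026"  # assumption: a local Orion Context Broker instance

    def test_create_entity():
        """The NGSIv2 specification mandates 201 Created for entity creation."""
        entity = {"id": "Room1", "type": "Room",
                  "temperature": {"value": 23, "type": "Number"}}
        response = requests.post(f"{BASE_URL}/v2/entities", json=entity)
        assert response.status_code == 201

    def test_retrieve_entity():
        """The created entity must be retrievable with the same attribute value."""
        response = requests.get(f"{BASE_URL}/v2/entities/Room1")
        assert response.status_code == 200
        assert response.json()["temperature"]["value"] == 23

    if __name__ == "__main__":
        tests = [test_create_entity, test_retrieve_entity]
        failed = 0
        for test in tests:
            try:
                test()
            except AssertionError:
                failed += 1
        # TF/TE: the ratio between tests failed and tests executed
        print(f"TF/TE = {failed}/{len(tests)} = {failed / len(tests):.2f}")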

NON-FUNCTIONAL TESTING

Three kinds of non-functional tests are performed on each FIWARE GE:

  • Performance test. This is the typical stress test for the component. The most frequent execution scenarios are overloaded by launching many requests, increasing over time until the limits of the component are reached. The result is measured as the number of updates per second that the corresponding component function (API) is able to process without getting stuck.
  • Stability test. This kind of test shows the behaviour of the GE over time under a constant, typical workload. The objective is to detect whether the GE suffers memory leaks or CPU saturation during a long period of constant execution. If the component is well programmed, memory and processor usage remain under control and do not cause any problem.
  • Scalability test. This test estimates how scalable the component is by varying the number of machines on which the component runs. It is measured as the ratio between the growth rate of the response time and the growth rate of the number of users (see the sketch after this list). The closer the value is to 1, the better the scalability, as it indicates that the response time grows at the same pace as the load.
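
As an illustration of the scalability metric, the following Python sketch computes the ratio from two load measurements. The figures used are invented for the example and do not come from a real test run.

    def scalability_ratio(users_a, resp_time_a, users_b, resp_time_b):
        """Growth rate of the response time divided by the growth rate of the users."""
        time_growth = resp_time_b / resp_time_a
        user_growth = users_b / users_a
        return time_growth / user_growth

    # Example: the number of users grows from 100 to 200 while the mean response
    # time grows from 80 ms to 150 ms; a value close to 1 means the response time
    # grows at the same pace as the load.
    print(f"scalability ratio = {scalability_ratio(100, 80.0, 200, 150.0):.2f}")  # ~0.94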

DOCUMENTATION TESTING

The documentation of each GE is assessed at three levels:

  • The FIWARE Catalogue itself, where the user must be able to easily navigate and find useful information about each GE. This test verifies that such information is reachable and correctly referenced from this entry point.
  • The installation manuals, which provide the information needed for deploying each GE. The related assessment verifies:
    • First, the existence and correct functioning of a related Dockerfile, since the FIWARE Community agreed to adopt a Configuration as Code (CaC) methodology (a minimal check is sketched after this list).
    • If no Dockerfile exists, the steps described in these manuals are verified in order to ensure the best possible experience for each user.
  • The FIWARE Academy, which is the place where tutorials and courses on each GE are collected. The FIWARE QA team continuously evaluates the training courses available on the FIWARE Academy according to defined Quality Assurance (QA) guidelines. All the training material created and published in the FIWARE Academy is evaluated against these guidelines, so that efficient provision of training can be guaranteed. On the basis of the QA criteria, each published course receives a “Not Good/Sufficient/Good” label from the FIWARE team. If there is a problem with a course, the team contacts the community behind the GE to fix the issues.
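
A minimal sketch of the Dockerfile check mentioned above is shown below. The repository path and image tag are placeholders chosen for the example, not names fixed by the QA process.

    import pathlib
    import subprocess
    import sys

    def check_dockerfile(repo_path, image_tag="fiware-ge-qa-check"):
        """Verify that the GE repository ships a Dockerfile and that it builds."""
        dockerfile = pathlib.Path(repo_path) / "Dockerfile"
        if not dockerfile.is_file():
            print("No Dockerfile found; falling back to manual verification "
                  "of the installation manual.")
            return False
        # Try to build the image; a failing build counts as a documentation defect.
        result = subprocess.run(
            ["docker", "build", "-t", image_tag, repo_path],
            capture_output=True, text=True)
        if result.returncode != 0:
            print("Dockerfile exists but does not build:\n", result.stderr)
            return False
        print("Dockerfile builds correctly.")
        return True

    if __name__ == "__main__":
        check_dockerfile(sys.argv[1] if len(sys.argv) > 1 else ".")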

GE LABELING

Analysing the complete set of results of the testing performed on the various FIWARE Generic Enablers would be a hard and time-consuming task, so it is desirable to provide a simple but meaningful labeling schema which helps developers understand the overall quality of a particular GE. Following a proven model, a schema similar to the EU energy labeling of devices has been defined and applied to FIWARE GEs.

The goal is to establish a label per GE which summarizes its overall quality level. This overall label is calculated as the average of the labels obtained for each of the testing categories.
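
As an illustration of how such an average could be computed, the sketch below maps per-category labels to numeric scores and back. The letter scale (A best to E worst) and its numeric mapping are assumptions made for the example; the actual scale is the one defined by the FIWARE QA methodology.

    # Hypothetical letter scale and mapping, used only for this example.
    LABEL_TO_SCORE = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1}
    SCORE_TO_LABEL = {score: label for label, score in LABEL_TO_SCORE.items()}

    def overall_label(category_labels):
        """Average the per-category labels and round to the nearest letter grade."""
        scores = [LABEL_TO_SCORE[label] for label in category_labels.values()]
        mean_score = round(sum(scores) / len(scores))
        return SCORE_TO_LABEL[mean_score]

    print(overall_label({"functional": "A",
                         "non_functional": "C",
                         "documentation": "B"}))  # -> "B"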

A summary of the labels assigned to each GE for each criterion, as well as at the overall level, can be found here.

The QA label assigned to each GE is also visible in the FIWARE Catalogue.

A detailed description of the process and methodology used for the testing can be found here.