Test case sections

../_images/ride_03d_test_case.png

An Alyvix test case should be structured into at least three sections: the main section <testcase_name>, and the setup and teardown subsections.

The main section <testcase_name> is created by selecting New Test Case from the suite options.

The setup and teardown subsections are created by selecting New User Keyword from the suite options. They also have to be registered as the actual test Setup and Teardown sections: open the suite settings and assign them as Suite Setup and Suite Teardown.
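For orientation, the resulting suite corresponds to a Robot Framework 3.0 file along the following lines. This is a minimal sketch: the keyword bodies are placeholders to be filled in as described in the sections below.

    *** Settings ***
    Suite Setup       setup
    Suite Teardown    teardown

    *** Test Cases ***
    <testcase_name>
        No Operation    # main body: Alyvix keywords and subsections

    *** Keywords ***
    setup
        No Operation    # performance measure declarations (see Setup section)

    teardown
        No Operation    # termination procedures (see Teardown section)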

Main body

../_images/ride_03e_test_case.png

The main body <testcase_name> is executed after the setup section. It is a sequence of Alyvix system keywords, Alyvix visual keywords, Robot Framework 3.0 syntax, Python 2.7 methods, and subsections (created by selecting New User Keyword from the suite options).

Note

It is recommended to subdivide the test case into logically isolated subsections, as in the sketch below.
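For example, the main body might call a few logically isolated subsections, each created as a user keyword; the subsection names below are hypothetical:

    *** Test Cases ***
    <testcase_name>
        open_application
        perform_transaction
        close_application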

Setup section

../_images/ride_03f_test_case.png

The setup section declares all the performance measures that are expected as the output at the end of the test.

Typically, a declared performance measure corresponds to the name of an Alyvix visual keyword <keyword_name>.

A performance measure is properly declared by using the Add Perfdata keyword and by setting the Performance and Break options in the detection_settings of the corresponding keyword.

Each combination of the declaration in the test with the Break and Performance options in the keyword behaves as follows:

* Add Perfdata | <keyword_name>, with Break checked and Performance checked: blocking transaction, with latency; the failure is not solved, the test breaks and reports CRITICAL.
* Add Perfdata | <keyword_name> | <timeout_seconds>, with Break unchecked and Performance checked: blocking transaction, with latency; the failure is solved and the test continues in any case, reporting CRITICAL. <timeout_seconds> should be bigger than <critical_threshold>.
* Add Perfdata | <keyword_name> | 0, with Break unchecked and Performance checked: possible transaction, with latency; the test tries to solve the failure and continues in any case, reporting OK. This declaration has to be inserted in the row right before the declared keyword.
* No declaration, with Break checked and Performance unchecked: blocking transaction, with no latency; the failure is not solved and the test breaks. Such a keyword must not be inserted as the last keyword of the test.
* No declaration, with Break unchecked and Performance unchecked: possible transaction, with no latency; the test tries to solve the failure and continues in any case.
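As a sketch of the first cases above, a setup subsection could declare its measures as follows. The visual keyword names are hypothetical, and the 40-second timeout assumes a <critical_threshold> smaller than 40:

    *** Keywords ***
    setup
        # Blocking transaction, with latency: break on failure, report CRITICAL
        Add Perfdata    login_reached
        # Blocking transaction solved at runtime: continue, report CRITICAL;
        # 40 must be bigger than the critical threshold of the keyword
        Add Perfdata    dashboard_loaded    40
        # A possible transaction (Add Perfdata    optional_popup    0) would
        # instead be placed in the main body, right before the keyword itself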

Warning

Be sure not to declare performance measures in any of the following ways; they make no sense (e.g. expecting a performance measure that will never be produced):

* Add Perfdata | <keyword_name>, with Break unchecked and Performance unchecked in the keyword: the measure is declared, but never produced.
* Add Perfdata | <keyword_name>, with Break checked and Performance unchecked in the keyword: the measure is declared, but never produced.
* No declaration, with Break unchecked and Performance checked in the keyword: the keyword is measured, but the measure is never declared.
* No declaration, with Break checked and Performance checked in the keyword: the keyword is measured, but the measure is never declared.

Teardown section

../_images/ride_03g_test_case.png

The teardown section codes the termination procedures for every possible broken state of the test case: the test must always (re)start from the same initial state (e.g. a clean desktop on the probe). To build a proper termination procedure you can use visual keywords, close windows, send shortcuts, and kill processes.

Finally, you have to print the Nagios performance output.

Optionally, you can also publish the performance points, store the test features and the performance measures, and store the scraped strings.
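Putting it together, a teardown subsection could look like the following sketch. The cleanup keywords close_application and kill_leftover_processes are hypothetical placeholders, and Print Perfdata is assumed here to be the Alyvix system keyword that prints the Nagios output; check the system keyword reference of your Alyvix version:

    *** Keywords ***
    teardown
        # Recover from any broken state so the next run starts from a clean desktop
        Run Keyword And Ignore Error    close_application
        Run Keyword And Ignore Error    kill_leftover_processes
        # Print the Nagios performance output (keyword name assumed)
        Print Perfdata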