Launchers / Automated tests

XQual Xci (Xci.jar)

The Xci launcher runs XContinuousIntegration, automating the creation of test sessions for any existing test campaign in XStudio.

Although this launcher does not execute a test itself, it allows launching several successive sessions of different campaigns.
Note: By using the JUnit result format, XStudio will be able to parse and analyse the session results and decide on success or failure based on defined ratios. This way, you can manage a release pipeline from installation, to smoke tests, to integration tests, to several functional campaigns, and so on.
This enables smart DevOps, or more exactly DevTestOps, within XStudio.
At the end of each session, the launcher checks the results against a set of performance levels you define for each of your sessions.
As of the current release, this includes (a sketch of this check is shown after the list):
  • Need to reach at least a set % of successes
  • Need to stay under a maximum % of failures
  • Need to stay under a maximum % of skipped test cases (whatever the reason they were skipped)
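
As a minimal illustration (plain Python, not part of the product), this check amounts to comparing three ratios against the configured thresholds; the function and variable names below are assumptions used for illustration only.

# Illustrative sketch only: evaluate session counts against the three
# promotion thresholds described above (not XQual code).

def session_promoted(n_success, n_failed, n_skipped,
                     min_pct_success=100.0, max_pct_fails=0.0,
                     max_pct_not_executed=0.0):
    """Return True if the session satisfies all three promotion thresholds."""
    total = n_success + n_failed + n_skipped
    if total == 0:
        return False  # nothing was executed, so the session cannot be promoted

    pct_success = 100.0 * n_success / total
    pct_failed = 100.0 * n_failed / total
    pct_skipped = 100.0 * n_skipped / total

    return (pct_success >= min_pct_success
            and pct_failed <= max_pct_fails
            and pct_skipped <= max_pct_not_executed)

# Example: 19 passed, 1 failed, none skipped, with relaxed thresholds
print(session_promoted(19, 1, 0, min_pct_success=90, max_pct_fails=10))  # True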


Warning: XQual can provide assistance to its commercial customers to enable this smart feature. Please send us a message.

Configuration

The Xci.xml file is just a template and must NOT be edited. It is used by the system to dynamically build the form that the user fills in from the GUI when creating a custom execution configuration.

Parameter / Description

General

Windows command (leave blank if not executing under Windows)
  Under MS Windows, you need to start a cmd thread to run your executables.
  By default, we assume execution under MS Windows in a 64-bit context; otherwise, adapt it to your environment.
  Default value is: C:\WINDOWS\SysWoW64\cmd.exe

Xci

Xci install path
  This must indicate the path where XContinuousIntegration has been installed.
  Default value is: C:/<user>/Xqual/Xstudio/

xci Engine
  This must indicate the exact name of the XContinuousIntegration engine.
  Default value is: xContinuousIntegration_console.exe

synchronous
  This must indicate whether you wish to wait for the end of the session. This is needed if you want to get a report.
  Valid values are 'true' or 'false'.
  Default value is: true

generateReport
  This must indicate whether you wish to get a report at the end of the session. You also need to set 'synchronous' to 'true'.
  Valid values are 'true' or 'false'.
  Default value is: true

reportTransform
  This must indicate the type of transform you expect for the report.
  Please consult the XContinuousIntegration documentation for more details.
  Default value is: XML_JUnit

reportStyle
  This must indicate the type of rendering you expect for the report.
  Please consult the XContinuousIntegration documentation for more details.
  Default value is: XQual

reportOutputPath
  As this launcher does not execute an actual test itself, this must indicate where the session results and log files will be located.
  If the report is a JUnit file, it can be analyzed to decide whether the session is successful, based on the thresholds set in the 'promotion' tab or in the attributes (explained below); see the parsing sketch at the end of this section.
  Default value is: c:\xci

Promotion

Min % of success
  This must indicate the minimum % of successful test cases you expect out of any session for this configuration.
  Default value is: 100

Max % of fails
  This must indicate the maximum % of failed test cases you expect out of any session for this configuration.
  Default value is: 0

Max % of Not Executed
  This must indicate the maximum % of skipped test cases you expect out of any session for this configuration.
  Default value is: 0

These values can be changed while creating the campaign session from XStudio.
Note about file path parameters:
Any parameter referring to a file or folder path (for instance, Test root path) can be provided using either the \ separator (if the tests are going to be executed on a Windows agent) or the / separator (if the tests are going to be executed on a Linux or macOS agent).

On Windows, if you provide a path containing an OS-localizable folder such as C:\Program Files, always prefer the English version (i.e. NOT C:\Programmes if you're using a French-localized Windows) or the corresponding native environment variable (i.e. %PROGRAMFILES%).
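
As an illustration of the report analysis mentioned for reportOutputPath, the sketch below (plain Python, not part of XQual) shows one way to extract the success/failure/skipped counts from a standard JUnit XML report; the exact file name and layout produced by XContinuousIntegration may differ, so treat this only as an assumption-based example.

# Illustrative sketch only: read test/failure/skipped counts from a
# JUnit-style XML report located in reportOutputPath (not XQual code).
import xml.etree.ElementTree as ET

def junit_counts(report_path):
    """Return (success, failed, skipped) counts from a JUnit XML report."""
    root = ET.parse(report_path).getroot()
    # The root may be <testsuites> wrapping one or more <testsuite> elements,
    # or a single <testsuite> directly.
    suites = [root] if root.tag == "testsuite" else root.findall("testsuite")

    tests = failures = errors = skipped = 0
    for suite in suites:
        tests += int(suite.get("tests", 0))
        failures += int(suite.get("failures", 0))
        errors += int(suite.get("errors", 0))
        skipped += int(suite.get("skipped", 0))

    failed = failures + errors
    return tests - failed - skipped, failed, skipped

# Hypothetical report name; the actual file depends on --reportName.
# print(junit_counts(r"C:\xci\smoke_tests.xml"))

These counts are then compared against the promotion thresholds as sketched earlier on this page.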


Process

Some attributes are provided with the test representing the session.
The attributes below are mandatory:
  • com.xqual.xci.sutId provides the SUT id for which XCI will generate a new session (xciSutId)
  • com.xqual.xci.campaingId provides the campaign id for which XCI will generate a new session (xciCampaingId)
  • com.xqual.xci.agentIds provides the agent ids with which XCI will generate a new session (xciAgentIds)
  • com.xqual.xci.configurationsIds provides the configuration ids for which XCI will generate a new session (xciConfigurationIds)
Please consult the XContinuousIntegration documentation for more details.

The attributes below are optional and, if present, override the information provided in the configuration 'promotion' tab (see the sketch after this list):
  • com.xqual.promotion.minPctSuccess overrides the "Min % of success" if provided
  • com.xqual.promotion.maxPctFailures overrides the "Max % of fails" if provided
  • com.xqual.promotion.maxPctNotExecuted overrides the "Max % of Not Executed" if provided
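
As a minimal illustration of this precedence (plain Python, not XQual code, using hypothetical dictionaries for the test attributes and the configuration values):

# Illustrative sketch only: an attribute, when present, takes precedence over
# the corresponding value from the configuration 'promotion' tab.
def effective_thresholds(attributes, configuration):
    return {
        "minPctSuccess": float(attributes.get("com.xqual.promotion.minPctSuccess",
                                              configuration["minPctSuccess"])),
        "maxPctFailures": float(attributes.get("com.xqual.promotion.maxPctFailures",
                                               configuration["maxPctFailures"])),
        "maxPctNotExecuted": float(attributes.get("com.xqual.promotion.maxPctNotExecuted",
                                                  configuration["maxPctNotExecuted"])),
    }

# Example: the attribute relaxes the failure threshold set in the configuration.
print(effective_thresholds({"com.xqual.promotion.maxPctFailures": "10"},
                           {"minPctSuccess": 100, "maxPctFailures": 0, "maxPctNotExecuted": 0}))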


The minimum command is generated as follows:
"<XciInstallPath>/<xci Engine>" --campaignId <xciCampaingId> --agents <xciAgentIds> --sutId <xciSutId> --configurations <xciConfigurationIds> <all other flags for synchronous and reports>
Please consult the XContinuousIntegration documentation for more details.
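
For reference, here is a minimal sketch (plain Python, not XQual code) of how such a command line can be assembled from the configuration parameters and session attributes; the function and parameter names are illustrative assumptions, while the flags are the ones shown in the generated-command example below.

# Illustrative sketch only: assemble the XContinuousIntegration command line
# from the configuration parameters and the session attributes (not XQual code).
def build_xci_command(install_path, engine, sut_id, campaign_id, agent_ids,
                      configuration_ids, report_transform, report_style,
                      report_output_path, report_name,
                      synchronous=True, generate_report=True):
    parts = [f'"{install_path}/{engine}"',
             f"--campaignId {campaign_id}",
             f"--agents {agent_ids}",
             f"--sutId {sut_id}",
             f"--configurations {configuration_ids}"]
    if synchronous:
        parts.append("--synchronous")
    if generate_report:
        parts += ["--generateReport",
                  f"--reportTransform {report_transform}",
                  f"--reportStyle {report_style}",
                  f"--reportOutputPath {report_output_path}",
                  f"--reportName {report_name}"]
    return " ".join(parts)

print(build_xci_command("C:/<user>/Xqual/Xstudio", "xContinuousIntegration_console.exe",
                        76, 25, "2:1:0", "24:27", "XML_JUnit", "XQual",
                        r"C:\xci", "smoke_tests"))

On Windows, the resulting string is then prefixed with the configured Windows command (cmd.exe /C), as in the example that follows.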

Following is an example of a generated command under MS Windows 10:
C:\WINDOWS\SysWoW64\cmd.exe /C "C:\<user>\xContinuousIntegration_console.exe" --campaignId 25 --agents 2:1:0 --sutId 76 --configurations 24:27 --synchronous --generateReport --reportTransform XML_JUnit --reportStyle XQual --reportOutputPath C:\xci --reportName smoke_tests
Explanation: a session is automatically created and executed for campaign 25. Results impact SUT 76. It is executed by XAgent 2 (once it is started).
We use configuration 27 for test category 24 for this session.
We wait for the session to end before generating a JUnit report.
The report is placed in the C:\xci folder. The Xci launcher will detect that this is a JUnit report and will analyze it against the promotion thresholds.
Once done, it will decide whether the session is successful or failed.

The trace file and the xml report will be attached to the test case result.

Permissions

WARNING: if you're running your tests on Windows, it may be required to run the tests as administrator.
Having an account with Administrators permissions may not even be enough in some cases (especially if you're using Windows 10) and you may need to completely disable UAC (User Account Control) on your computer.
To do so:
  • Press the Windows + R key combination
  • Type in regedit
  • Go to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System
  • In the right-side pane, look for EnableLUA and set its value to 0
  • Close the registry editor
  • Restart your computer

Debug

If your tests are not executed correctly or report only failures, it is very likely because your configuration is incorrect or because you used a wrong naming convention for your tests and test cases.

The best way to quickly find out what's wrong is to look at the traces generated by XStudio (or XAgent).
The traces always include a detailed description of what the launcher performs (command-line execution, script execution, API calls, etc.) to run a test case. So, if you experience problems, the first thing to do is to activate the traces and look at what's happening when you run your tests.

Then, try to manually execute the exact same commands in a cmd box.
This will normally fail in the same way.
At this point, you need to figure out what has to be changed in these commands in order to have them run properly.

When you have something working, compare these commands to what's described in the Process chapter above. This will tell you exactly what you need to change.

Most of the time, this is related to:
  • some incorrect values in some parameters of your configuration,
  • the name of your tests,
  • the name of your test cases,
  • the canonical path of your tests