Writing Acceptance Tests
Introduction

The core architecture framework includes an automatic acceptance testing framework that makes it fairly straightforward to write or add new acceptance tests.

In this context, an acceptance test is a test that exercises the entire (real-time framework) system end to end. Basically, it starts up the server, logs a user in, and checks that the system is modified accordingly. Then the user is logged out and another check is made to ensure that the system is returned to its proper state.

Note that this means that the actual system will be configured (and restored) when running the acceptance tests.
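To make the flow concrete, here is a rough sketch of what the framework automates. This is illustrative only, not the framework's actual implementation, and the /user/&lt;token&gt;/login and /user/&lt;token&gt;/logout URL formats are an assumption based on local Flow Manager conventions:

<source lang="javascript">
// Illustrative sketch only: the acceptance testing framework drives this
// flow programmatically. The URL format is an assumption.
var http = require("http");

// Logging in should cause the system to be configured for the user...
http.get("http://localhost:8081/user/os_win7/login", function (loginRes) {
    console.log("login status: " + loginRes.statusCode);
    // ...at this point the test asserts that the settings were applied.

    // Logging out should restore the system to its original state.
    http.get("http://localhost:8081/user/os_win7/logout", function (logoutRes) {
        console.log("logout status: " + logoutRes.statusCode);
        // ...and the test asserts that the settings were restored.
    });
});
</source>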

Writing New Acceptance Tests

Below are instructions on how to add new integration tests. Since acceptance testing depends on the platform, your acceptance tests should be added to the relevant repository (e.g. Windows, Linux or Android). Note that we add a new acceptance test file for each SP3 application, to allow the tester to run the tests individually based on what applications they have available.

There is a video tutorial on how to write acceptance tests in Google Drive (https://drive.google.com/folderview?id=0ByfE6R0ipKCjUWN6WXF4bUFKNTA&usp=sharing).

Create a JIRA Ticket

First, create a ticket for your solution's acceptance tests in the GPII project on JIRA. You can later reference this ticket in the name of the Git branch that you create for the acceptance tests and in the pull request on GitHub.

Bring Master Up to Date

Ensure that your master branch is up to date: go to gpii/node_modules/universal and run git fetch upstream, where the 'upstream' remote can be added by running git remote add upstream git://github.com/GPII/universal.

Create a Branch for the Acceptance Tests

In your local version of the GPII code, create a branch using the command git checkout -b GPII-XXX, where GPII-XXX represents the JIRA ticket. (You run this command under /gpii/node_modules/universal/.)

You can then type git status to verify that there are no files to commit yet.

Creating the NP Sets and Device Reporter Payload

The first step in writing your acceptance test is ensuring that you have the NP set you want to write the test for, and that you have the appropriate device reporter payload. For acceptance testing, both of these live in special folders in the universal repository, namely:

  • testData/preferences/acceptanceTests: this is where you should put the preferences set you want to use for the acceptance test.
  • testData/deviceReporter/acceptanceTests: this is where you should put the device reporter file you want to use. Use a name describing the content of the file (e.g. "nvda.json" for a file containing the entry for NVDA; see the sketch below).
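For illustration, a minimal device reporter file such as nvda.json might look like the following sketch. This mirrors the appinfo structure used in the test fixtures later on this page; check an existing file in the repository for the exact schema expected on your platform:

<source lang="javascript">
{
    "OS": {
        "id": "win32"
    },
    "solutions": [
        {
            "id": "org.nvda-project"
        }
    ]
}
</source>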

Creating the Config File

The next step is to create a configuration file to use when launching the server for the acceptance tests. The purpose is to point the server to the correct device reporter file.

On the Windows platform, this should be located in the folder windows/tests/acceptanceTests/configs. The filename should describe the acceptance test in question, so for example for NVDA, the name could be something like: "nvda_config.json".

The content of the file should be:

<source lang="javascript">
{
    "typeName": "acceptanceTests.nvda",
    "options": {
        "components": {
            "server": {
                "options": {
                    "components": {
                        "deviceReporter": {
                            "options": {
                                "installedSolutionsUrl": "file://%root/../../../testData/deviceReporter/acceptanceTests/nvda.json"
                            }
                        }
                    }
                }
            }
        }
    },
    "includes": [
        "../../../../node_modules/universal/tests/acceptanceTests/localInstall.json"
    ]
}
</source>

The important part is the "installedSolutionsUrl" key-value pair. Its value should point to the device reporter file created in the previous step (in the above example, "nvda.json"). You should also make sure that the typeName on the first line is descriptive of your acceptance test.

Writing the Acceptance Test

The final step is to write the actual acceptance test. For this you should create a new file with a descriptive name under the folder: windows/tests/acceptanceTests. For example "AcceptanceTests_nvda.js".

As a template, you can use the following:

<source lang="javascript">
/*
GPII Acceptance Testing

Copyright 2014 Raising the Floor International

Licensed under the New BSD license. You may not use this file except in
compliance with this License.

You may obtain a copy of the License at
https://github.com/GPII/universal/blob/master/LICENSE.txt
*/

/*global require,process*/

"use strict";

var fluid = require("universal"),
    path = require("path"),
    gpii = fluid.registerNamespace("gpii");

fluid.require("./AcceptanceTests_incl", require);

var testDefs = [
    {
        // (... test descriptions go here ...)
    }
];

gpii.acceptanceTesting.windows.runTests("nvda_config", testDefs);
</source>

where "nvda_config" should be replaced by the name of the config file you created above.

We're now ready to add the actual test descriptions (in the "(... test descriptions go here ...)" area of the above template).

Steps:

  • Add a new test fixture to your acceptance test file using the template below:

<source lang="javascript">
{
    name: "Testing os_win7 using Flat matchmaker", // REPLACE with a title describing your test
    token: "os_win7", // REPLACE with the token you want to log in with
    appinfo: encodeURIComponent("{\"OS\":{\"id\":\"win32\"},\"solutions\":[{\"id\":\"org.nvda-project\"}]}"),
    settingsHandlers: {
        // INSERT settings handler entries here (see below)
    },
    processes: [
        // INSERT code to check processes that are running here
    ]
}
</source>

  • Modify the test fixture to have the NP set as the value for the token key (e.g. token: "screenreader_nvda")
  • Temporarily copy the NP set you want to use into the regular folder holding preferences (universal/testData/preferences)
  • Run the server in the same mode you want the acceptance test to run, and use node-inspector. For locally installed solutions, this would be done by running, from the windows folder: node --debug gpii.js
  • Set a breakpoint after line 189 of LifecycleManager.js:
    var settingsHandlerPayload = gpii.lifecycleManager.specToSettingsHandler(solutionId, handlerSpec);
  • Log in with the NP set you want to use for testing.
  • You should now hit the breakpoint, and here you can inspect the data sent to the settings handlers.
  • handlerSpec.type contains the settings handler type (<SETTINGSHANDLERTYPE>).
  • JSON.stringify(settingsHandlerPayload) will give you the settings in the following format (<SETTINGS>):
  <source lang="javascript">
  "SOLUTION ID": [
      {
          "settings": { ... },
          "options": { ... }
      }
  ]
  </source>
  • Now update your test fixture's settingsHandlers block accordingly, in the format:

<source lang="javascript">
settingsHandlers: {
    "<SETTINGSHANDLERTYPE>": {
        "data": <EVERYTHING THAT YOU GOT FROM JSON.stringify(settingsHandlerPayload) EXCEPT "SOLUTION ID">
    }
}
</source>

In the end you will get something like:

<source lang="javascript">
settingsHandlers: {
    "gpii.windows.registrySettingsHandler": {
        "data": [{
            "settings": { ... },
            "options": { ... }
        }]
    }
}
</source>

  • Add any process checking to ensure that the login was successful and the computer was properly configured (see the sketch after this list)
  • Repeat for each settings handler type. If several solutions use the same settings handler, just add the content as another object to the "data" array of that settings handler
  • MANUALLY check that the settings you've registered make sense by going through the settings and solutions registry transformations.
  • The test can be run with node tests\acceptanceTests\AcceptanceTests_nvda.js (where AcceptanceTests_nvda.js should be replaced by the name of your acceptance test).
  • For debugging, use node-inspector (with the --debug-brk flag). It's a good idea to put a breakpoint in line 103 of timers.js: the "e" variable will contain any error message that won't get printed to the output.
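As an illustration of the process checking mentioned above, the sketch below shows a hypothetical entry that counts running instances of the NVDA executable on Windows. The command/expectConfigured/expectRestored keys follow the pattern used by existing acceptance tests, but verify the exact format against a current test file in the repository:

<source lang="javascript">
processes: [
    {
        // Counts how many running processes are named "nvda.exe".
        "command": "tasklist /fi \"STATUS eq RUNNING\" /FI \"IMAGENAME eq nvda.exe\" | find /I \"nvda.exe\" /C",
        "expectConfigured": "1", // one instance expected after login
        "expectRestored": "0"    // no instances expected after logout
    }
]
</source>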

An Example Walkthrough

We assume the following for this walkthrough:

  • Solution: org.gnome.desktop.a11y.magnifier.
  • User: carla (renamed to fm_gnome_magnifier).
  • JIRA ticket for the acceptance test for this solution: GPII-586.
    • Note: each solution should have a JIRA ticket for acceptance tests.


The steps of the walkthrough:

  1. Ensure that your master branch is up to date:
    • go to gpii/node_modules/universal and run git fetch upstream, where the 'upstream' remote can be added by running git remote add upstream git://github.com/GPII/universal.
    • If there are upstream changes that need to be merged into your local master first, run git merge upstream/master.
  2. Create a new branch (GPII-586 for this example): git checkout -b GPII-586.
  3. Create one or more NP sets (or copy an existing one) in the acceptance tests folder (gpii/node_modules/universal/testData/preferences/acceptanceTests; see them on GitHub); use sensible names for the NP sets (e.g. to avoid name clashes). You can create a few variations of the same NP set to test out different aspects (e.g. transformations).
  4. Create a file for the acceptance tests in node_modules/universal/tests/acceptanceTests/ (see also GitHub).
    • It's easiest to copy an existing acceptance tests file.
  5. testDefs in the acceptance test file represents an array of acceptance tests that you want to run for your solution. It can contain as many tests as you want. For each test:
    • give the test a name
    • note the token
    • note the GET request address
    • note the id of the solution in the solutions registry (Windows, Linux, Android)
    • find the expected return payload through the following steps:
      • set the following environment variable: set NODE_ENV=cloudBasedFlowManager
      • start the server by running node node_modules/kettle/lib/init.js tests/acceptanceTests/configs/. Node should now be running on port 8081.
      • in a browser, manually hit the URL
        • For the format of the URL see the Flow Manager documentation. The general format is http://127.0.0.1:8081/<TOKEN>/settings/{"OS":{"id":"linux"},"solutions":[{"id":"org.gnome.desktop.a11y.magnifier"}]} .
  6. Add the payload from the Flow Manager to the acceptance test:
    • copy the returned payload from the browser window, pretty-print it (e.g. using JSONLint) and paste the result into the variable "expected".
  7. Run the new tests using node tests/acceptanceTests/<NameOfTestFile.js> to check whether they pass.
  8. In the folder node_modules/universal/tests/acceptanceTests/ create a text file that describes the purpose of the acceptance test. In our case, the name of the test file would be AcceptanceTests_gnome_magnifier.txt.
  9. Add your acceptance test to .../universal/tests/all-tests.js (see the sketch after this list).
    • You can check all tests by running node tests\all-tests.js: this runs all the acceptance tests.
    • If all goes well, you should get a message like jq:   All tests concluded: 58/58 total tests passed in 6798 ms - PASS at the end of the terminal output.
    • If the tests end with an error message like warn - error raised: Error: listen EACCES, you should check that you have no other services listening on port 8080, e.g. Apache Tomcat.
  10. When everything looks correct, you are ready to commit the test code and create a pull request:
    • Run git status to make sure you don't overlook any files that need to be added to the commit
    • Run git add ... to add the new files
    • Commit the changes and make sure that you mention the JIRA ticket in the commit message, e.g. git commit -m "GPII-586: added acceptance tests for Gnome Magnifier".
    • git push origin <branch> (in our example: git push origin GPII-586).
    • Create a pull request. When you go to GitHub, you should be able to find the button "Compare & pull request". You can also create a pull request from your own branch.
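Finally, as a sketch of step 9 above: conceptually, all-tests.js keeps a list of test files and runs each of them. The excerpt below is hypothetical; the real file's layout may differ, so mirror what is already there:

<source lang="javascript">
// Hypothetical excerpt of universal/tests/all-tests.js: the new acceptance
// test file is added to the list of tests that get loaded and run.
var testIncludes = [
    // ...existing test files...
    "./acceptanceTests/AcceptanceTests_gnome_magnifier.js" // the new test
];

testIncludes.forEach(function (testPath) {
    require(testPath);
});
</source>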
