Testing and DevOps

From wiki.gpii

From its inception, the GPII architecture team has emphasized the role of comprehensive testing and automation to help ensure that the software will be able to grow from research into production. Further effort has been invested in refining this aspect of the architecture over the past several months.

Automated Deployment

One of the most notable technical benefits of the cloud revolution has been the move towards DevOps (“development and operations”) techniques, which aim to unify development, QA, and administration tasks using automated tools (Roche 40). Without tooling and architectural support, deploying services such as the GPII Preferences Server would be complex and time-consuming: dependencies must be installed, roles configured, and databases populated with data. Left unmanaged, this administration cost would prohibit the horizontal scaling techniques typical of cloud environments (discussed in D105.1.1), which respond to load on the fly by “spinning up” additional virtual machines to serve as nodes managed by a load balancer.

To make horizontal scaling feasible, we have adopted the Ansible configuration management tool, which automates the process of spinning up virtual machines configured for GPII services. Playbooks (Ansible’s term for an automation script) have been implemented for deploying generic Node.js and CouchDB-based virtual machines, instances of the Preferences Server, and Nginx-based load balancer nodes. This makes it much easier to quickly update and redeploy GPII services, or to add additional resources in cases of high load.
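As an illustrative sketch only (the host group, paths, and task names here are hypothetical, not the actual GPII playbooks), an Ansible playbook for provisioning a generic Node.js node might look like this:

```yaml
# Hypothetical playbook sketch for a generic Node.js service node.
# Inventory group, repository URL, and paths are illustrative.
- hosts: nodejs_nodes
  become: yes
  tasks:
    - name: Install Node.js from the distribution repositories
      package:
        name: nodejs
        state: present

    - name: Check out the service source code
      git:
        repo: "https://github.com/GPII/universal.git"
        dest: /opt/gpii/universal
        version: master

    - name: Install the service's npm dependencies
      command: npm install
      args:
        chdir: /opt/gpii/universal
```

Because Ansible tasks are written to be idempotent, the same playbook can provision a fresh virtual machine or bring an existing one up to date, which is what makes on-demand horizontal scaling practical.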

Acceptance Testing

The Cloud4All realtime framework has featured a large suite of unit tests since its inception. This form of testing helps to catch bugs within a single module or unit of code. Nonetheless, unit tests alone are not sufficient to verify that a whole system is working; integration and acceptance tests are also required. The innovative GPII testing infrastructure has been architected to make the process of writing test cases easier, supporting the algorithmic generation of new tests. For example, our acceptance tests, being written in a declarative JSON form, can easily be reprocessed to run as integration tests in which the platform-specific elements are replaced with mocks.
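The idea of reprocessing a declarative test definition can be sketched as follows. This is a simplified, hypothetical illustration, not the real GPII test APIs: the test structure, the `gpii.gsettings` handler name, and the helper functions are all invented for the example.

```javascript
// A declarative test case: which settings we expect a particular
// platform settings handler to be asked to apply.
const acceptanceTest = {
    name: "Magnification user logs in on GNOME",
    userToken: "os_gnome",
    settingsHandlers: {
        "gpii.gsettings": {
            "org.gnome.desktop.a11y.magnifier": { "mag-factor": 2.0 }
        }
    }
};

// A mock handler simply records what it was asked to apply, instead
// of touching the real platform.
function mockHandler(applied) {
    return (schema, settings) => {
        applied.push({ schema, settings });
        return true; // pretend the settings were applied successfully
    };
}

// Reprocess the declarative definition: same structure, but every
// platform-specific handler is swapped for an in-memory mock.
function toIntegrationTest(testDef) {
    const applied = [];
    const handlers = {};
    for (const handlerName of Object.keys(testDef.settingsHandlers)) {
        handlers[handlerName] = mockHandler(applied);
    }
    return { ...testDef, handlers, applied };
}

// Run the mocked test and verify that the expected settings arrived.
const test = toIntegrationTest(acceptanceTest);
for (const [name, bySchema] of Object.entries(acceptanceTest.settingsHandlers)) {
    for (const [schema, settings] of Object.entries(bySchema)) {
        test.handlers[name](schema, settings);
    }
}
console.log(test.applied.length); // 1
```

The point of the design is that the declarative definition itself never changes; only the interpretation of it does, so the same JSON can drive a real end-to-end run or a fast, mocked integration run.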

End-to-end acceptance tests have been implemented for much of the GPII realtime framework. These allow us to validate the system throughout the autopersonalization workflow using realistic user scenarios. To this end, all new changes to the realtime framework must now be accompanied by acceptance tests.

Documentation and tutorials on how to write acceptance tests are available in the GPII wiki, and workshops have been presented to the GPII community in order to increase our test coverage for contributed code and documents.

Automated Testing

Our approaches to cloud automation and acceptance testing have recently been combined to introduce automated daily tests of the entire GPII system. Using a combination of Vagrant-based virtual machine provisioning and Ansible playbooks for automated test invocation, a Windows 8.1 VM is automatically spun up and configured with the latest GPII source code and SP3 assistive technologies. The system automatically runs the acceptance test suite on this VM (which is hosted on the IDRC’s cloud servers) and reports the results back to a Jenkins continuous integration server. Developers are notified if any of the tests fail, helping to ensure that new features and code changes do not cause regressions in the system.
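In outline, each daily run triggered from Jenkins resembles the following sequence. The machine name, inventory file, and playbook names below are hypothetical placeholders, not the actual GPII configuration:

```shell
# Bring up a fresh Windows VM from a base box.
vagrant up windows-test-vm

# Configure it with the latest GPII source code and the SP3
# assistive technologies, then invoke the acceptance test suite.
ansible-playbook -i hosts/test-cloud provision-windows.yml
ansible-playbook -i hosts/test-cloud run-acceptance-tests.yml

# Tear the VM down so the next run starts from a clean state.
vagrant destroy -f windows-test-vm
```

Starting from a freshly provisioned VM each day means test failures point at the code under test rather than at leftover state from a previous run.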

Colin Clark, December 2014