Technical Validation


This page is about technical validation in the Cloud4all project (November 2011 – October 2015) and is now outdated.

Technical Validation Task Description

This activity addresses technical and interoperability testing of the solutions/applications developed in the framework of SP3. Specific technical validation and interoperability testing plans will be developed before each testing phase and for each prototype, in close collaboration with all the relevant development teams, in order to detail the goals, the indicators and the approaches to be followed. The use cases and application scenarios are also taken into account, as they indicate the targeted user experience when using the application.

Testing is carried out by the development teams of each Cloud4all prototype and should be conducted before the iterations of the SP4 user testing in each phase, in order to allow adequate time for debugging, if needed, before users experience the applications. If significant technical failures are noticed, the respective prototypes will be moved to the next user test iteration. In the course of this testing, possible modifications to the APfP mechanisms, as well as major or minor bugs that could prevent proper adaptation of the desktop and web environment to the end-user profile, are expected to be identified and flagged up to the developers of the prototypes. The findings of this activity will be continuously fed back to the respective development teams. In this way, each evaluation round aggregates results both from the users' point of view (SP4 testing) and from the developers' point of view (SP3 testing), leading to further optimisation in the next version of the prototype or application. In each technical validation iteration, the versions of the prototypes will be assessed in terms of technical performance against the initial specifications (before being commissioned for testing with users). The features to be assessed may include:

  • Functionality Testing: Assessment for its correct functioning according to its functional and technical specifications;
  • User Interface Testing: Assessment for its easy operation, content navigation, etc.;
  • Interaction Testing: Assessment for errors in the interaction with other modules developed in Cloud4All (e.g. the profile server);
  • Compatibility Testing: Assessment for compatibility with diverse devices (e.g. smartphones, Java-based phones and PCs), diverse OSs (e.g. Windows and Linux) and diverse browsers (e.g. IE and Firefox);
  • Performance Testing: Assessment for its performance for diverse Internet connection speeds, its response to the diverse devices, OSs and browsers and stress testing;
  • Security Testing: Assessment for unauthorised access to information, unsecured provision of private data, etc.
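Compatibility testing over many device/OS/browser combinations is straightforward to script as a matrix run. The sketch below is illustrative only: the target lists and the `check` callback are hypothetical placeholders, not part of the Cloud4all test plans.

```python
from itertools import product

# Illustrative target lists; the real validation plans enumerate the
# actual devices, operating systems and browsers per prototype.
DEVICES = ["smartphone", "Java-based phone", "PC"]
OSES = ["Windows", "Linux"]
BROWSERS = ["IE", "Firefox"]

def run_compatibility_suite(check):
    """Run a boolean check for every device/OS/browser combination
    and return a {(device, os, browser): passed} result map."""
    return {
        combo: check(*combo)
        for combo in product(DEVICES, OSES, BROWSERS)
    }
```

A validation report can then list only the failing combinations, e.g. `[combo for combo, ok in run_compatibility_suite(my_check).items() if not ok]`.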


Technical Validation Report Template

A common document template is used among all applications/solutions to plan technical validation tests and report the outcomes of these tests.

Structure

The structure of this template is highlighted below:

1. TEST PLAN

1.1 Introduction

[Short description of application/tool. A justification of the need for testing may be included]

1.2 Test items

[This section should identify the item(s) to be tested]

The following items of […] application will be verified and validated during the 3-phase technical validation iterations:

  • Item 1
  • Item 2

1.3 Features/capabilities to be tested

[This section should identify all the features and combinations of features that are to be tested]

1.4 Testing Approach

[This section should specify the major activities, methods and tools that are to be used to test the designated groups of features]

1.5 Item pass/fail criteria

[This section should specify the criteria to be used to decide whether each test item has passed or failed testing]

1.6 Testing tasks – Environmental needs

[This section should identify the set of tasks necessary to prepare for and perform testing. It should also specify both the necessary and desired properties of the test environment (i.e. hardware, software, tools, etc.)]

1.7 Non-functional requirements

[This section should identify the set of non-functional requirements that are crucial for the particular application. In most cases, rather than absolute values, a percentage increment compared to normal application performance (i.e. the application without GPII functionality) is preferred. The table contains NFRs according to ISO 25010; keep those applicable to your solution.]
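The percentage-increment convention can be made concrete with a small helper: measure a metric without GPII (baseline), measure it again with GPII enabled, and express the change as a percentage. This is a sketch under our own naming, not code from the template.

```python
def increment_pct(baseline, with_gpii):
    """Percentage increment of a metric relative to the baseline
    (the application running without GPII functionality)."""
    if baseline == 0:
        raise ValueError("baseline must be non-zero")
    return (with_gpii - baseline) / baseline * 100.0

def within_threshold(baseline, with_gpii, max_pct):
    """True if the GPII-enabled run stays under the allowed increment,
    e.g. max_pct=10 for a '< 10% increment' CPU or memory threshold."""
    return increment_pct(baseline, with_gpii) < max_pct
```

For example, CPU usage of 40% at baseline and 43% with GPII is a 7.5% increment, which passes a 10% threshold.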


Each entry below gives the technical key indicator/metric, the tools/ways to measure it, and the success threshold (where defined):

  • Performance Efficiency – Time behaviour: Response Time [degree to which the response and processing times and throughput rates of a product or system, when performing its functions, meet requirements]. Measured via: e.g. logs inside the app. Threshold: e.g. < 5 s increment.
  • Performance Efficiency – Resource utilization: CPU Usage [degree to which the amounts and types of resources used by a product or system when performing its functions meet requirements]. Measured via: e.g. system monitor. Threshold: e.g. < 10% increment.
  • Performance Efficiency – Resource utilization: Memory Usage [degree to which the amounts and types of resources used by a product or system when performing its functions meet requirements]. Measured via: e.g. system monitor. Threshold: e.g. < 10% increment.
  • Performance Efficiency – Resource utilization: Capacity [degree to which the maximum limits of a product or system parameter meet requirements; parameters can include the number of items that can be stored, the number of concurrent users, the communication bandwidth, throughput of transactions, and size of database].
  • Compatibility: Co-existence [degree to which a product can perform its required functions efficiently while sharing a common environment and resources with other products, without detrimental impact on any other product].
  • Compatibility: Interoperability [degree to which two or more systems, products or components can exchange information and use the information that has been exchanged].
  • Reliability: Maturity [degree to which a system meets needs for reliability under normal operation]. Measured via: e.g. mean time between failures (MTBF).
  • Reliability: Availability [degree to which a system, product or component is operational and accessible when required for use].
  • Reliability: Fault tolerance [degree to which a system, product or component operates as intended despite the presence of hardware or software faults]. Measured via: e.g. number of failures. Threshold: e.g. < 1/10 fail rate.
  • Reliability: Recoverability [degree to which, in the event of an interruption or a failure, a product or system can recover the data directly affected and re-establish the desired state of the system]. Measured via: e.g. mean time to recover (MTTR).
  • Security: Confidentiality [degree to which a product or system ensures that data are accessible only to those authorized to have access].
  • Security: Integrity [degree to which a system, product or component prevents unauthorized access to, or modification of, computer programs or data].
  • Security: Non-repudiation [degree to which actions or events can be proven to have taken place, so that the events or actions cannot be repudiated later].
  • Security: Accountability [degree to which the actions of an entity can be traced uniquely to the entity].
  • Security: Authenticity [degree to which the identity of a subject or resource can be proved to be the one claimed].
  • Maintainability: Modularity [degree to which a system or computer program is composed of discrete components such that a change to one component has minimal impact on other components].
  • Maintainability: Reusability [degree to which an asset can be used in more than one system, or in building other assets].
  • Maintainability: Analysability [degree of effectiveness and efficiency with which it is possible to assess the impact on a product or system of an intended change to one or more of its parts, or to diagnose a product for deficiencies or causes of failures, or to identify parts to be modified].
  • Maintainability: Modifiability [degree to which a product or system can be effectively and efficiently modified without introducing defects or degrading existing product quality].
  • Maintainability: Testability [degree of effectiveness and efficiency with which test criteria can be established for a system, product or component and tests can be performed to determine whether those criteria have been met].
  • Portability: Adaptability [degree to which a product or system can effectively and efficiently be adapted for different or evolving hardware, software or other operational or usage environments].
  • Portability: Installability [degree of effectiveness and efficiency with which a product or system can be successfully installed and/or uninstalled in a specified environment].
  • Portability: Replaceability [degree to which a product can be replaced by another specified software product for the same purpose in the same environment].
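The reliability metrics above reference MTBF and MTTR; as an illustration (helper names are ours, not part of the template), both can be derived from simple failure logs, and combined into a steady-state availability estimate:

```python
def mtbf(total_uptime_hours, n_failures):
    """Mean Time Between Failures: operating time divided by failure count."""
    return total_uptime_hours / n_failures

def mttr(repair_times_hours):
    """Mean Time To Recover: average downtime per failure."""
    return sum(repair_times_hours) / len(repair_times_hours)

def availability(mtbf_hours, mttr_hours):
    """Steady-state availability estimated as MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)
```

For example, 1000 h of operation with 4 failures gives an MTBF of 250 h; repair times of 1, 2 and 3 h give an MTTR of 2 h; the resulting availability is about 0.992.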


2. TEST ANALYSIS (for first iteration cycle validation tests)

[Test analysis for each item identified in chapter 1 to be tested. Ideally the analysis should be repeated each time an item changes version]

2.1 Item 1

2.1.1. Expected Outcome

[Describe or depict shortly the expected result of the test]

2.1.2. Functional Capabilities / Performance

[Describe the functional capabilities and compare the outcomes with the expected results]

2.1.3. Deviations from Test Plan

[Describe any deviations from the testing plan that occurred during performance of the test. List reasons for the deviations.]

2.2 Next item

2.3 Non-functional requirements

[Provide the measured values of the defined non-functional requirements and list reasons for any deviations]


3. SUMMARY AND CONCLUSIONS (for first iteration cycle validation tests)

3.1 Demonstrated Features / Capabilities

3.2 System Deficiencies

3.3 Recommended modifications / improvements

3.4 System Acceptance

Template Document

The MS Word template document can be downloaded at: Cloud4all_SP3_TechnicalValidation_Template_V01.docx


First Iteration Technical Validation Reports

The following table summarizes the results of the 1st iteration phase validation tests:

Solution | Partner | Contact | Status | Report

SP1 Tools

Preference Management Tool – WP102 (A102.1-A102.3) | CERTH | Kostas Kalogirou (kalogir@certh.gr), Chris Petsos (cpetsos@certh.gr) | Ready | TechnicalValidation_PreferenceManagementTool_v03.docx

SP3 Applications/Solutions

Windows 7 – A301.1 | – | – | Not Ready | –
Linux/Gnome – A301.1 | EMERGYA | Javier Hernández Antúnez [jhernandez@emergya.com] | Ready | Cloud4all_SP3_TechnicalValidation_GNULinux-Gnome_V03.docx
UI Options – A303.3 | – | – | Not Ready | –
EASIT4All (social networking app) – A303.4 | BDIGITAL | Xavier Rafael [xrafael@bdigital.org] | Ready | Cloud4all_SP3_TechnicalValidation_Easit4all_V03.docx
Maavis – A304.1 | OPENDIR | Steve Lee [steve@opendirective.com] | Ready | Cloud4all_SP3_TechnicalValidation_Maavis_V03.docx
Mobile Accessibility for Android – A304.2 | CODEFACTORY | Ferran Gállego [ferran.gallego@codefactory.es] | Ready | Cloud4all_SP3_TechnicalValidation_MobileAccessibility_V03.docx
Read&Write Gold – A304.3 | TEXTHELP | David Hankin [d.hankin@texthelp.com] | Ready | Cloud4all_SP3_TechnicalValidation_Read&Write Gold_V03.docx
MS Pixel Sense – A305.2 | SILO | Karantjias Thanos [tkaratzias@singularlogic.eu] | Ready | Cloud4all_SP3_TechnicalValidation_MicrosoftPixelSense-SOCIABLE_V03.docx


Second Iteration Technical Validation Reports

The following table summarizes the results of the 2nd iteration phase of the technical validation tests:

Solution | Partner | Contact | Status | Report

SP2 Tools

A202: Generation & Maintenance of Metadata for content and solutions | CERTH/ITI | Nikolaos Kaklanis [nkak@iti.gr] | Ready | Cloud4all_SP2_TechnicalValidation_SAT_v08.doc

SP3 Applications/Solutions

Windows 7 – A301.1 | – | – | Not Ready | –
Linux/Gnome – A301.1 | EMERGYA | Javier Hernández Antúnez [jhernandez@emergya.com] | Ready | Cloud4all_SP3_TechnicalValidation_GNULinux-Gnome_V05.docx
A301.2: Auto-configuration of accessibility features of web browsers | ILUNION | José Antonio Gutiérrez [JAgutierrez@consultoria.ilunion.com] | Ready | Cloud4all_SP3_TechnicalValidation_Extensions_v5.doc
A302.1: Auto-configuration of accessibility features of simple phones' OS | CERTH/HIT | Kostas Kalogirou [kalogir@certh.gr] | Ready | Cloud4all_SP3_TechnicalValidation_JavaMobileApplication_v03.docx
A302.2: Auto-configuration of accessibility features of smart phones' OS | FVE | Tomas de Andres [tomas.deandres@vodafone.com] | Ready | Cloud4all_SP3_TechnicalValidation_Android_V03.docx
EASIT4All (social networking app) – A303.4 | BDIGITAL | Xavier Rafael [xrafael@bdigital.org] | Ready | https://drive.google.com/file/d/0B-SVJw9oZa-mOXBHWFh6TE0xUU0/view?usp=sharing
Maavis – A304.1 | OPENDIR | Steve Lee [steve@opendirective.com] | Ready | Cloud4all_SP3_TechnicalValidation_Maavis_V05.docx
A304.1: Automatic selection and configuration of downloadable ATs | OMNITORE | Christer Ulfsparre [christer.ulfsparre@omnitor.se] | Ready | Cloud4all_SP3_TechnicalValidation_Omnitor_05.docx
Mobile Accessibility for Android – A304.2 | CODEFACTORY | Ferran Gállego [ferran.gallego@codefactory.es] | Ready | Cloud4all_SP3_TechnicalValidation_MobileAccessibility_V02.docx
A305.2: MS PixelSense and Sociable | SILO | Karantjias Thanos [tkaratzias@singularlogic.eu] | Ready | Cloud4all_SP3_TechnicalValidation_MicrosoftPixelSense-SOCIABLE_V04.doc
A305.3: Smart Houses Preference Invoker | ASTEA | Boyan Sheytanov [bsheytanov@asteasolutions.com] | Ready | Cloud4all_SP3_TechnicalValidation_SmartHouses_v5.doc
A305.4: DTV Preference Invoker | TP Vision | Thomas Soens [Thomas.Soens@tpvision.com] | Ready | Cloud4all_SP3_TechnicalValidation_DTV_v4.doc


SP3 Applications Monthly Technical Validation Testing Results

The integration of SP3 applications with the GPII is tested periodically, to ensure that changes performed on either side (application or GPII changes) do not break the integration. The test results are updated after the completion of each testing cycle in an Excel report: SP3Applications_MonthlyTesting_03.xlxs
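A periodic integration check of this kind can be as simple as probing an application's preferences endpoint and confirming it still answers. The sketch below is illustrative; the URL and endpoint layout are hypothetical placeholders, not the project's actual test configuration.

```python
import json
import urllib.request

def integration_smoke_test(url, timeout=5):
    """Return True if the endpoint answers with parseable JSON,
    False on any network or parsing error."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            json.load(resp)
        return True
    except Exception:
        return False

# Example (hypothetical local preferences URL):
# integration_smoke_test("http://localhost:8081/preferences/exampleUser")
```

Running such a probe monthly, per application, yields exactly the pass/fail matrix recorded in the Excel report.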

