1st Phase Testing Experimental Plans templates

From wiki.gpii
Revision as of 10:36, 12 February 2013 by FraunhoferIAO (talk | contribs)

Rule-based and statistical MM

Name of the solution to be tested:

Rule-based & statistical MM

Related Cloud4all activities:

A204.3: Rules/heuristics based algorithms for profile matching
A204.4: Statistical methods for dynamic profile matching

Developer(s):

Rule-based matchmaker: TUD, CERTH-ITI
Statistical matchmaker: HdM

Hardware requirements/AT:

Hardware:

  • 1 gigahertz (GHz) or faster 32-bit (x86) or 64-bit (x64) processor.
  • 1 gigabyte (GB) RAM (32-bit) or 2 GB RAM (64-bit).
  • 900 MB available hard disk space (150 MB for the Architecture + 126 MB for Java SE Runtime Environment + 245 MB for Java SE Development Kit + 100 MB for the rule-based MM + 2 MB for the statistical MM + 50 MB for NVDA + 200 MB for Firefox + some space for the installers, which can be removed after the installation process).
  • Internet access (because the Preferences Server will not be running locally).
  • speakers (for speech output by the screen readers).
  • Braille display, if available at the test site; check in advance whether it can be used with both NVDA and Orca.

Note: The processor and memory requirements are based on Windows 7. The requirements for Fedora 18 are lower:

  • 400MHz or faster processor.
  • at least 768 MB memory (RAM), 1 GB recommended.
  • at least 10 GB hard drive space.

AT:

  • On Linux/GNOME, the GNOME Shell Magnifier will be used by users who need magnification, and the screen reader Orca will be used by blind users. This should not require an additional installation step.
  • On Windows, the built-in features (e.g. magnification and contrast) will be used by low-vision users, and the screen reader NVDA will be used by blind users. NVDA is not part of Windows and requires a separate installation step.

Software requirements:

Java: Java SE Development Kit (JDK) 6 or higher (for the rule-based matchmaker).

Linux/GNOME:

  • Fedora 18 (with GNOME 3.6; released 15 January 2013)
  • No GNOME Shell extensions or third-party modifications to the original GNOME user experience will be installed, with the following exception: GnomeTweakTool will need to be installed on distributions where the font size cannot be set through System Settings (e.g. Fedora 18). Installation instructions for GnomeTweakTool: http://wiki.gpii.net/index.php/GNOME_Tools.

Windows OSs (Windows 7, 32-bit) and built-in features: Magnifier, high-contrast settings etc.

Note: We will not use 64-bit versions of Windows.

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo/test components/solution.

Available languages: Please check and update the following information if necessary:

The matchmakers do not provide an interface to end users, so the languages in which these components can be tested are determined by the available localisations of the operating systems and assistive technologies mentioned above.

Windows 7 and its accessibility features are available in German, Spanish and Greek.

NVDA's user interface can be changed to German, Spanish and Greek.

Fedora 18 is available in German, Spanish and Greek. The user interface language needs to be selected during installation. Orca is also available in German, Spanish and Greek.

Target audience:
  • Low-vision users
  • Blind users

Note: Users without disabilities will not be included in the first phase, because the selected needs & preferences terms are tailored to users with visual impairments.

Step-by-step description of proposed procedure for running the demo/test (tasks):

(1) The user has defined his/her N&P set for a specific platform A; this N&P set is stored in the system. For Windows users, platform A would be Windows; for Linux users, platform A would be Linux.

(2) The user logs in on platform B and configures his/her settings manually. For Windows users, platform B would be Linux; for Linux users, platform B would be Windows.

(3a) Platform B is auto-configured according to settings provided by the rule-based matchmaker.*

Or

(3b) Platform B is auto-configured according to settings provided by the statistical matchmaker.

(4) The user is asked to perform the following tasks:

  • READING: Reading an email or a longer text on a static webpage.
  • BROWSING: Searching information on a static webpage.
  • FORM FILLING: Writing some text or filling in a web form.

So the tasks are done using the preferences inferred by the matchmakers.

(5) The user, with the assistance of an expert, logs in on platform B using the user’s token and configures his/her settings manually until the user is satisfied with the settings. (Since the user will probably not be familiar with this OS, the expert needs to help the user here, so the expert needs to know where all the settings can be found.) These final settings need to be saved so they can be compared with the settings inferred by the matchmakers.

Research objectives of the test:

The rule-based matchmaker succeeds if it scores at least 50% compared to the settings created by the user and the expert in the last step of the test procedure. (Success criterion for M18)

The statistical matchmaker succeeds if it scores at least 50% compared to the settings created by the user and the expert in the last step of the test procedure. (Success criterion for M18)

Note: the score is calculated by multiplying a distance metric with 100. The distance metric is a value between 0 and 1 that combines the distance metrics for each of the preferences or settings used in this test phase. These preferences and their distance calculation are described in the wiki page http://wiki.gpii.net/index.php/Cloud4all_Testing:_Essential_Registry_Terms.
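The scoring rule above can be sketched in code. The combination step, here an unweighted mean of the per-preference metric values, is an assumption for illustration only; the wiki page cited above defines the actual per-preference calculations.

```python
def overall_score(per_setting_metrics):
    """Combine per-setting metric values (each in [0, 1]) into a 0-100
    score by multiplying the combined metric by 100, as described above.
    The combination rule (unweighted mean) is an assumption; the wiki
    page defines the real per-preference calculations."""
    if not per_setting_metrics:
        raise ValueError("need at least one per-setting metric")
    combined = sum(per_setting_metrics) / len(per_setting_metrics)
    return combined * 100
```

Under this sketch, a matchmaker whose per-setting metrics average 0.5 would land exactly on the 50% success threshold for M18.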

Pre-Test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users before the tests? If yes, please, write each one as a question for including in the Pre-Test questionnaire

Apart from the standard questions common to all prototypes (e.g. demographics, years using computers), please check whether there is any relevant information you would like to know from the users in advance, e.g. which screen reader they usually use (offering a few of the most relevant options), which changes they usually make on a computer before using it, or which problems they face when trying to use a computer other than their own. Please phrase these as questions, to avoid misunderstandings between what you want to gather and how the person who turns a statement into a question interprets it. An example:

1. When you use a PC, which operating system do you use most frequently?

  • Windows
  • Linux
  • iOS
Which version of this operating system do you use? _________________

2. When you use a mobile device, which operating system do you use most frequently?

  • iOS
  • Android
  • Symbian
  • Windows
  • Others

3. Which AT are you using (screen reader, magnifier, other)?

For screen reader users: Which screen reader do you use?

  • JAWS
  • NVDA
  • SuperNova
  • Window-Eyes
  • VoiceOver
  • Orca
  • Other: ...

For screen magnifier users: Which screen magnifier do you use?

  • Magnifier (Windows)
  • Magnifier (GNOME Shell)
  • ZoomText
  • Virtual Magnifying Glass
  • Other: ...

4. Do you sometimes switch between devices/platforms?

If yes, between which devices/platforms? 

___________________________________________

Do you experience problems when moving from your system to another? If yes, which ones?

___________________________________________

5. What are your needs (necessary to interact with a PC, e.g. speech output)?

____________________________________________

6. What are your preferences (some preferred settings if known, e.g. volume, speech rate)?

____________________________________________

7. I have a good knowledge of my system.

[ ] Strongly disagree [ ] Disagree [ ] Neither agree nor disagree [ ] Agree [ ] Strongly agree

7.1. Can you change all the settings of your system and your AT that you need to change?

7.2. Do you often need help to change a setting?

8. Who has set up/configured your system?

____________________________________________


Please check and update the following information:

Add which parameters are going to change when running the tests; e.g. if the screen reader is configured automatically, specify which parameters will be changed: speed, pitch, male or female voice…

OS and magnification settings for which changes will be monitored:

  • ForegroundColour (depends on theme)
  • BackgroundColour
  • FontSize
  • CursorSize
  • MagnifierEnabled
  • Magnification (i.e. magnification level)
  • MagnificationPosition
  • Tracking

Screen reader settings for which changes will be monitored:

  • ScreenReaderTTSEnabled
  • AuditoryOutLanguage
  • SpeechRate
  • SpeakTutorialMessages
  • KeyEcho
  • WordEcho
  • AnnounceCapitals
  • ScreenReaderBrailleOutput
  • PunctuationVerbosity
  • ReadingUnit

For each of these settings, a distance measure will be calculated between the values inferred by the matchmakers and the values from the final set-up step.
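As a hedged sketch of what such a per-setting distance measure might look like (the actual metric for each preference term is defined on the project wiki; the rules below are illustrative assumptions):

```python
def setting_distance(inferred, final, value_range=None):
    """Illustrative distance in [0, 1] between the value inferred by a
    matchmaker and the value from the final set-up step.
    Booleans and enumerated strings: 0.0 if equal, 1.0 otherwise.
    Numeric values: absolute difference normalised by the setting's
    allowed value range, capped at 1.0. These rules are assumptions;
    the wiki defines the actual calculation per preference term."""
    if isinstance(inferred, (bool, str)):
        return 0.0 if inferred == final else 1.0
    lo, hi = value_range  # the setting's allowed range, e.g. (8, 28) for FontSize
    return min(1.0, abs(inferred - final) / (hi - lo))
```

For example, an inferred FontSize of 12 pt against a final value of 18 pt, over an assumed 8–28 pt range, gives a distance of 0.3.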

Other parameters:

  • Completion per task
  • Number of UI settings changed by user per task
  • Number of user errors per task
  • Post-task: User-perceived difficulty level per task
  • Post-task: User experience.
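For logging, the per-task parameters above could be captured in a small record structure; the field names and the 1–5 difficulty scale below are illustrative assumptions, not a project schema.

```python
from dataclasses import dataclass, field

@dataclass
class TaskRecord:
    """One observation record per task, covering the parameters above.
    Field names and scales are illustrative assumptions."""
    task: str                      # "READING", "BROWSING" or "FORM FILLING"
    completed: bool                # completion per task
    ui_settings_changed: int       # number of UI settings changed by the user
    user_errors: int               # number of user errors
    perceived_difficulty: int      # post-task rating, e.g. 1 (easy) to 5 (hard)
    experience_notes: list = field(default_factory=list)  # post-task UX comments
```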

Post-test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users after the tests? If yes, please, write each one as a question for including in the Post-Test questionnaire:

Apart from the standard questions common to all prototypes (e.g. whether they would use Cloud4all, improvement recommendations, further comments), please check whether there is any relevant information you would like to know from the users after the test, e.g. whether they found any specific difficulty when performing the test, or whether the changes provided are sufficient. Please phrase these as questions, to avoid misunderstandings between what you want to gather and how the person who turns a statement into a question interprets it. An example:

1. Did you find difficulties when using the solution XXX? [Yes/No + further justification]

Please try to be as specific as possible.

Contingency plans. Identify possible errors and bugs that might appear during the test and how to mitigate them:

Please add any error prompt the system could present, and how to avoid or mitigate it. Please be as detailed as possible in order to avoid problems during the test.

Contact for technical support (name and e-mail):

Rule-based matchmaker:

  • Claudia Loitsch: claudia.loitsch(at)tu-dresden.de
  • Kostas Votis (to be confirmed): kvotis(at)iti.gr

Statistical matchmaker:

  • Andy Stiegler: stiegler(at)hdm-stuttgart.de

MM User Manual

Name of the solution: Rule-based and statistical MM
Long description of the functionality of the Demo/Test: User manual, including images and link to videos if needed:

Link to other relevant training information:

Add links to tutorials, documents or training courses, related with the solution.

Detailed installation instructions for running the Test/Demo:

Please provide step-by-step instructions to install the solutions, and clarify whether any plugin or similar needs to be installed:

Fedora 18 Installation Guide: https://docs.fedoraproject.org/en-US/Fedora/18/html/Installation_Guide/index.html. Make sure you remember the root password that you set during installation.

GnomeTweakTool installation instructions: http://wiki.gpii.net/index.php/GNOME_Tools.

NVDA User Guide: Getting and Setting Up NVDA: http://www.nvda-project.org/documentation/userGuide.html#toc9.

Setting up the GPII/Cloud4all runtime framework: http://wiki.gpii.net/index.php/Core_%28real-time%29_Framework_v0.1_-_Installation_Instructions.

Installation instructions for the statistical matchmaker: @@.

JDK 7 and JRE 7 Installation Guide: http://docs.oracle.com/javase/7/docs/webnotes/install/index.html.

Installation instructions for the rule-based matchmaker: @@.

Needs and preferences management tool with basic functions

Name of the solution to be tested:

Needs and preferences management tool with basic functions

Cloud4all Activity related:

A102.2 User interface parameters for viewing and editing profiles

A102.3 Tool for users and care-givers to work with their profiles

Developer(s):

CERTH/ HIT

Fraunhofer

Hardware requirements/AT:

Laptop or standard PC with Internet connection.

Primary test users are elderly people with rather mild forms of visual, hearing or cognitive disabilities, since the test is meant to provide usability information rather than accessibility information.
AT has to be provided according to the needs of the invited users: if a test participant already uses an AT at home or at work, then this AT should also be provided in the test. In particular, for visually impaired users, magnifying AT and screen readers need to be available on the test PC.

Software requirements:


A102.4 S/W prototype available

Optional: connection to the statistical matchmaker (needed for the preview). General usability of the tool can also be tested without data transmission from and to the MM. In this case a pre-defined setting change has to be part of the test task; the preview will then be a simulation of setting changes.

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo/test components/solution. Will be provided by CERTH.

Available languages: Please check and update the following information if necessary:
  • English
  • Spanish (Translated by Fraunhofer)
  • German (translated by Fraunhofer)
  • Greek (translated by CERTH)

Target audience:


•          Visually impaired users (only mild forms of vision impairments)

•          Elderly users with mixed vision, hearing, and cognitive disabilities

•          Users with intellectual disabilities

•          Users with dyslexia

•          Users with low literacy

Description of Step by Step proposed procedure for running the Demo/Test (Tasks):

Test procedure:

1. Welcome: First the experimenter should welcome the test user and thank him/her for participating in the experiment. After giving an explanation of the test purpose and its procedure, the experimenter collects information relevant to the test. This information includes the demographic characteristics of the test user (e.g. age and gender), information about his/her life situation (e.g. whether the user is still working or retired; assisted living, a home for the aged or an independent living situation), information about impairments, and information about his/her computer literacy (CLS). The test and the user should be recorded with a video camera, so a video release form has to be signed first.

2. Working on the scenario tasks: While the user works on the scenario tasks, the experimenter takes notes of the user's actions, errors, comments and completion of the task. For task scenario processing, the think-aloud technique should be used:

Think aloud

A thinking-aloud test involves having a test subject use the system while continuously thinking out loud (Lewis, 1981). By verbalizing his thoughts, the test user enables the developer to understand how he views the computer system. One gets a very direct understanding of what parts of the dialogue cause the most problems because the think-aloud method shows how users interpret each individual interface item (Nielsen, 1993). A problem of the thinking-aloud method is that it can’t be used together with time measurements, because the need to verbalize can slow users down. When using the think-aloud method the experimenter should often prompt the user to think out loud by asking questions like “What are you thinking now?” and “What do you think this message means?”

An important aspect that has to be considered when using the thinking-aloud method is that not every negative statement by the user is a real big usability problem. The experimenter should evaluate each statement together with his observed findings (What did the test user do? Does the user action correspond to the user’s statement?). Another important point is that users tend to ask questions about the system and the task while working with the system and thinking out loud. The experimenter should not answer those questions if the answer provides information that influences the test results, but instead keep the user talking with counter-questions. When using the ‘think aloud’ method, a video recording of the test is very helpful to get a look at the user’s behaviour and comments afterwards. 

3. Post-questionnaire and interview: After filling out the post-questionnaires, the user is debriefed and asked in an interview for further comments about problems or events during the test that were hard for the experimenter to understand. Additionally, the user can give suggestions for improvements. Fraunhofer will provide task scenarios for the following steps:

1. Login/Registration (first-time user):

Start: Login/ Logout screen

Action: The user selects "register". The registration process is not yet defined, so this link is non-functional; the examiner has to explain that there will be a registration process. After clicking the "register" link, the user is led to the N&P set initialisation screen.

End: N&P Initialisation screen

Comprehensibility: The user should be asked what he/she expects for the three options for preference set initialisation. For example: "What do you think will happen if you select the option 'take over device settings'?"


2. Create an N&P set using the Preferences Management Editor/ by taking over the actual device settings  („N&P set initialisation“).

Start: N&P Initialisation screen

Action: The user selects 'take over device settings' and is led to the preference editor (in case mirroring device settings is not feasible for the prototype, standard desktop PC settings will be shown in the editor).

End: Preference Editor screen

WORDING TEST:

A wording test can be used to check the comprehensibility of the applied labels/buttons and the distinction between interactive and non-interactive areas. The examiner asks the user to have a look at the screen and to explain what the user assumes a UI element/area is for (what functionality is behind this element/area; is it interactive or just displayed information; what will happen when selecting this element). IMPORTANT: The user should not click on the screen; he/she should just explain.

3. Logout:

Start: Preference Editor screen

Action: The user logs out by clicking on the log-out button.

End: The Login/ logout screen will be shown.

4. Login (already registered user):

Start: Login/ Logout screen

Action: The user logs in by typing in user name and password (a dummy test account has been set up before).

End: After login the PMT main screen is shown (with the two options 'Preference Editor' and 'GPII Management')


5. Edit defined aspects of the N&P set, using the Preferences Management Editor („N&P set view and edit“):

Start: PMT main screen

Action: The user selects 'Preference Editor' and selects a defined UI option (pre-defined by the examiner in the task). The UI option selection can be done using the search functionality, recently used UI options, 'all categories' or 'common UI options'. When the user has made a selection, the examiner asks whether the user knows or can imagine an alternative way of selecting a UI option. After UI option selection, the user adjusts the setting and saves the setting change.

End: Editor screen with saved setting changes

 

Research Objectives of the test:

It will be evaluated whether the PM Tool is easy to use and provides high usability. Labels should be clear to the users, and the user should be able to navigate easily through the menu. Usability problems that reduce user satisfaction have to be identified.

Which changes have to be applied to the prototype in order to improve / maximise its accessibility and usability?

Pre-Test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users before the tests? If yes, please, write each one as a question for including in the Pre-Test questionnaire:

Computer Literacy Scale: will be provided in English, German, Spanish (has to be translated to Greek - Fraunhofer will provide English version for translation)

Indicators/parameters. Which are the parameters or indicators of success in the Demo/Test (including metrics and success thresholds)?

Please check and update the following information:

Add which parameters are going to change when running the tests; e.g. if the screen reader is configured automatically, specify which parameters will be changed: speed, pitch, male or female voice…

  • The user understands the planned functions.
  • The users regard the functions as sufficient for their purposes.
  • The functions are presented in a usable manner.
  • High score on the satisfaction scale (AttrakDiff)
  • High score on the usability scale (SUS or ISOMETRICS-S)

Post-test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users after the tests? If yes, please, write each one as a question for including in the Post-Test questionnaire:

Apart from the standard questions common to all prototypes (e.g. whether they would use Cloud4all, improvement recommendations, further comments), please check whether there is any relevant information you would like to know from the users after the test, e.g. whether they found any specific difficulty when performing the test, or whether the changes provided are sufficient. Please phrase these as questions, to avoid misunderstandings between what you want to gather and how the person who turns a statement into a question interprets it. An example:

1. Did you find difficulties when using the solution XXX? [Yes/No + further justification]

Please try to be as specific as possible.

What did you like the most?

What did you like least?

AttrakDiff (http://www.attrakdiff.de/en/Home/)

System Usability Scale
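The System Usability Scale yields a 0–100 score from ten 5-point Likert items: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5. A minimal scoring sketch:

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from the ten
    1-5 Likert responses, given in questionnaire order."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly ten items")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5
```

A neutral response of 3 on every item gives a score of 50.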

Contingency plans. Identify possible errors and bugs that might appear during the test and how to mitigate them:

Please add any error prompt the system could present, and how to avoid or mitigate it. Please be as detailed as possible in order to avoid problems during the test. To be edited by CERTH.

Contact for technical support (name and e-mail):

Kostas Kalogirou <kalogir@certh.gr>


User manual (at the moment this is not a priority)

Name of the solution: Needs and preferences management tool with basic functions
Long description of the functionality of the Demo/Test: User manual, including images and link to videos if needed:

Link to other relevant training information:

Add links to tutorials, documents or training courses, related with the solution.

Detailed installation instructions for running the Test/Demo:

Please provide step-by-step instructions to install the solutions, and clarify whether any plugin or similar needs to be installed:


Needs and preferences management tool with extended functions

Name of the solution to be tested:

Needs and preferences management tool with extended functions

Cloud4all Activity related:

A102.4 Research and development of user interface enhanced functionalities

Developer(s):

Fraunhofer

Hardware requirements/AT:

Please check and update the following information if necessary:

None (mock-up test only).

Software requirements:

Please check and update the following information if necessary:

None (mock-up test only).

Downloadable link to the solution to be tested:

None (mock-up test only).

Available languages: Please check and update the following information if necessary:
  • English
  • Spanish (Translated by Fraunhofer)
  • German (translated by Fraunhofer)
  • Greek (translated by CERTH)
Target audience:

• Elderly users with mixed vision, hearing, and cognitive disabilities

• Users with low literacy
 

Description of Step by Step proposed procedure for running the Demo/Test (Tasks):

Please check and update the following information with detailed step by step instructions for running the demo/test:

The evaluation of the advanced PMT functionalities will be a part of the evaluation of the basic PMT. The basic PMT evaluation will be supplemented with specific tasks for advanced functionalities.

At first, the basic PMT will be evaluated with the users. Then the prototype will be exchanged (functional for non-functional) and the user will do some tasks for advanced functionalities using the second prototype. The difference from the first prototype is that the examiner has to simulate the system behaviour (Wizard of Oz).

The tasks for advanced functionalities will be added later (we are still working on the conceptual design):

1. Changing privacy settings

2. Changing identification method

3. Editing conditions for preference sets


Mention all the details from the detailed test instructor manual.

Research Objectives of the test:

Please check and update the following information. Try to summarize the research objective/s as statements instead of questions (e.g.

“The solution XXX will succeed if users find it more appropriate to use than the standard solution provided by the OS itself.”

“It will be evaluated whether the rule-based matchmaker provides better results than the statistical one, or the opposite.”)

It will be evaluated whether the advanced PMT functionalities are easy to use and provide high usability. Labels should be clear to the users, and the user should be able to navigate easily through the menu. Usability problems that reduce user satisfaction have to be identified.

Which changes have to be applied to the prototype in order to improve / maximise its accessibility and usability?

Pre-Test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users before the tests? If yes, please, write each one as a question for including in the Pre-Test questionnaire:

see basic PMT

Indicators/parameters. Which are the parameters or indicators of success in the Demo/Test (including metrics and success thresholds)?

Please check and update the following information:

Add which parameters are going to change when running the tests; e.g. if the screen reader is configured automatically, specify which parameters will be changed: speed, pitch, male or female voice…

  • The user understands the planned functions.
  • The users regard the functions as sufficient for their purposes.
  • The functions are presented in a usable manner.
  • High score on the satisfaction scale (AttrakDiff)
  • High score on the usability scale (SUS or ISOMETRICS-S)

Post-test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users after the tests? If yes, please, write each one as a question for including in the Post-Test questionnaire:

Apart from the standard questions common to all prototypes (e.g. whether they would use Cloud4all, improvement recommendations, further comments), please check whether there is any relevant information you would like to know from the users after the test, e.g. whether they found any specific difficulty when performing the test, or whether the changes provided are sufficient. Please phrase these as questions, to avoid misunderstandings between what you want to gather and how the person who turns a statement into a question interprets it. An example:

1. Did you find difficulties when using the solution XXX? [Yes/No + further justification]

Please try to be as specific as possible.

After each task, the test instructor asks open, non-suggestive questions in order to understand possible problems the user has encountered while solving the task.

After the test, the instructor categorizes the encountered usability problems and reports them in a pre-defined matrix format.

SUS


Contingency plans. Identify possible errors and bugs that might appear during the test and how to mitigate them:

Because the prototype will be paper-based, only some functionality will be provided. In case the user selects an option without functionality behind it, the examiner has to explain that the functionality will be available at a later time in the development process.

Contact for technical support (name and e-mail):

  • Anne Krüger: anne-elisabeth.krueger(at)iao.fraunhofer.de
  • Vivien Melcher: vivien.melcher(at)iao.fraunhofer.de


User manual (at the moment this is not a priority)

Name of the solution: Needs and preferences management tool with extended functions
Long description of the functionality of the Demo/Test: User manual, including images and link to videos if needed:

Link to other relevant training information:

Add links to tutorials, documents or training courses, related with the solution.

Detailed installation instructions for running the Test/Demo:

Please provide step-by-step instructions to install the solutions, and clarify whether any plugin or similar needs to be installed:



Semantic alignment tool

Name of the solution to be tested:

Semantic alignment tool

Cloud4all Activity related:

Please provide the activity/ies related number/s and title/s.

Developer(s):

Please provide the name of the main organisations involved (short name(s) according to the DoW).

Hardware requirements/AT:

Please check and update the following information if necessary:

H/W required for the execution of the tests (including specifications): Not applicable.

AT: Not applicable. This tool is designed and will be tested with developers, not end users.

Software requirements:

Please check and update the following information if necessary:

The semantic alignment tool connected with the semantic framework (solutions ontology).

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo/test components/solution.

Available languages: Please check and update the following information if necessary:
  • English

Target audience:

Please check and update the following information if necessary:

•          Developers

Description of Step by Step proposed procedure for running the Demo/Test (Tasks):

Please check and update the following information with detailed step by step instructions for running the demo/test:

Adding a new solution/setting to the solutions ontology

1) The user uses the alignment tool to access the ontological data of the semantic framework.

2) The user can select, through a categorisation list, the category for the new solution that must be added to the ontology (e.g. screen reader).

3) The user gives the description of the tool (e.g. text description, functionalities, etc.).

4) The user gives the system the customizable settings for the specific solution.

5) An alignment mechanism proposes to the user (through matching ontological concepts) the appropriate settings that should be aligned with the existing settings of the ontology (if they don’t exist, new settings will be created).

6) The solutions ontology is updated through the tool.

Searching (by free text and/or by categorisation lists) for solutions, device settings, etc., that are stored in the ontology

1) The user browses the ontological data through the tool’s basic search functionality (free text, list of categories).
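The two workflows above (adding a solution with its settings, then searching by free text or category) can be sketched as follows. The data layout and function names are hypothetical; the real tool works against the solutions ontology of the semantic framework.

```python
# In-memory stand-in for the solutions ontology (hypothetical layout).
ontology = []

def add_solution(name, category, description, settings):
    """Add a solution entry with its category, description and
    customizable settings, mirroring steps 2-4 above."""
    ontology.append({"name": name, "category": category,
                     "description": description, "settings": settings})

def search(text=None, category=None):
    """Search by free text and/or category, mirroring the basic
    search functionality described above."""
    hits = ontology
    if category is not None:
        hits = [s for s in hits if s["category"] == category]
    if text is not None:
        t = text.lower()
        hits = [s for s in hits
                if t in s["name"].lower() or t in s["description"].lower()]
    return hits
```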

Research Objectives of the test:

Please check and update the following information. Try to summarize the research objective/s as statements instead of questions (e.g.

“The solution XXX will succeed if users find it more appropriate to use than the standard solution provided by the OS itself.”

“It will be evaluated whether the rule-based matchmaker provides better results than the statistical one, or the opposite.”)

1) It will be evaluated how well the tool supports users in performing syntactic and semantic analysis of solutions and in the automatic categorisation of solutions stored in the semantic framework of content and solutions (i.e. based on information about solutions, platforms, devices and settings stored in the ontology).

2) It will be evaluated how well the tool assists users in visualising annotated content presented in the ontology.

Pre-Test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users before the tests? If yes, please, write each one as a question for including in the Pre-Test questionnaire:

Apart from the standard questions common to all prototypes (e.g. demographics, years using computers, etc.), please check whether there is any relevant information you would like to gather from the users before they do the test, e.g. which screen reader they usually use (offering a few of the most relevant options), which changes they usually make on their computer before using it, or which problems they face when trying to use a computer other than their own. Please formulate these as questions, to avoid misunderstandings between what you want to gather and the question someone else derives from it (which could happen if you send them as sentences). An example:

1. Which screen reader do you usually use?

[ ] JAWS (version________________)

[ ] NVDA (version________________)

[ ] Other _________________

Please try to be as specific as possible.

Indicators/parameters. Which are the parameters or indicators of success in the Demo/Test (including metrics and success thresholds)?

Please check and update the following information:

Add which parameters are going to change when running the tests, e.g. if the screen reader is configured automatically, specify which parameters are going to be changed: speed, pitch, male or female voice…

• time to complete the alignment task

• time to store data in the ontology

• time to complete the search task

• success in completing the task

• success value
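The indicators above could be computed from per-participant logs along the following lines; the record layout and the numbers are illustrative, not a format the tool prescribes.

```python
# Hypothetical per-participant timing logs (seconds) for the three tasks.
runs = [
    {"user": "P1", "align_s": 95.0,  "store_s": 4.2, "search_s": 12.0, "completed": True},
    {"user": "P2", "align_s": 140.5, "store_s": 6.1, "search_s": 20.3, "completed": True},
    {"user": "P3", "align_s": 180.0, "store_s": 5.0, "search_s": 25.0, "completed": False},
]

def mean(values):
    return sum(values) / len(values)

avg_align = mean([r["align_s"] for r in runs])        # time to complete the alignment task
avg_store = mean([r["store_s"] for r in runs])        # time to store data in the ontology
avg_search = mean([r["search_s"] for r in runs])      # time to complete the search task
success_rate = mean([r["completed"] for r in runs])   # share of successful completions

print(avg_align, avg_store, avg_search, success_rate)
```

Success thresholds (e.g. a minimum acceptable success rate) would still need to be defined by the test designers.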

Post-test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users after the tests? If yes, please, write each one as a question for including in the Post-Test questionnaire:

Apart from the standard questions common to all prototypes (e.g. whether they would use Cloud4all, improvement recommendations, further comments, etc.), please check whether there is any relevant information you would like to gather from the users after they do the test, e.g. whether they found any specific difficulty when performing the test, whether the changes provided are enough, etc. Please formulate these as questions, to avoid misunderstandings between what you want to gather and the question someone else derives from it (which could happen if you send them as sentences). An example:

1. Did you find difficulties when using the solution XXX? [Yes/No + further justification]

Please try to be as specific as possible.

Contingency plans. Identify possible errors and bugs that might appear during the test and how to mitigate them:

Please add any error prompt that the system could present, and how to avoid or mitigate it. Please be as detailed as possible in order to avoid serious problems during the test.

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)


User manual (at the moment this is not a priority)

Name of the solution: Semantic alignment tool
Long description of the functionality of the Demo/Test: User manual, including images and links to videos if needed:

Link to other relevant training information:

Add links to tutorials, documents or training courses, related with the solution.

Detailed installation instructions for running the Test/Demo:

Please provide step-by-step instructions to install the solution, and clarify whether there is any plugin or similar that needs to be installed:


SP3 Demonstrations

Maavis

Name of the solution to be tested:

SP3 demo of Maavis

Cloud4all Activity related:

Please provide the related activity number(s) and title(s):

Developer(s):

Please provide the names of the main organisations involved (short names according to the DoW):

Hardware requirements/AT:

Please check and update the following information if necessary:

• PC, laptop, etc. with entry-level spec and spare resources

• Sound output

• If switch control is to be tested: a USB joystick, games controller, or switch (via a joycable, etc.). However, keyboard control is possible as a simulation

• Some features require network connectivity

Software requirements:

Please check and update the following information if necessary:

• Windows 7 or 8 (not RT); 32 or 64 bit

• Internet connection for certain features

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo components/solution:

Available languages:

Please check and update the following information if necessary:

  • English
Target audience:
Please check and update the following information if necessary:

• Elderly users with mixed vision, hearing, and cognitive disabilities

Detailed installation instructions for running the Demo:

Please provide step-by-step instructions to install the solution, and clarify whether there is any plugin or similar that needs to be installed:

Demonstration Needs and Preferences set/s:

Please provide the different Needs and Preferences sets to be used during the demonstration:

Description of Step by Step proposed procedure for running the Demo (Tasks):

Please provide detailed step by step instructions for making the demonstration:

 

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)

Read&Write GOLD

Name of the solution to be tested:

SP3 demo of Read&Write GOLD

Cloud4all Activity related:

Please provide the related activity number(s) and title(s):

Developer(s):

Please provide the names of the main organisations involved (short names according to the DoW):

Hardware requirements/AT:

Please check and update the following information if necessary:

• 2 GB free disk space

• Speakers, sound card, microphone (for speech input), DVD player (for installation only)

• Pentium IV 1.8 GHz processor (2.4 GHz recommended)

Software requirements:

Please check and update the following information if necessary:

• Windows XP SP3 or above

• 512 MB RAM (1 GB recommended)

• Internet connection for certain features

• PDF Aloud requires Adobe Reader 9 or above, or Acrobat version 8 or above (this provides text-to-speech in Adobe Reader; for this to work, Adobe Reader MUST be installed before Read&Write Gold)

• Web reading supported in IE and Firefox browsers only (IE8 and above, Firefox 4 and above)

• MS Word support requires Office 2003 or above.

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo components/solution:


Available languages:

Please check and update the following information if necessary:
  • English
Target audience:
Please check and update the following information if necessary:

• Elderly users with mixed vision, hearing, and cognitive disabilities

Detailed installation instructions for running the Demo:

Please provide step-by-step instructions to install the solution, and clarify whether there is any plugin or similar that needs to be installed:

Demonstration Needs and Preferences set/s:

Please provide the different Needs and Preferences sets to be used during the demonstration:

Description of Step by Step proposed procedure for running the Demo (Tasks):

Please provide detailed step by step instructions for making the demonstration:

 

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)

EASIT

Name of the solution to be tested:

SP3 demo of EASIT (Extensible Adapted Social Interaction Tool)

Cloud4all Activity related:

A303.4 Auto-configuration of accessibility features of social networks

Developer(s):

BDigital (leader) IDRC FhG UPM TPV

Hardware requirements/AT:

• PC

• x64 processor

• 4 GB RAM

• Disk: minimum 100 GB

• Screen size: min. 15″

Software requirements:

• Windows XP, 7 or any recent Linux distribution

• Recently updated web browser: Chrome or Firefox

Downloadable link to the solution to be tested:

The web application is already up and running at a public domain: http://www.easit4all.com

Available languages: Please check and update the following information if necessary:
  • English
Target audience:
• Elderly users with mixed vision, hearing, and cognitive disabilities

Detailed installation instructions for running the Demo:

No plugin needs to be installed. The application is reachable simply by entering the aforementioned URL.

Demonstration Needs and Preferences set/s:

User1 preferences:
1. Text size: 1.1
2. Text style: arial
3. Line spacing: 1.2
4. Colour & contrast: yellow on black
5. emphasize links: true
6. make inputs larger: true

User2 preferences:
1. Text size: 1
2. Text style: comic sans
3. Line spacing: 1
4. Colour & contrast: colourful
5. emphasize links: false
6. make inputs larger: false
7. show table of contents: false
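The two preference sets above could be serialised, for example, as JSON objects; the key names below are made up for this sketch and are not EASIT's actual preference schema.

```python
import json

# Illustrative encoding of the two demonstration preference sets.
user1_prefs = {
    "textSize": 1.1,
    "textStyle": "arial",
    "lineSpacing": 1.2,
    "colourContrast": "yellow-on-black",
    "emphasizeLinks": True,
    "inputsLarger": True,
}

user2_prefs = {
    "textSize": 1,
    "textStyle": "comic sans",
    "lineSpacing": 1,
    "colourContrast": "colourful",
    "emphasizeLinks": False,
    "inputsLarger": False,
    "showTableOfContents": False,
}

# Keyed by the test account names used in the procedure below.
print(json.dumps({"test1": user1_prefs, "test2": user2_prefs}, indent=2))
```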

Description of Step by Step proposed procedure for running the Demo (Tasks):

The proposed validation activities will consist of:

Preferences validation
1. The user signs in to the application as test1
2. The user changes the default preferences in the application
3. Save the preferences
4. Log out
5. The user logs in to the application again
6. The current visual features of the application are the ones specified before
7. Log out

Validation of social services operations
1. The user logs in
2. Access the connection section
3. Create a connection to Facebook
4. Access any Facebook functionality
5. Disconnect from Facebook
6. Access the connection section
7. Create a connection to Twitter
8. Access any Twitter functionality
9. Disconnect from Twitter
10. Log out of the application

Check other test users
1. The user logs in as test user test2
2. The user logs out


 

Contact for technical support (name and e-mail):

Xavier Rafael Palou (xrafael@bdigital.org)

Microsoft PixelSense/Sociable

Name of the solution to be tested:

SP3 demo of Microsoft PixelSense/Sociable (Tablet PC – all-in-one, standing up)

Cloud4all Activity related:

Please provide the related activity number(s) and title(s):

Developer(s):

Please provide the names of the main organisations involved (short names according to the DoW):

Hardware requirements/AT:

Please check and update the following information if necessary:

• Processor: x64

• Memory: 4 GB RAM

• Disk space: 250 GB

• Screen: 12.1″ min. – preferably 21″

Software requirements:

Please check and update the following information if necessary:

• Windows 7

• .NET Framework 4

• Surface Toolkit runtime for Windows

• XNA Redistributable

• SQL Express 2008 and Management Tools

• “Sociable” software

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo components/solution:

Available languages: Please check and update the following information if necessary:
  • English

Target audience:

Please check and update the following information if necessary:

• Elderly users with mixed vision, hearing, and cognitive disabilities

Detailed installation instructions for running the Demo:

Please provide step-by-step instructions to install the solution, and clarify whether there is any plugin or similar that needs to be installed:

Demonstration Needs and Preferences set/s:

Please provide the different Needs and Preferences sets to be used during the demonstration:

Description of Step by Step proposed procedure for running the Demo (Tasks):

Please provide detailed step by step instructions for making the demonstration:

 

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)

Mobile Accessibility

Name of the solution to be tested:

SP3 demo of Mobile Accessibility

Cloud4all Activity related:

Please provide the related activity number(s) and title(s):

Developer(s):

Please provide the names of the main organisations involved (short names according to the DoW):

Hardware requirements/AT:

Please check and update the following information if necessary:

• 2 × Android 4.0 phones (suggested device: Google Galaxy Nexus)

 

Software requirements:

Please check and update the following information if necessary:

• NFC functionality

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo components/solution:

Available languages: Please check and update the following information if necessary:
  • English

Target audience:

Please check and update the following information if necessary:

• Elderly users with mixed vision, hearing, and cognitive disabilities

Detailed installation instructions for running the Demo:

Please provide step-by-step instructions to install the solution, and clarify whether there is any plugin or similar that needs to be installed:

Demonstration Needs and Preferences set/s:

Please provide the different Needs and Preferences sets to be used during the demonstration:

Description of Step by Step proposed procedure for running the Demo (Tasks):

Please provide detailed step by step instructions for making the demonstration:

 

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)

UI options

Name of the solution to be tested:

SP3 demo of UI options

Cloud4all Activity related:

Please provide the related activity number(s) and title(s):

Developer(s):

Please provide the names of the main organisations involved (short names according to the DoW):

Hardware requirements/AT:

Please provide the hardware requirements for running the demo:

 

Software requirements:

Please provide the software requirements for running the demo:

 

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo components/solution:

Available languages: Please check and update the following information if necessary:
  • English

Target audience:

Please check and update the following information if necessary:

• Elderly users with mixed vision, hearing, and cognitive disabilities

Detailed installation instructions for running the Demo:

Please provide step-by-step instructions to install the solution, and clarify whether there is any plugin or similar that needs to be installed:

Demonstration Needs and Preferences set/s:

Please provide the different Needs and Preferences sets to be used during the demonstration:

Description of Step by Step proposed procedure for running the Demo (Tasks):

Please provide detailed step by step instructions for making the demonstration:

 

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)