1st Phase Testing Experimental Plans templates

From wiki.gpii
Revision as of 17:40, 11 February 2013 by FraunhoferIAO (talk | contribs)

Rule-based and statistical MM

Name of the solution to be tested:

Rule-based & statistical MM

Related Cloud4all activities:

A204.3: Rules/heuristics based algorithms for profile matching
A204.4: Statistical methods for dynamic profile matching

Developer(s):

Rule-based matchmaker: TUD, CERTH-ITI
Statistical matchmaker: HdM

Hardware requirements/AT:

Hardware:

  • 1 gigahertz (GHz) or faster 32-bit (x86) or 64-bit (x64) processor.
  • 1 gigabyte (GB) RAM (32-bit) or 2 GB RAM (64-bit).
  • 900 MB available hard disk space (150 MB for the Architecture + 126 MB for Java SE Runtime Environment + 245 MB for Java SE Development Kit + 100 MB for the rule-based MM + 2 MB for the statistical MM + 50 MB for NVDA + 200 MB for Firefox + some space for the installers, which can be removed after the installation process).
  • Internet access (because the Preferences Server will not be running locally).
  • speakers (for speech output by the screen readers).
  • Braille display, if available at the test site; check in advance whether it can be used with both NVDA and Orca.

Note: The processor and memory requirements are based on Windows 7. The requirements for Fedora 18 are lower:

  • 400MHz or faster processor.
  • at least 768 MB memory (RAM), 1 GB recommended.
  • at least 10 GB hard drive space.

AT:

  • On Linux/GNOME, the GNOME Shell Magnifier will be used by users who need magnification, and the screen reader Orca will be used by blind users. This should not require an additional installation step.
  • On Windows, the built-in features (e.g. magnification and contrast) will be used by low-vision users, and the screen reader NVDA will be used by blind users. NVDA is not part of Windows and requires a separate installation step.

Software requirements:

Java: Java SE Development Kit (JDK) 6 or higher (for the rule-based matchmaker).

Linux/GNOME:

  • GNOME 3.4 or GNOME 3.6, with a preference for GNOME 3.6.
  • No GNOME Shell extensions or third-party modifications to the original GNOME user experience will be installed, with the following exception: GnomeTweakTool will need to be installed on distributions where the font size cannot be set through System Settings (e.g. Fedora 18). Installation instructions for GnomeTweakTool: http://wiki.gpii.net/index.php/GNOME_Tools.

GNOME versions on some recent Linux distributions:

  • Fedora 18 (with GNOME 3.6; released 15 January 2013)
  • openSUSE 12.2 (with GNOME 3.4; released September 2012)
  • Ubuntu GNOME 12.04 (with GNOME 3.4.1) or Ubuntu GNOME Remix 12.10 (with GNOME 3.6).

Windows OS (Windows 7, 32-bit) and built-in features: Magnifier, high-contrast settings, etc.

Note: We will not use 64-bit versions of Windows.

Downloadable link to the solution to be tested:

Please provide an executable file or similar means of installing the demo/test components/solution.

Target audience:
  • Low-vision users
  • Blind users

Note: Users without disabilities will not be included in the first phase, because the selected needs & preferences terms are tailored to users with visual impairments.

Step-by-step description of proposed procedure for running the demo/test (tasks):

(1) The user has defined his/her N&P set for a specific platform A; this N&P set is stored in the system. For Windows users, platform A would be Windows; for Linux users, platform A would be Linux.

(2) The user logs in on platform B and configures his/her settings manually. For Windows users, platform B would be Linux; for Linux users, platform B would be Windows.

(3a) Platform B is auto-configured according to settings provided by the rule-based matchmaker.

Or

(3b) Platform B is auto-configured according to settings provided by the statistical matchmaker.

(4) The user is asked to perform the following tasks:

  • READING: Reading an email or a longer text on a static webpage.
  • BROWSING: Searching information on a static webpage.
  • FORM FILLING: Writing some text or filling in a web form.

The tasks are thus performed using the preferences inferred by the matchmakers.

(5) The user, with the assistance of an expert, logs in on platform B using the user’s token and configures his/her settings manually until the user is satisfied with them. (Since the user will probably not be familiar with this OS, the expert needs to help the user here, so the expert needs to know where all the settings can be found.) These final settings need to be saved, so they can be compared with the settings inferred by the matchmakers.
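The comparison implied by step (5) can be pictured with flat key-value mappings over the needs & preferences terms used in this phase. This is an illustrative sketch only: the term names and values below are assumptions, not the actual GPII N&P serialisation.

```python
# Illustrative sketch only: the term names and values are assumptions,
# not the actual GPII needs & preferences serialisation.

# Settings inferred by a matchmaker for platform B (step 3).
inferred_settings = {
    "FontSize": 24,           # points
    "MagnifierEnabled": True,
    "Magnification": 2.0,     # magnification level
    "SpeechRate": 180,        # words per minute
}

# Final settings saved after manual adjustment with the expert (step 5).
final_settings = {
    "FontSize": 28,
    "MagnifierEnabled": True,
    "Magnification": 2.5,
    "SpeechRate": 180,
}

# Settings the matchmaker reproduced exactly vs. those the user changed.
exact = {k for k in final_settings if inferred_settings.get(k) == final_settings[k]}
changed = set(final_settings) - exact
print(sorted(exact))    # settings reproduced exactly
print(sorted(changed))  # settings adjusted by the user/expert
```

Saving both mappings makes the comparison required by the research objectives a mechanical step.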

Research objectives of the test:

The rule-based matchmaker succeeds if it scores at least 50% compared to the settings created by the user and the expert in the last step of the test procedure. (Success criterion for M18)

The statistical matchmaker succeeds if it scores at least 50% compared to the settings created by the user and the expert in the last step of the test procedure. (Success criterion for M18)

Note: the score is calculated by multiplying a distance metric by 100. The distance metric is a value between 0 and 1 that combines the distance metrics for each of the preferences or settings used in this test phase. These preferences and their distance calculations are described on the wiki page http://wiki.gpii.net/index.php/Cloud4all_Testing:_Essential_Registry_Terms.
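As a minimal sketch of that calculation, assuming (for illustration only) that the combined metric is the simple mean of the per-preference values; the authoritative per-preference formulas are those on the wiki page cited above:

```python
# Minimal sketch, assuming the combined metric is the mean of the
# per-preference metrics; the official combination rule is defined on
# the "Cloud4all Testing: Essential Registry Terms" wiki page.

def score(per_preference_metrics):
    """Combine per-preference metrics (each in [0, 1]) and scale by 100."""
    values = list(per_preference_metrics.values())
    combined = sum(values) / len(values)  # assumed: simple mean
    return combined * 100.0

metrics = {"FontSize": 0.8, "SpeechRate": 1.0, "Magnification": 0.6}
print(score(metrics))  # 80.0, which meets the 50% success criterion
```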

Pre-Test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users before the tests? If yes, please write each one as a question to include in the Pre-Test questionnaire:

Apart from the standard questions common to all prototypes (for example demographics, years of computer use, etc.), please check whether there is any relevant information you would like to know in advance from the users before they do the test, e.g. which screen reader they usually use (offering a few of the most relevant options), which changes they usually make on their computer before using it, or which problems they face when trying to use a computer other than their own. Please provide these as questions, in order to avoid misunderstandings between what you want to gather and what the person who turns it into a question understands (which could happen if you send them as statements). An example:

1. When you use a PC, which operating system do you use most frequently?

  • Windows
  • Linux
  • iOS
Which version of this operating system do you use?

2. When you use a mobile device, which operating system do you use most frequently?

  • iOS
  • Android
  • Symbian
  • Windows
  • Others

3. Which AT are you using (screen reader, magnifier, other)?

For screen-reader users: Which screen reader do you use?

  • JAWS
  • NVDA
  • SuperNova
  • Window-Eyes
  • VoiceOver
  • Orca
  • Other: ...

For screen magnifier users: Which screen magnifier do you use?

  • Magnifier (Windows)
  • Magnifier (Orca)
  • ZoomText
  • Virtual Magnifying Glass
  • Other: ...

4. Do you sometimes switch between devices/platforms?

If yes, between which devices/platforms?

___________________________________________

Do you experience problems when moving from your system to another? If yes, which ones?

___________________________________________

5. What are your needs (i.e. what you require in order to interact with a PC, e.g. speech output)?

____________________________________________

6. What are your preferences (preferred settings, if known, e.g. volume, speech rate)?

____________________________________________

7. Are you familiar with your system?

[  ] Yes                  [   ] No                  [   ] I don't know

7.1. Can you change all the settings of your system and your AT that you need to change?

7.2. Do you often need help to change a setting?

8. Who has set up/configured your system?

____________________________________________


Indicators/parameters. Which are the parameters or indicators of success in the Demo/Test (including metrics and success thresholds)?

Please check and update the following information:

Add which parameters are going to change when running the tests, e.g. if the screen reader is configured automatically, specify which parameters are going to be changed: speed, pitch, male or female voice…

OS and magnification settings for which changes will be monitored:

  • ForegroundColour (depends on theme)
  • BackgroundColour
  • FontSize
  • CursorSize
  • MagnifierEnabled
  • Magnification (i.e. magnification level)
  • MagnificationPosition
  • Tracking

Screen reader settings for which changes will be monitored:

  • ScreenReaderTTSEnabled
  • AuditoryOutLanguage
  • SpeechRate
  • SpeakTutorialMessages
  • KeyEcho
  • WordEcho
  • AnnounceCapitals
  • ScreenReaderBrailleOutput
  • PunctuationVerbosity
  • ReadingUnit

For each of these settings, a distance measure will be calculated between the values inferred by the matchmakers and the values from the final set-up step.
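For illustration, such a per-setting distance measure could treat boolean settings as match/no-match and numeric settings as a normalised absolute difference. The formulas below are assumptions made for this sketch; the actual definitions are those on the registry-terms wiki page.

```python
# Hedged sketch: per-setting distance between a matchmaker-inferred value
# and the final value from the manual set-up step. The formulas below are
# illustrative assumptions, not the official Cloud4all definitions.

def setting_distance(inferred, final, value_range=None):
    """Return a distance in [0, 1]; 0 means the values match exactly."""
    if isinstance(inferred, bool) or isinstance(final, bool):
        return 0.0 if inferred == final else 1.0           # exact match or not
    if value_range:                                        # numeric setting
        lo, hi = value_range
        return min(abs(inferred - final) / (hi - lo), 1.0)
    return 0.0 if inferred == final else 1.0               # fallback: equality

print(setting_distance(True, True))                   # MagnifierEnabled: 0.0
print(setting_distance(24, 28, value_range=(8, 48)))  # FontSize: 0.1
```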

Other parameters:

  • Completion per task
  • Number of UI settings changed by user per task
  • Number of user errors per task
  • Post-task: User-perceived difficulty level per task
  • Post-task: User experience.

Post-test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users after the tests? If yes, please write each one as a question to include in the Post-Test questionnaire:

Apart from the standard questions common to all prototypes (for example whether they would use Cloud4all, improvement recommendations, further comments, etc.), please check whether there is any relevant information you would like to know from the users after they do the test, e.g. whether they found any specific difficulty when performing the test, or whether the changes provided are sufficient. Please provide these as questions, in order to avoid misunderstandings between what you want to gather and what the person who turns it into a question understands (which could happen if you send them as statements). An example:

1. Did you encounter any difficulties when using the solution XXX? [Yes/No + further justification]

Please try to be as specific as possible.

Contingency plans. Identify possible errors and bugs that might appear during the test and how to mitigate them:

Please add any error prompts that the system could present, and how to avoid or mitigate them. Please be as detailed as possible in order to avoid problems during the test.

Contact for technical support (name and e-mail):

Rule-based matchmaker:

  • Claudia Loitsch: claudia.loitsch(at)tu-dresden.de
  •  ? Kostas Votis: kvotis(at)iti.gr

Statistical matchmaker:

  • Andy Stiegler: stiegler(at)hdm-stuttgart.de

MM User Manual

Name of the solution: Rule-based and statistical MM
Long description of the functionality of the Demo/Test: user manual, including images and links to videos if needed:

Link to other relevant training information:

Add links to tutorials, documents or training courses, related with the solution.

Detailed installation instructions for running the Test/Demo:

Please provide step-by-step instructions to install the solution, and clarify whether any plugin or similar needs to be installed:

Needs and preferences management tool with basic functions

Name of the solution to be tested:

Needs and preferences management tool with basic functions

Cloud4all Activity related:

A102.2 User interface parameters for viewing and editing profiles

A102.3 Tool for users and care-givers to work with their profiles

Developer(s):

CERTH/ HIT

Fraunhofer

Hardware requirements/AT:

Laptop or standard PC with Internet connection.

Primary test users are elderly people with rather mild forms of visual, hearing or cognitive disabilities, since the test is meant to provide usability information rather than accessibility information.
AT has to be provided according to the needs of the invited users: if a test participant already uses an AT at home or work, then this AT should be provided in the test, too. In particular, for the visually impaired users, magnifying AT and screen readers need to be available on the test PC.

Software requirements:


A102.4 S/W prototype available

Optional: connection to the statistical matchmaker (needed for the preview). General usability of the tool can also be tested without data transmission from and to the MM. In this case a pre-defined setting change has to be part of the test task; the preview will then be a simulation of setting changes.

Downloadable link to the solution to be tested:

Please provide an executable file or similar means of installing the demo/test components/solution. (Will be provided by CERTH.)

Target audience:


•          Visually impaired users (only mild forms of vision impairment)

•          Elderly users with mixed vision, hearing, and cognitive disabilities

•          Users with intellectual disabilities

•          Users with dyslexia

•          Users with low literacy

Description of Step by Step proposed procedure for running the Demo/Test (Tasks):

1. Login/ Registration (first time user):

Start: Login/ Logout screen

Action: The user selects "register". The registration process is not yet defined, so this link is non-functional; the examiner has to explain that there will be a registration process. After clicking on the "register" link, the user will be led to the N&P set initialisation screen.

End: N&P Initialisation screen


2. Create an N&P set using the Preferences Management Editor / by taking over the actual device settings ("N&P set initialisation").

Start: N&P Initialisation screen

Action: The user selects 'take over device settings' and will be led to the preference editor (in case mirroring device settings is not feasible for the prototype, standard desktop PC settings will be shown in the editor).

End: Preference Editor screen

WORDING TEST:

A wording test can be used to check the comprehensibility of the applied labels/buttons, the distinction between interactive and non-interactive areas, etc. The examiner asks the user to have a look at the screen and to explain what the user assumes a UI element/area is for (what functionality is behind this element/area; is it interactive or just displayed information; what will happen when selecting this element). IMPORTANT: The user should not click on the screen, he/she should just explain.

3. Logout:

Start: Preference Editor screen

Action: The user logs out by clicking on the log-out button.

End: The Login/ logout screen will be shown.

4. Login (already registered user):

Start: Login/ Logout screen

Action: The user logs in by typing in user name and password (a dummy test account has been set up before).

End: After login the PMT main screen is shown (with the two options 'Preference Editor' and 'GPII Management')


5. Edit defined aspects of the N&P set, using the Preferences Management Editor ("N&P set view and edit"):

Start: PMT main screen

Action: The user selects 'Preference Editor' and selects a defined UI option (pre-defined by the examiner in the task). The UI option selection can be done by using the search functionality, recently used UI options, 'all categories' or 'common UI options'. When the user has made a selection, the examiner asks if the user knows/imagines an alternative way of selecting a UI option. After UI option selection, the user adjusts the setting and saves the setting change.

End: Editor screen with saved setting changes

 

Research Objectives of the test:

It will be evaluated whether the PM Tool is easy to use and provides high usability. Labels should be clear to the users, and the user should be able to navigate easily through the menu. Usability problems that reduce user satisfaction have to be identified.

Which changes have to be applied to the prototype in order to improve / maximise its accessibility and usability?

Pre-Test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users before the tests? If yes, please, write each one as a question for including in the Pre-Test questionnaire:

Computer Literacy Scale: will be provided in English, German and Spanish (it has to be translated into Greek; Fraunhofer will provide the English version for translation).

Indicators/parameters. Which are the parameters or indicators of success in the Demo/Test (including metrics and success thresholds)?

Please check and update the following information:

Add which parameters are going to change when running the tests, e.g. if the screen reader is configured automatically, specify which parameters are going to be changed: speed, pitch, male or female voice…

  • The user understands the planned functions.

  • The users regard the functions as sufficient for their purposes.

  • The functions are presented in a usable manner.

  • High score on the satisfaction scale (AttrakDiff)

  • High score on the usability scale (SUS or ISOMETRICS-S)

Post-test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users after the tests? If yes, please, write each one as a question for including in the Post-Test questionnaire:

Apart from the standard questions common to all prototypes (for example whether they would use Cloud4all, improvement recommendations, further comments, etc.), please check whether there is any relevant information you would like to know from the users after they do the test, e.g. whether they found any specific difficulty when performing the test, or whether the changes provided are sufficient. Please provide these as questions, in order to avoid misunderstandings between what you want to gather and what the person who turns it into a question understands (which could happen if you send them as statements). An example:

1. Did you encounter any difficulties when using the solution XXX? [Yes/No + further justification]

Please try to be as specific as possible.

What did you like the most?

What did you like less?

AttrakDiff (http://www.attrakdiff.de/en/Home/)

System Usability Scale

Contingency plans. Identify possible errors and bugs that might appear during the test and how to mitigate them:

Please add any error prompts that the system could present, and how to avoid or mitigate them. Please be as detailed as possible in order to avoid problems during the test. (To be edited by CERTH.)

Contact for technical support (name and e-mail):

Kostas Kalogirou <kalogir@certh.gr>


User manual (at the moment this is not a priority)

Name of the solution: Needs and preferences management tool with basic functions
Long description of the functionality of the Demo/Test: user manual, including images and links to videos if needed:

Link to other relevant training information:

Add links to tutorials, documents or training courses, related with the solution.

Detailed installation instructions for running the Test/Demo:

Please provide step-by-step instructions to install the solution, and clarify whether any plugin or similar needs to be installed:


Needs and preferences management tool with extended functions

Name of the solution to be tested:

Needs and preferences management tool with extended functions

Cloud4all Activity related:

Please provide the related activity number(s) and title(s).

Developer(s):

Please provide the name of the main organisations involved (short name(s) according to the DoW).

Hardware requirements/AT:

Please check and update the following information if necessary:

None; mock-ups test only.

Software requirements:

Please check and update the following information if necessary:

None; mock-ups test only.

Downloadable link to the solution to be tested:

Please provide an executable file or similar means of installing the demo/test components/solution.

Target audience:

Please check and update the following information if necessary:

• Visually impaired users
• Elderly users with mixed vision, hearing, and cognitive disabilities
• Users with intellectual disabilities
• Users with dyslexia
• Users with low literacy
 

Description of Step by Step proposed procedure for running the Demo/Test (Tasks):

Please check and update the following information with detailed step by step instructions for running the demo/test:

Mention all the details from the detailed test instructor manual.

Research Objectives of the test:

Please check and update the following information. Try to summarise the research objective(s) as statements instead of questions, e.g.:

“The solution XXX will succeed if users find it more appropriate to use than the standard solution provided by the OS itself.”

“It will be evaluated whether the rule-based matchmaker provides better results than the statistical one, or the opposite.”

Which changes have to be applied to the prototype in order to improve / maximise its accessibility and usability?

Pre-Test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users before the tests? If yes, please, write each one as a question for including in the Pre-Test questionnaire:

Apart from the standard questions common to all prototypes (for example demographics, years of computer use, etc.), please check whether there is any relevant information you would like to know in advance from the users before they do the test, e.g. which screen reader they usually use (offering a few of the most relevant options), which changes they usually make on their computer before using it, or which problems they face when trying to use a computer other than their own. Please provide these as questions, in order to avoid misunderstandings between what you want to gather and what the person who turns it into a question understands (which could happen if you send them as statements). An example:

1. Which screen reader do you usually use?

[ ] JAWS (version________________)

[ ] NVDA (version________________)

[ ] Other _________________

Please try to be as specific as possible.

Indicators/parameters. Which are the parameters or indicators of success in the Demo/Test (including metrics and success thresholds)?

Please check and update the following information:

Add which parameters are going to change when running the tests, e.g. if the screen reader is configured automatically, specify which parameters are going to be changed: speed, pitch, male or female voice…

  • The user understands the planned functions.

  • The users regard the functions as sufficient for their purposes.

  • The functions are presented in a usable manner.

Post-test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users after the tests? If yes, please, write each one as a question for including in the Post-Test questionnaire:

Apart from the standard questions common to all prototypes (for example whether they would use Cloud4all, improvement recommendations, further comments, etc.), please check whether there is any relevant information you would like to know from the users after they do the test, e.g. whether they found any specific difficulty when performing the test, or whether the changes provided are sufficient. Please provide these as questions, in order to avoid misunderstandings between what you want to gather and what the person who turns it into a question understands (which could happen if you send them as statements). An example:

1. Did you encounter any difficulties when using the solution XXX? [Yes/No + further justification]

Please try to be as specific as possible.

After each task, the test instructor asks open, non-suggestive questions in order to understand possible problems the user has encountered while solving the task.

After the test, the instructor categorizes the encountered usability problems and reports them in a pre-defined matrix format.

Contingency plans. Identify possible errors and bugs that might appear during the test and how to mitigate them:

Please add any error prompts that the system could present, and how to avoid or mitigate them. Please be as detailed as possible in order to avoid problems during the test.

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)


User manual (at the moment this is not a priority)

Name of the solution: Needs and preferences management tool with extended functions
Long description of the functionality of the Demo/Test: user manual, including images and links to videos if needed:

Link to other relevant training information:

Add links to tutorials, documents or training courses, related with the solution.

Detailed installation instructions for running the Test/Demo:

Please provide step-by-step instructions to install the solution, and clarify whether any plugin or similar needs to be installed:




Semantic alignment tool

Name of the solution to be tested:

Semantic alignment tool

Cloud4all Activity related:

Please provide the related activity number(s) and title(s).

Developer(s):

Please provide the name of the main organisations involved (short name(s) according to the DoW).

Hardware requirements/AT:

Please check and update the following information if necessary:

H/W required for the execution of the tests (including specifications): Not applicable.

AT: Not applicable. This tool is designed and will be tested with developers, not end users.

Software requirements:

Please check and update the following information if necessary:

The semantic alignment tool connected with the semantic framework (solutions ontology).

Downloadable link to the solution to be tested:

Please provide an executable file or similar means of installing the demo/test components/solution.

Target audience:

Please check and update the following information if necessary:

•          Developers

Description of Step by Step proposed procedure for running the Demo/Test (Tasks):

Please check and update the following information with detailed step by step instructions for running the demo/test:

Adding a new solution/setting to the solutions ontology

1) The user uses the alignment tool to access the ontological data of the semantic framework.

2) The user selects, from a categorisation list, the category of the new solution that must be added to the ontology (e.g. screen reader).

3) The user provides a description of the tool (e.g. text description, functionalities, etc.).

4) The user provides the system with the customisable settings for the specific solution.

5) An alignment mechanism proposes to the user (through matching ontological concepts) the appropriate settings that should be aligned with the existing settings of the ontology (if they do not exist, new settings will be created).

6) The solutions ontology is updated through the tool.

Searching (by free text and/or by categorisation lists) for solutions, device settings, etc., that are stored in the ontology

1) The user browses the ontological data through the tool's basic search functionality (free text, list of categories).
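Step 5 of the first task (the alignment proposal) can be sketched as a similarity match between a newly entered setting name and the setting names already in the ontology. The matching rule and the names below are assumptions for illustration; the real tool works on ontological concepts, not plain strings.

```python
# Illustrative sketch of the alignment proposal in step 5: propose existing
# ontology settings whose names resemble a newly entered setting name.
# Plain string similarity stands in for the real concept-matching mechanism,
# and EXISTING_SETTINGS is a made-up stand-in for the solutions ontology.
from difflib import SequenceMatcher

EXISTING_SETTINGS = ["SpeechRate", "FontSize", "Magnification", "KeyEcho"]

def propose_alignments(new_setting, threshold=0.6):
    """Return existing settings similar enough to align with, best first."""
    scored = [
        (SequenceMatcher(None, new_setting.lower(), s.lower()).ratio(), s)
        for s in EXISTING_SETTINGS
    ]
    return [s for ratio, s in sorted(scored, reverse=True) if ratio >= threshold]

# A close name is proposed for alignment with an existing setting...
print(propose_alignments("speech-rate"))
# ...while an unrelated name yields no proposal, so a new setting is created.
print(propose_alignments("BrailleTable"))
```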

Research Objectives of the test:

Please check and update the following information. Try to summarise the research objective(s) as statements instead of questions, e.g.:

“The solution XXX will succeed if users find it more appropriate to use than the standard solution provided by the OS itself.”

“It will be evaluated whether the rule-based matchmaker provides better results than the statistical one, or the opposite.”

1) How can the tool support users in performing syntactic and semantic analysis of solutions and in the automatic categorisation of solutions stored in the semantic framework of content and solutions (i.e. using information about solutions, platforms, devices and settings stored in the ontology)?

2) How can the tool assist users in visualising annotated content presented in the ontology?

Pre-Test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users before the tests? If yes, please, write each one as a question for including in the Pre-Test questionnaire:

Apart from the standard questions common to all prototypes (for example demographics, years of computer use, etc.), please check whether there is any relevant information you would like to know in advance from the users before they do the test, e.g. which screen reader they usually use (offering a few of the most relevant options), which changes they usually make on their computer before using it, or which problems they face when trying to use a computer other than their own. Please provide these as questions, in order to avoid misunderstandings between what you want to gather and what the person who turns it into a question understands (which could happen if you send them as statements). An example:

1. Which screen reader do you usually use?

[ ] JAWS (version________________)

[ ] NVDA (version________________)

[ ] Other _________________

Please try to be as specific as possible.

Indicators/parameters. Which are the parameters or indicators of success in the Demo/Test (including metrics and success thresholds)?

Please check and update the following information:

Add which parameters are going to change when running the tests, e.g. if the screen reader is configured automatically, specify which parameters are going to be changed: speed, pitch, male or female voice…

  • Time to complete the alignment task

  • Time to store data in the ontology

  • Time to complete the search task

  • Success in completing the task

  • Success value

Post-test questionnaire. Is there any specific information (related to your demo/test) you would like to gather from the users after the tests? If yes, please, write each one as a question for including in the Post-Test questionnaire:

Apart from the standard questions common to all prototypes (for example whether they would use Cloud4all, improvement recommendations, further comments, etc.), please check whether there is any relevant information you would like to know from the users after they do the test, e.g. whether they found any specific difficulty when performing the test, or whether the changes provided are sufficient. Please provide these as questions, in order to avoid misunderstandings between what you want to gather and what the person who turns it into a question understands (which could happen if you send them as statements). An example:

1. Did you encounter any difficulties when using the solution XXX? [Yes/No + further justification]

Please try to be as specific as possible.

Contingency plans. Identify possible errors and bugs that might appear during the test and how to mitigate them:

Please add any error prompts that the system could present, and how to avoid or mitigate them. Please be as detailed as possible in order to avoid problems during the test.

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)


User manual (at the moment this is not a priority)

Name of the solution: Semantic alignment tool
Long description of the functionality of the Demo/Test: user manual, including images and links to videos if needed:

Link to other relevant training information:

Add links to tutorials, documents or training courses, related with the solution.

Detailed installation instructions for running the Test/Demo:

Please provide step-by-step instructions to install the solution, and clarify whether any plugin or similar needs to be installed:



SP3 Demonstrations

Maavis

Name of the solution to be tested:

SP3 demo of Maavis

Cloud4all Activity related:

Please provide the related activity number(s) and title(s):

Developer(s):

Please provide the name of the main organisations involved (short name(s) according to the DoW):

Hardware requirements/AT:

Please check and update the following information if necessary:

•          PC, laptop, etc. with entry-level spec and spare resources

•          Sound output

•          If switch control is to be tested: a USB joystick, games controller, or switch (via a joycable etc.). However, keyboard control is possible as a simulation.

•          Some Features require network connectivity

Software requirements:

Please check and update the following information if necessary:

•          Windows 7 or 8 (not RT); 32 or 64 bit

•          Internet connection for certain features

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo components/solution:

Target audience:
Please check and update the following information if necessary:

•          Elderly users with mixed vision, hearing, and cognitive disabilities

Detailed installation instructions for running the Demo:

Please provide step-by-step instructions to install the solution, and clarify whether any plugin or similar needs to be installed:

Demonstration Needs and Preferences set/s:

Please provide the different Needs and Preferences sets to be used during the demonstration:

Description of Step by Step proposed procedure for running the Demo (Tasks):

Please provide detailed step-by-step instructions for running the demonstration:

 

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)


Read&Write GOLD

Name of the solution to be tested:

SP3 demo of Read&Write GOLD

Cloud4all Activity related:

Please provide the number(s) and title(s) of the related activity/ies:

Developer(s):

Please provide the names of the main organisations involved (short name(s) according to the DoW):

Hardware requirements/AT:

Please check and update the following information if necessary:

•          2 GB Free Disk Space

•          Speakers, Sound Card, Microphone (for speech input), DVD Player (for installation only)

•          Pentium IV 1.8GHz processor (2.4GHz recommended)

Software requirements:

Please check and update the following information if necessary:

•          Windows XP SP3 or above

•          512 MB RAM (1 GB recommended)

•          Internet connection for certain features

•          PDF Aloud requires Adobe Reader 9 or above, or Acrobat version 8 or above. (This provides text-to-speech in Adobe Reader. For this to work, Adobe Reader MUST be installed before Read&Write Gold.)

•          Web reading supported in IE and Firefox browsers only (IE 8 and above, Firefox 4 and above)

•          MS Word support requires Office 2003 or above.

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo components/solution:

Target audience:
Please check and update the following information if necessary:

•          Elderly users with mixed vision, hearing, and cognitive disabilities

Detailed installation instructions for running the Demo:

Please provide step-by-step instructions to install the solution, and clarify whether any plugin or similar needs to be installed:

Demonstration Needs and Preferences set/s:

Please provide the different Needs and Preferences sets to be used during the demonstration:

Description of Step by Step proposed procedure for running the Demo (Tasks):

Please provide detailed step-by-step instructions for running the demonstration:

 

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)



EASIT

Name of the solution to be tested:

SP3 demo of EASIT

Cloud4all Activity related:

Please provide the number(s) and title(s) of the related activity/ies:

Developer(s):

Please provide the names of the main organisations involved (short name(s) according to the DoW):

Hardware requirements/AT:

Please check and update the following information if necessary:

•          PC

•          x64 processor

•          4 GB RAM

•          Disk: minimum 100 GB

•          Screen size: minimum 15″

Software requirements:

Please check and update the following information if necessary:

•          Windows XP, Windows 7, or any recent Linux distribution

•          Recently updated web browser: Chrome or Firefox

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo components/solution:

Target audience:
Please check and update the following information if necessary:

•          Elderly users with mixed vision, hearing, and cognitive disabilities

Detailed installation instructions for running the Demo:

Please provide step-by-step instructions to install the solution, and clarify whether any plugin or similar needs to be installed:

Demonstration Needs and Preferences set/s:

Please provide the different Needs and Preferences sets to be used during the demonstration:

Description of Step by Step proposed procedure for running the Demo (Tasks):

Please provide detailed step-by-step instructions for running the demonstration:

 

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)


Microsoft PixelSense/Sociable

Name of the solution to be tested:

SP3 demo of Microsoft PixelSense/Sociable (Tablet PC, all-in-one, standing up)

Cloud4all Activity related:

Please provide the number(s) and title(s) of the related activity/ies:

Developer(s):

Please provide the names of the main organisations involved (short name(s) according to the DoW):

Hardware requirements/AT:

Please check and update the following information if necessary:

•          Processor: x64

•          Memory: 4 GB RAM

•          Disk space: 250 GB

•          Screen: 12.1″ minimum, preferably 21″

Software requirements:

Please check and update the following information if necessary:

•          Windows 7

•          .NET Framework 4

•          Surface toolkit runtime for Windows

•          XNA Redistributable

•          SQL Server 2008 Express and Management Tools

•          “Sociable” software

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo components/solution:

Target audience:

Please check and update the following information if necessary:

•          Elderly users with mixed vision, hearing, and cognitive disabilities

Detailed installation instructions for running the Demo:

Please provide step-by-step instructions to install the solution, and clarify whether any plugin or similar needs to be installed:

Demonstration Needs and Preferences set/s:

Please provide the different Needs and Preferences sets to be used during the demonstration:

Description of Step by Step proposed procedure for running the Demo (Tasks):

Please provide detailed step-by-step instructions for running the demonstration:

 

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)


Mobile Accessibility

Name of the solution to be tested:

SP3 demo of Mobile Accessibility

Cloud4all Activity related:

Please provide the number(s) and title(s) of the related activity/ies:

Developer(s):

Please provide the names of the main organisations involved (short name(s) according to the DoW):

Hardware requirements/AT:

Please check and update the following information if necessary:

•          2 x Android 4.0 phones (suggested device: Google Galaxy Nexus)

 

Software requirements:

Please check and update the following information if necessary:

•          NFC functionality

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo components/solution:

Target audience:

Please check and update the following information if necessary:

•          Elderly users with mixed vision, hearing, and cognitive disabilities

Detailed installation instructions for running the Demo:

Please provide step-by-step instructions to install the solution, and clarify whether any plugin or similar needs to be installed:

Demonstration Needs and Preferences set/s:

Please provide the different Needs and Preferences sets to be used during the demonstration:

Description of Step by Step proposed procedure for running the Demo (Tasks):

Please provide detailed step-by-step instructions for running the demonstration:

 

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)


UI options

Name of the solution to be tested:

SP3 demo of UI options

Cloud4all Activity related:

Please provide the number(s) and title(s) of the related activity/ies:

Developer(s):

Please provide the names of the main organisations involved (short name(s) according to the DoW):

Hardware requirements/AT:

Please provide the hardware requirements for running the demo:

 

Software requirements:

Please provide the software requirements for running the demo:

 

Downloadable link to the solution to be tested:

Please provide an executable file or similar to install the demo components/solution:

Target audience:

Please check and update the following information if necessary:

•          Elderly users with mixed vision, hearing, and cognitive disabilities

Detailed installation instructions for running the Demo:

Please provide step-by-step instructions to install the solution, and clarify whether any plugin or similar needs to be installed:

Demonstration Needs and Preferences set/s:

Please provide the different Needs and Preferences sets to be used during the demonstration:

Description of Step by Step proposed procedure for running the Demo (Tasks):

Please provide detailed step-by-step instructions for running the demonstration:

 

Contact for technical support (name and e-mail):

e.g. Silvia de los Rios (srios@lst.tfo.upm.es)