Developer Space/Components

Web Accessibility Testing

GUI Modules

Accessible Chart Components

Accessible UI Modules

Self-Adaptive UI Modules

Self-adaptive components from approaches such as MyUI and URC will be adapted so that they can provide runtime adaptations with modern standard frameworks, without requiring dedicated infrastructure (e.g. an adaptation engine or adaptation framework).

Other Frameworks and APIs

Personalization Modules

Social Personalization Modules

Taking recommendations and opinions of one’s social network into account when looking for interface options; these modules use high-level information, feedback, recommendations and meta-data provided by the new development tools.

contact: RtFI, HdM

Other Development/Testing

Output Rendering/Conversion Modules

Speech Synthesis

Methods for adding voice output to a web app via JavaScript.

courtesy of Jeff Bigham at CS.CMU.EDU
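
For example, the Web Speech API’s speechSynthesis interface, available in most current browsers, can provide basic voice output directly from page script. A minimal sketch in TypeScript (an illustration, not one of the listed methods):

  // Minimal sketch: speaking a string with the Web Speech API (speechSynthesis).
  // Assumes a browser that implements SpeechSynthesis.
  function speak(text: string, lang: string = "en-US"): void {
    if (!("speechSynthesis" in window)) {
      console.warn("Speech synthesis not supported in this browser.");
      return;
    }
    const utterance = new SpeechSynthesisUtterance(text);
    utterance.lang = lang;
    utterance.rate = 1.0;   // speaking rate
    utterance.pitch = 1.0;  // voice pitch
    window.speechSynthesis.speak(utterance);
  }

  // Usage: read out the page title when a button (assumed id "read-title") is clicked.
  document.querySelector("#read-title")?.addEventListener("click", () => speak(document.title));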

Braille Translators

[Upcoming] GPII Transformer Infrastructure/toolkit

An open-source version of the RoboBraille infrastructure, available for anyone wishing to set up a transformer service. A ‘behind-a-firewall’ package is included as part of the repository.

contact: Sensus

Math access

Human Input Modules

Braille Displays

Open Source Input Transducer Prototyping Module

The input transducer modules provide a standardized interface to commonly used microcontroller platforms like

This allows developers to acquire data from digital and analogue sensors such as simple switches, sip/puff sensors or other devices, simply by connecting the sensors and using the provided microcontroller firmware.
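
For illustration only, a minimal TypeScript sketch of how a web application could consume such readings, assuming the firmware streams one numeric value per line over a serial connection (using the experimental Web Serial API, currently Chromium-only):

  // Minimal sketch: reading line-based sensor values over serial (Web Serial API).
  // Assumes firmware that prints one numeric reading per line; requires a user gesture.
  async function readSensor(onValue: (value: number) => void): Promise<void> {
    const serial = (navigator as any).serial;       // not yet in standard TS lib typings
    const port = await serial.requestPort();        // user picks the connected microcontroller
    await port.open({ baudRate: 9600 });

    const reader = port.readable.getReader();
    const decoder = new TextDecoder();
    let buffer = "";
    for (;;) {
      const { value, done } = await reader.read();
      if (done) break;
      buffer += decoder.decode(value);
      const lines = buffer.split("\n");
      buffer = lines.pop() ?? "";                   // keep any partial line for the next chunk
      for (const line of lines) {
        const reading = Number(line.trim());
        if (!Number.isNaN(reading)) onValue(reading); // e.g. switch state or sip/puff pressure
      }
    }
  }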

Other HID Emulation

Camera Input Modules

Several input components based on algorithms from the Open Computer Vision library (OpenCV) will be provided via a standardized interface (see the interface sketch after this list), including

  • head tracking
  • eye tracking or reflective marker tracking
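
One possible shape for such a standardized interface, purely as an illustration (all names below are assumptions, not the actual component API):

  // Illustrative sketch of a standardized tracking interface; all names are assumptions.
  interface TrackingEvent {
    x: number;          // normalized horizontal position (0–1)
    y: number;          // normalized vertical position (0–1)
    timestamp: number;  // ms since epoch
  }

  interface CameraInputComponent {
    start(video: HTMLVideoElement): void;                 // begin processing camera frames
    stop(): void;
    onTrack(listener: (e: TrackingEvent) => void): void;  // head/eye/marker position updates
  }

  // A client could consume head tracking and eye tracking through the same interface:
  function moveCursor(component: CameraInputComponent, video: HTMLVideoElement): void {
    component.onTrack((e) => {
      const cursor = document.getElementById("cursor");
      if (cursor) {
        cursor.style.left = `${e.x * window.innerWidth}px`;
        cursor.style.top = `${e.y * window.innerHeight}px`;
      }
    });
    component.start(video);
  }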


Haptic/Touch I/O Modules

User-application interaction through a variety of haptic devices. This module will be based on previous work from the AEGIS FP7 project and other open-source tools, like the 3DHapticWebBrowser.

The Haptic/Touch I/O Modules will be based on:

- CHAI3D: an open-source set of C++ libraries for computer haptics, visualization and interactive real-time simulation. CHAI3D supports several commercially available three-, six- and seven-degree-of-freedom haptic devices, and makes it simple to support new custom force-feedback devices.

Other Haptic APIs:

- H3DAPI

- Geomagic OpenHaptics

- JTouchToolkit

- Immersion’s Haptic Development Platform

- Novint SDK

- libnifalcon


contact: CERTH

Bioelectrics Signal Acquisition and Processing Modules

Integration of bioelectric signal acquisition devices like

  • the OpenEEG biosignal amplifier
  • the OpenBCI 8-channel EEG/EMG/ECG acquisition unit

This will allow developers to integrate bioelectric interfaces into their applications, including EMG (electromyogram, muscle activity), EOG (electrooculogram, eye movements) or EEG (electroencephalogram, brainwaves).
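
As a purely illustrative sketch of what processing on top of such an interface could look like (the sample source and threshold are assumptions), detecting a coarse blink-like event in an EOG stream:

  // Illustrative sketch only: detecting a coarse blink-like event in an EOG sample stream.
  // The acquisition source is abstracted as an async iterable of microvolt samples.
  async function detectBlinks(
    samples: AsyncIterable<number>,   // e.g. provided by an OpenBCI/OpenEEG adapter (assumed)
    onBlink: () => void,
    threshold = 150                   // µV deviation from baseline; device-dependent
  ): Promise<void> {
    const recent: number[] = [];
    for await (const sample of samples) {
      recent.push(sample);
      if (recent.length > 50) recent.shift();              // sliding baseline window
      const baseline = recent.reduce((a, b) => a + b, 0) / recent.length;
      if (Math.abs(sample - baseline) > threshold) {
        onBlink();                                         // large transient => candidate blink
        recent.length = 0;                                 // reset to avoid double triggering
      }
    }
  }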

Smart Home Integration Modules

A unified API for interaction with standardized smart home and building automation systems, including

This will allow developers to integrate environmental control features into applications.
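
A sketch of how such a unified API might look from the application side (all names are assumptions for illustration):

  // Illustrative sketch of a unified environmental-control API; names are assumptions.
  interface EnvironmentalControl {
    listDevices(): Promise<string[]>;                         // e.g. ["livingroom.light", "hall.blind"]
    getState(deviceId: string): Promise<number>;              // normalized 0–1 value
    setState(deviceId: string, value: number): Promise<void>; // 0 = off/closed, 1 = on/open
  }

  // An application can offer environmental control without knowing the underlying
  // automation protocol; an adapter hides it behind the interface.
  async function dimAllLights(ctrl: EnvironmentalControl, level: number): Promise<void> {
    const devices = await ctrl.listDevices();
    for (const id of devices.filter((d) => d.endsWith(".light"))) {
      await ctrl.setState(id, level);
    }
  }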


[Upcoming] Real-Time User Monitoring Sensor Layer

Code reuse for user monitoring (e.g. sensor plug-ins) in different assistive applications, especially for internal sensors on mobile phones and wearable sensors.

contact: KIT

Smart Input Modules

Modules that implement higher-level functionality on top of raw input, transforming it into more abstract input.
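
For example (an illustration only, not an actual module API), raw switch events could be transformed into abstract “short press” and “long press” input:

  // Illustrative sketch: transforming raw switch down/up events into abstract press events.
  type PressKind = "short" | "long";

  function abstractSwitch(
    element: HTMLElement,
    onPress: (kind: PressKind) => void,
    longPressMs = 600                  // threshold separating short from long presses
  ): void {
    let downAt = 0;
    element.addEventListener("pointerdown", () => {
      downAt = performance.now();
    });
    element.addEventListener("pointerup", () => {
      const held = performance.now() - downAt;
      onPress(held >= longPressMs ? "long" : "short");
    });
  }

  // Usage: map short presses to "next item" and long presses to "select" in a scanning UI.
  abstractSwitch(document.body, (kind) => {
    console.log(kind === "short" ? "advance scan" : "select item");
  });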

AsTeRICS AT Modules

The AsTeRICS project provides a flexible construction set for creating assistive technologies that are highly adaptable to the user. The following components might be extracted on demand so that they can be used by developers. Additionally, we plan to build a bridge to GPII via WebSockets so that dedicated AsTeRICS models utilizing various components can interact directly with the GPII infrastructure.
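
A minimal sketch of what a WebSocket client on the web-application side could look like; since the bridge is still planned, the endpoint URL and message format below are assumptions:

  // Sketch of a WebSocket client for the planned AsTeRICS/GPII bridge.
  // Endpoint URL and JSON message shape are assumptions, not a defined protocol.
  function connectToAsterics(onInput: (signal: string, value: number) => void): WebSocket {
    const ws = new WebSocket("ws://localhost:8082/asterics");  // assumed local bridge endpoint
    ws.onmessage = (event) => {
      const msg = JSON.parse(event.data);                      // e.g. {"signal": "mouseX", "value": 0.42}
      onInput(msg.signal, msg.value);
    };
    ws.onerror = () => console.warn("AsTeRICS bridge not reachable");
    return ws;
  }

  // Usage: forward incoming signal values to the application.
  connectToAsterics((signal, value) => console.log(signal, value));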


Template URC Socket Modules

“Super-sockets” built by inheritance for various targets (products and services), to be used as a basis for the development of individualized, pluggable user interfaces for smart homes.


Smart Discovery Modules

Easy discovery of smart devices through different modalities such as barcode/NFC or spatial references, with implicit and explicit interactions with the environment, built on HTML5 and smartphone APIs.
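
One of these modalities can be illustrated with the experimental Web NFC API (currently Chromium on Android only): reading a tag that identifies a nearby device.

  // Illustrative sketch: discovering a device by reading an NFC tag (Web NFC API).
  // NDEFReader is experimental and currently limited to Chromium on Android.
  async function discoverByNfc(onDevice: (id: string) => void): Promise<void> {
    const reader = new (window as any).NDEFReader();  // not yet in standard TS lib typings
    await reader.scan();                              // requires a user gesture and permission
    reader.onreading = (event: any) => {
      for (const record of event.message.records) {
        if (record.recordType === "text") {
          const text = new TextDecoder().decode(record.data);
          onDevice(text);                             // e.g. an ID or URL identifying the device
        }
      }
    };
  }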


Smart Authentication Module

Context-based/implicit access control and authentication mechanisms based on physical presence that work in any browser.

contact: KIT

Affect Sensing Modules

Algorithms from the INTERSTRESS project for detecting emotion, stress, boredom and frustration, using information from virtual activity sensors (e.g. accelerometer- or camera-based) and physiological sensors (e.g. ECG, GSR).

contact: CERTH-ITI Kostas Votis

Browser-based Context Sensing

Browser-based Learning and Generation for Virtual Sensors

[Upcoming] Crowd-sourced Calibration/Emulation Module

Replacement of AT input technologies with cheaper or more widely available sensors, by allowing users to provide correlation and calibration data.

contact: KIT
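
As a toy illustration of the calibration idea (not part of the module), fitting a linear mapping from a cheap sensor’s raw readings to the original AT device’s values, using user-provided sample pairs:

  // Toy illustration: least-squares fit of raw sensor readings to reference AT values.
  // Users supply (raw, reference) pairs; the fitted line then stands in for the original device.
  function fitCalibration(pairs: Array<[number, number]>): (raw: number) => number {
    const n = pairs.length;
    const sumX = pairs.reduce((s, [x]) => s + x, 0);
    const sumY = pairs.reduce((s, [, y]) => s + y, 0);
    const sumXY = pairs.reduce((s, [x, y]) => s + x * y, 0);
    const sumXX = pairs.reduce((s, [x]) => s + x * x, 0);
    const slope = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX);
    const intercept = (sumY - slope * sumX) / n;
    return (raw) => slope * raw + intercept;
  }

  // Usage: crowd-sourced sample pairs from several users, then calibrate live readings.
  const calibrate = fitCalibration([[102, 0.1], [310, 0.5], [605, 1.0]]);
  console.log(calibrate(450));  // estimated value on the original device's scale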

Other/External Components listings

Get Involved/Contribute

This is a place for bootstrapping the GPII DSpace Component Listing. Get a wiki account and add new entries, or join the mailing list for news on reusable components: http://lists.gpii.net/cgi-bin/mailman/listinfo/dspace

TEMPLATE for new component listings

Use either links or a sub-page based on the Developer Space Components Template Description.

See Also


