Developer Space/Components

From wiki.gpii
==  Web Accessibility Testing ==
* [http://www.paciellogroup.com/resources/wat "Web Accessibility Toolbar"]
* [http://wave.webaim.org/ "Web Accessibility Evaluation Tool"]
* [https://addons.mozilla.org/de/firefox/search/?q=accessibility "Firefox Accessibility Addons"]
* [http://trace.wisc.edu/peat/ "Photosensitive Epilepsy Analysis Tool"]
* [http://www.paciellogroup.com/resources/contrastAnalyser "Colour Contrast Analyser (Win/Mac)"] For testing colour contrast of text against WCAG AA or AAA success criteria.
* [http://www.accessible-project.eu/ "Web & Mobile Web Accessibility Evaluation Tools"]
* [https://github.com/Access4all/ContentAccessibilityChecker Access4All Content Accessibility Checker]
* [http://www.accessible-project.eu/index.php/dias.html Disability Impairment Approximation Simulator (DIAS)]
* [https://github.com/Heydon/REVENGE.CSS REVENGE.CSS] [http://heydonworks.com/revenge_css_bookmarklet/ bookmarklet]
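The contrast requirement that tools like the Colour Contrast Analyser check can also be computed directly. Below is a minimal sketch of the WCAG 2.0 contrast-ratio calculation (the sRGB linearization and luminance coefficients come from the WCAG 2.0 definition of relative luminance; the function names are our own):

```javascript
// Relative luminance of an sRGB color per the WCAG 2.0 definition.
function relativeLuminance([r, g, b]) {
  const lin = (c) => {
    c /= 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// WCAG contrast ratio between two colors, ranging from 1:1 to 21:1.
function contrastRatio(fg, bg) {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// WCAG AA requires at least 4.5:1 for normal text, AAA at least 7:1.
function meetsAA(fg, bg) { return contrastRatio(fg, bg) >= 4.5; }
```

Black text on a white background yields the maximum ratio of 21:1; two similar grays fail AA.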
== GUI Modules ==
=== Accessible Chart Components ===
* [http://paypal.github.io/amcharts-accessibility-plugin/ AmCharts accessibility plugin]
* [http://describler.com/ Describler: Make SVG Accessible]
=== Accessible UI Modules ===
* [https://github.com/paypal/bootstrap-accessibility-plugin Bootstrap accessibility plugin].
** Victor Tsaran: worth looking at: [https://www.paypal-engineering.com/2014/01/28/bootstrap-accessibility-plugin-making-the-popular-web-development-framework-better/ Bootstrap Accessibility Plugin: making the popular web development framework better]. PayPal Engineering Blog, 28 January 2014.
* AEGIS: [http://access.aol.com/aegis/ Accessible jQuery-ui Components Demonstration].
* AEGIS: [http://www.accessiblemootoolsdemo.iao.fraunhofer.de/Mootools_Widgets/index.html Accessible Mootools Widget Demo] (by [http://www.iao.fraunhofer.de Fraunhofer IAO], but not the same group as in [[Cloud4All]]).
* [http://wiki.fluidproject.org/display/docs/Component+Library Infusion Component Library].
* Dojo: [http://dojotoolkit.org/reference-guide/1.9/dijit/ Dijit Overview].
* Fluid Projects: http://fluidproject.org/projects/
* ExtJS: 4.2 [http://docs.sencha.com/extjs/4.2.2/#!/guide/accessibility] [http://vimeo.com/17840717], 5.0 [http://docs.sencha.com/extjs/5.0.0/core_concepts/accessibility.html]
* OpenUI5 Controls with Aria Support [https://openui5.hana.ondemand.com/#search.html?q=%22ARIA%20Support%22]
* Assets Framework [http://assets.cms.gov/]
* Foundation [http://foundation.zurb.com/]
=== [[Developer Space Components/self-adaptive-ui-modules|Self-Adaptive UI Modules]] ===
Self-adaptive components from approaches like MyUI and URC will be adapted so that they can provide runtime adaptations within modern standard frameworks, without requiring a specific infrastructure (e.g. an adaptation engine or adaptation framework).
=== Other Frameworks and APIs ===
* IAccessible2 [http://www.linuxfoundation.org/collaborate/workgroups/accessibility/iaccessible2] [http://www.linuxfoundation.org/collaborate/workgroups/accessibility/iaccessible2/softwaredirectory]
== Personalization Modules ==
=== Social Personalization Modules ===
These modules take recommendations and opinions from one's social network into account when looking for interface options. They use high-level information, feedback, recommendations and metadata provided by the new development tools.
contact: [[RtFI]], [[HdM]]
=== Other Development/Testing  ===
* http://projects.eclipse.org/projects/technology.actf
== Output Rendering/Conversion Modules ==
=== Speech Synthesis ===
* JavaScript speech synthesis:
**Google: Apparently, Chrome plans to implement a speech synthesis (and also speech recognition!) API for the web that can run '''without''' any Chrome plugin support:
***[https://code.google.com/p/chromium/issues/detail?id=171887 Chromium: Issue 171887: Implement the TTS / Synthesis portion of the Web Speech API].
***[http://www.chromestatus.com/features Chromium Dashboard: Web Platform Features], esp. [http://www.chromestatus.com/features/4782875580825600 Web Speech API (synthesis)].
***Glen Shires and Hans Wennborg: [https://dvcs.w3.org/hg/speech-api/raw-file/9a0075d25326/speechapi.html Web Speech API Specification], 19 October 2012.
* [http://espeak.sourceforge.net/ eSpeak]
==== Methods for adding voice output to a web app via JavaScript ====
*If you're willing to install an extension, then you can use this: http://accessibility.cs.cmu.edu/webanywhere/tutorials/local-tts-chrome.html
*If you want it to work without installation, then you'll have to use a remote speech service with the HTML5 <audio> API:  http://accessibility.cs.cmu.edu/webanywhere/tutorials/speech-servers.html
*You can also use an entirely JavaScript-based solution (speak.js), but its sound quality is limited:  https://github.com/kripken/speak.js/
courtesy of Jeff Bigham at CS.CMU.EDU
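The Web Speech API synthesis interface mentioned above is exposed in the browser as the global `speechSynthesis` object. A minimal sketch of using it (the chunking helper is our own workaround, under the assumption that some engines truncate very long utterances; it is not part of the API):

```javascript
// Split long text into sentence-sized chunks so each utterance stays
// short (a workaround for engines that cut off long utterances).
function chunkText(text, maxLen = 200) {
  const sentences = text.match(/[^.!?]+[.!?]*/g) || [text];
  const chunks = [];
  let current = '';
  for (const s of sentences) {
    if ((current + s).length > maxLen && current) {
      chunks.push(current.trim());
      current = '';
    }
    current += s;
  }
  if (current.trim()) chunks.push(current.trim());
  return chunks;
}

// Browser-only: speak the chunks via the Web Speech API.
function speak(text) {
  if (typeof speechSynthesis === 'undefined') return; // no TTS in this runtime
  for (const chunk of chunkText(text)) {
    const utterance = new SpeechSynthesisUtterance(chunk);
    utterance.rate = 1.0;   // 0.1 to 10, engine-dependent
    utterance.pitch = 1.0;  // 0 to 2
    speechSynthesis.speak(utterance);
  }
}
```

Calling `speak("Hello, world.")` in a supporting browser queues the utterance on the default voice; outside the browser the function is a no-op.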
=== Braille Translators ===
* [http://www.liblouis.org/ Liblouis braille translator/back-translator and the Liblouisutdml braille formatter]
* [https://code.google.com/p/daisy-pipeline/ Daisy pipeline]
* (Braille translators coming from Sensus, as a breakout component of their system, for use in other systems.)
==== [Upcoming] GPII Transformer Infrastructure/Toolkit ====
An open-source version of the RoboBraille infrastructure, available for anyone wishing to set up a transformer service. A 'behind-a-firewall' package will be part of the repository.
contact: [[Sensus]]
=== Math access ===
*[http://oerpub.org/ OERPUB].
* Liz Gannes: [http://allthingsd.com/20130604/t-v-ramans-audio-deja-vu-from-google-a-math-reading-system-for-the-web/ T.V. Raman’s Audio Deja Vu: From Google, a Math-Reading System for the Web], 4 June 2013.
* InftyReader:
** [http://www.youtube.com/watch?v=eXPdugnzkow InftyReader Automatically Converting a BMP Math Image to MathML].
** [http://www.youtube.com/watch?v=xvvHi023U7Y InftyReader Automatically Converting a PNG Math Image to LaTeX].
** [http://www.youtube.com/watch?v=HuTbXDA9YiU InftyReader Automatically Converting a PNG Math Image to MathML].
** [http://www.youtube.com/watch?v=PHDZEjwWjx0 InftyReader Automatically Converting a TIF Math Image to LaTeX].
** [http://www.youtube.com/watch?v=WtW3bSfPaX4 InftyReader Automatically Converting a TIF Math Image to MathML].
* HTML5 and MathML: [https://eyeasme.com/Joe/MathML/MathML_browser_test MathML Browser Test (Presentation Markup)] (Josephus Javawaski, 2012).
* Important docs:
** [https://vismor.com/download/Documents/Site_Implementation/viewing_mathematics.pdf Viewing Mathematics on the Internet (PDF)]
** [https://vismor.com/documents/site_implementation/viewing_mathematics/S7.SS7.php Font Rendering by Browser]
* Math font download sites:
** [http://tavmjong.free.fr/FONTS/ Arev Fonts].
** [http://www.microsoft.com/typography/fonts/family.aspx?FID=360 Cambria Math].
** [http://arkandis.tuxfamily.org/tugfonts.htm Irianis ADF Math].
** [http://www.microsoft.com/typography/fonts/font.aspx?FMID=160 Lucida Sans Unicode].
** [http://www.dessci.com/en/products/mathtype/fonts.htm MathType's Fonts].
** [http://miktex.org/ MiKTEX].
** [http://support.apple.com/kb/PH3871 OS X Lion: Enter special characters and symbols] (archived page).
** [http://www.aip.org/stixfonts/ STIX Fonts].
== Human Input Modules ==
=== Braille Displays ===
* [http://mielke.cc/brltty/documentation.html BRLTTY] and [http://mielke.cc/brltty/doc/Manual-BrlAPI/English/BrlAPI.html BrlAPI]
=== [[Open Source Input Transducer Prototyping Module]] ===
The input transducer modules provide a standardized interface to commonly used microcontroller platforms such as
* the [http://www.arduino.cc/ Arduino]
* the [http://www.pjrc.com/teensy/ Teensy USB Development Board]
* other platforms, which can be integrated on demand.
This allows developers to acquire data from digital and analogue sensors, such as simple switches, sip/puff sensors or other devices, by simply connecting the sensors and using the provided microcontroller firmware.
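On the host side, such microcontroller firmware typically streams readings as text lines over a serial port. The sketch below parses a hypothetical `<pin>:<value>` line format and turns an analogue sip/puff reading into a discrete switch event; the line format, pin names and threshold are illustrative assumptions, not the module's actual protocol:

```javascript
// Parse one serial line like "A0:512" or "D3:1" into a reading object.
// Returns null for malformed lines so noisy serial data is skipped safely.
function parseSensorLine(line) {
  const m = /^([AD]\d+):(\d+)\s*$/.exec(line.trim());
  if (!m) return null;
  return { pin: m[1], value: Number(m[2]) };
}

// Turn an analogue sip/puff reading into a discrete switch event.
function toSwitchEvent(reading, threshold = 600) {
  if (!reading || !reading.pin.startsWith('A')) return null;
  return reading.value >= threshold ? 'activated' : 'released';
}
```

A host application would feed each received serial line through `parseSensorLine` and react to the resulting events, keeping the firmware itself minimal.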
=== Other HID Emulation ===
* [http://www.makeymakey.com/ MakeyMakey]: Make anything a keyboard
* [http://inputstick.com InputStick]: HID device emulation from Android via Bluetooth (proprietary hardware + Android API)
* V-USB based HID firmware for AVR: http://www.obdev.at/products/vusb/prjhid.html (GPL)
* [http://www.autohotkey.com/board/topic/44603-usb-hid-emulator/ HID-Emulation using Autohotkey] Scripting on Windows
* [http://deskthority.net/ Deskthority] Community dedicated to keyboards and mice
** [http://deskthority.net/wiki/HID_Liberation_Device_-_Instructions HID Liberation Device]
** [http://deskthority.net/wiki/AVR-Keyboard AVR Keyboard]
* [https://github.com/tmk/tmk_keyboard TMK keyboard firmware]
** [https://github.com/tmk/tmk_keyboard/blob/master/doc/other_projects.md Links to other keyboard projects]
=== [[Camera Input Modules]] ===
Several input components based on algorithms of the Open Computer Vision library [http://opencv.org/ OpenCV] will be provided via a standardized interface, including
* head tracking
* eye tracking or reflective marker tracking
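Raw head- or eye-tracking coordinates from such components are typically noisy, so consumers usually smooth them before mapping them to a pointer position. The following exponential smoothing filter is our own illustration of that post-processing step, not part of the modules' interface:

```javascript
// Exponential smoothing for 2D tracking coordinates.
// alpha in (0, 1]: higher = more responsive, lower = smoother.
function makeSmoother(alpha = 0.3) {
  let prev = null;
  return function smooth(point) {
    if (prev === null) {
      prev = { x: point.x, y: point.y }; // first sample passes through
    } else {
      prev = {
        x: alpha * point.x + (1 - alpha) * prev.x,
        y: alpha * point.y + (1 - alpha) * prev.y,
      };
    }
    return prev;
  };
}
```

Each tracked frame is pushed through `smooth()`, trading a little latency for a much steadier pointer.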
=== Haptic/Touch I/O Modules ===
These modules enable user-application interaction through a variety of haptic devices. They will be based on previous work from the AEGIS FP7 project and other open-source tools, such as the 3DHapticWebBrowser.
The Haptic/Touch I/O Modules will be based on:
* [http://chai3d.org CHAI3D]: an open-source set of C++ libraries for computer haptics, visualization and interactive real-time simulation. CHAI3D supports several commercially available three-, six- and seven-degree-of-freedom haptic devices, and makes it simple to support new custom force-feedback devices.
Other haptic APIs:
* [http://www.h3dapi.org H3DAPI]
* [http://geomagic.com/en/products/open-haptics/overview Geomagic OpenHaptics]
* [https://java.net/projects/jtouchtoolkit JTouchToolkit]
* [http://www.immersion.com/products/haptic-sdk/ Immersion's Haptic Development Platform]
* [http://www.novint.com/index.php/downloads Novint SDK]
* [http://qdot.github.io/libnifalcon/ libnifalcon]
contact: [[CERTH]]
=== [[Bioelectrics Signal Acquisition and Processing Modules]] ===
Integration of bioelectric signal acquisition devices like
* the [http://openeeg.sourceforge.net/doc/ OpenEEG] biosignal amplifier
* the [http://www.openbci.com/ OpenBCI] 8-channel EEG/EMG/ECG acquisition unit
This will allow developers to integrate bioelectric interfaces into their applications, including EMG (electromyogram, muscle activity), EOG (electro-oculogram, eye movements) and EEG (electroencephalogram, brainwaves).
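A common first processing step for such signals, e.g. turning raw EMG into a usable control value, is rectification followed by a moving average ("envelope detection"). The sketch below is our own illustration of that technique; the window size and sample values are illustrative, not part of the modules' API:

```javascript
// Rectify an EMG sample stream and smooth it with a moving average,
// yielding an "envelope" that tracks muscle activation strength.
function emgEnvelope(samples, windowSize = 4) {
  const rectified = samples.map(Math.abs);
  const out = [];
  let sum = 0;
  for (let i = 0; i < rectified.length; i++) {
    sum += rectified[i];
    if (i >= windowSize) sum -= rectified[i - windowSize]; // slide the window
    out.push(sum / Math.min(i + 1, windowSize));
  }
  return out;
}
```

An application would threshold the envelope, rather than the raw oscillating signal, to detect a muscle contraction.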
=== [[Smart Home Integration Modules]] ===
A unified API for interaction with standardized smart home and building automation systems, including
* [http://www.knx.org KNX/EIB] (via the [http://sourceforge.net/p/calimero/wiki/Home/ Calimero library])
* [http://www.enocean.com/en/home/ EnOcean]
This will allow developers to integrate environmental control features into applications.
=== [Upcoming] Real-Time User Monitoring Sensor Layer ===
Code reuse for user monitoring (e.g. sensor plug-ins) across different assistive applications, especially using internal sensors on mobile phones and wearable sensors.
contact: [[KIT]]
== Smart Input Modules ==
Modules that implement higher-level functionality on top of raw input, transforming it into more abstract input events.
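As an example of this raw-to-abstract transformation, a dwell-click detector turns a stream of timestamped pointer positions into click events when the pointer rests within a small radius for long enough. This is our own illustration of the pattern; the radius and dwell-time thresholds are illustrative:

```javascript
// Emit a "click" when the pointer stays within `radius` px of its
// anchor point for at least `dwellMs` milliseconds.
function makeDwellDetector(radius = 10, dwellMs = 800) {
  let anchor = null; // { x, y, t } where the current dwell started
  return function update(x, y, t) {
    if (anchor === null || Math.hypot(x - anchor.x, y - anchor.y) > radius) {
      anchor = { x, y, t }; // moved too far: restart the dwell timer
      return null;
    }
    if (t - anchor.t >= dwellMs) {
      const click = { type: 'click', x: anchor.x, y: anchor.y };
      anchor = { x, y, t }; // reset so we don't fire repeatedly
      return click;
    }
    return null;
  };
}
```

Feeding head-tracking or eye-tracking coordinates into `update()` lets switchless users click by holding the pointer still.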
=== [[AsTeRICS AT Modules]] ===
The [http://www.asterics.eu AsTeRICS] project provides a flexible
construction set for creating assistive technologies that are highly adaptable to the user. The following components
might be extracted on demand so that they can be used by developers. Additionally we plan to build a bridge to GPII via
Websockets so that dedicated AsTeRICS models utilizing various components can interact directly with the GPII infrastructure.
=== [[URC_Super_Sockets | Template URC Socket Modules ]] ===
'Super-sockets' derived by inheritance for various targets (products and services), to be used as a basis for the development of individualized, pluggable user interfaces for smart environments.
=== [[Point_and_Control | Smart Discovery Modules]] ===
Easy discovery of smart devices through different modalities, such as barcodes, NFC or spatial references, using implicit and explicit interactions with the environment via HTML5 and smartphone APIs.
=== Smart Authentication Module ===
Context-based/implicit access control and authentication mechanisms, based on physical presence, that work in any browser.
contact: [[KIT]]
=== Affect Sensing Modules ===
Algorithms from the INTERSTRESS project for detecting emotion, stress, boredom and frustration, using information from virtual activity sensors (e.g. accelerometer- or camera-based) and physiological sensors (e.g. ECG).
contact: [[CERTH-ITI]]  Kostas Votis
===  Browser-based Context Sensing ===
[[Browser-based Learning and Generation for Virtual Sensors]]
* [http://ambientdynamix.org/ Ambient Dynamix]: [http://ambientdynamix.org/resources/ Incomplete source?]
=== [Upcoming] Crowd-sourced  Calibration/Emulation  Module ===
Replacement of AT input technologies with cheaper or more readily available sensors, by allowing users to provide correlation and calibration data.
contact: [[KIT]]
== Other/External Components listings ==
* RTFI: [http://www.raisingthefloor.org/content/spotlight#Helping_developers_to_build_accessibility_into_their_products Spotlight / featured tools and components].
* [http://www.aegis-project.eu/index.php?option=com_content&view=article&id=175&Itemid=72 AEGIS: Overview of demonstrators].
== Get Involved/Contribute==
This is a place for bootstrapping the GPII Developer Space component listing. Get a wiki account and add new entries, or join the mailing list for news on reusable components.
=== TEMPLATE for new component listings ===
Use either links or a sub page using the [[Developer Space Components Template Description]].
== See Also ==
* [[Building Blocks]].
* [[Developer Space Resources]].
== Wiki Categories ==
[[Category: Developer Space]]

Revision as of 15:27, 30 October 2014