Tuesday, August 26, 2008

Lon Thornburg's No Limits 2 Learning Blog

Lon Thornburg is an assistive technology specialist in Oregon who is the author of the No Limits 2 Learning Blog, "celebrating human potential through assistive technology". In addition to assistive technology, Lon's blog focuses on topics related to children, education, and disabilities.

Lon recently attended a workshop on alternative text access and podcasting, presented by Steven Timmer of Premier Literacy, and came away with some good insights, which you can read in his blog post. Steven Timmer, who is legally blind, discussed the difference between the concept of assistive technology for learning and assistive technology for living. I liked these quotes from Steven Timmer:

"AT must take what you find difficult to do and make it easier for you- if it doesn't, it isn't AT."

"if they can't use it within 15 minutes of training," Steven shared, "and 3 steps or less, they probably won't use it."

In another post, Lon shared another quote from Steven Timmer:

"If kids don't find a practical reason to use AT for something they want to do, they won't incorporate it into their world and use it."

Lon asks us to think more deeply about the purpose of assistive technology:

"Instead of spending so much time with a software that prepares them to learn information to perform better on state assessments and try to get their scores up, maybe we need to focus more on helping them to independently take text and summarize it, and convert it to an Mp3 file they can listen to it. Oh, and by the way…let them select an article, story or other print media that is relevant to them."

"I have some developmentally delayed and cognitive disabled students that are older - in high school that can use some of both kinds of AT to a certain point, but by high school, we should have given them the ability to use living technology for themselves so they can use it independently to access life. And you know what? If I teach them how to access life with living tools, I bet it will impact their ability to prep for those tests…hmmm just a thought."

Lon's blog is well worth subscribing to!

Here is another good tidbit of information he posted on his blog: his notes from an interview with Kristin Whitfield, CCC-SLP, from DynaVox, citing research from Penn State University, Project R, that suggests that early learners (ages 2-5) can benefit from higher-end communication tools, if properly designed for their age and developmental needs. The researchers point out that most current AAC devices were developed by non-disabled adults and do not model the way young children cognitively and linguistically interact with the world.

By the way, Penn State University offers a webcast, along with transcripts, slides, and handouts, from the AAC RERC Webcast Series, Reading and AAC, presented by Janice Light, Ph.D., a professor of Communication Sciences and Disorders.

  • "This session will discuss effective evidence-based practices to maximize the literacy skills of individuals who require AAC. Case studies (including video) will be used to illustrate effective interventions to help students who require AAC: (a) acquire phonological awareness skills, (b) learn to read words, (c) participate in shared reading activities with personalized books, and (d) write their own stories. With appropriate instruction, individuals who require AAC can achieve improved literacy skills and will be able to maximize their educational and vocational outcomes."

Thanks, Lon Thornburg, for sharing so much on your blog!

Saturday, August 23, 2008

Digital Students@Analog Schools, 2004. Do the sentiments still ring true?


The above link is to a post on the Interactive Multimedia Technology blog about technology integration, the need for multimedia learning activities, and resources for supporting technology and Universal Design for Learning.

Saturday, August 16, 2008

Microsoft Surface's Hotel Concierge Application: Let's see an affordable Surface, deployed in classrooms and libraries!

Take a look at the video of Microsoft's multi-touch "Surface", functioning as an interactive, electronic hotel concierge. Imagine how multi-touch, multi-user technology could be used in education!


Cross posted from the Interactive Multimedia Technology blog

Saturday, August 09, 2008

Adobe's Digital Kids Club: Great source for learning and communication activities that incorporate digital video and photography.

The Adobe Digital Kids Club website provides a wide range of ideas for integrating digital video and photography into engaging lesson plans. Teachers are encouraged to share lesson plans on the site. A template is provided.

Info from the website:

"The Adobe Digital Kids Club makes it easy for educators to introduce digital photography and video into the classroom. Students take photos with their digital cameras and then use Adobe® Photoshop® Elements software for Windows® and Mac to edit, enhance, organize, and share their images. Using Adobe Premiere® Elements, they produce amazing videos for classroom projects and presentations."

"Submit your own lesson or activity
Teachers: Share your digital media lessons and activities. E-mail your lesson or activity to the Adobe Digital Kids Club, and we’ll contact you if your lesson is selected to appear on the site. Review some of the lessons and activities below, and then view and print out a template to get started. View or download a template (PDF: 235k)."

If you are involved with the Digital Kids Club, leave a comment and share your links!

Wednesday, August 06, 2008

Video Modeling Software for visual learners, including those with autism spectrum disorders.

Here is a clip about the Activity Trainer, video modeling software that supports the following skills:

  • Academic
  • Communication
  • Daily Living
  • Non-Verbal Imitation
  • Recreation
  • Social
  • Vocational

A free 30-day trial of the software can be downloaded from the Accelerations Educational Software website.

On the Accelerations Educational Software website, you can find other products, such as Storymovies, which is the product of a collaboration between Carol Gray (social stories), Mark Shelley, and the Special Minds Foundation.

Sunday, August 03, 2008

NextGen Teachers - NextGen School Psychologists?

I came across the NextGen Teachers blog, "Educators connecting to explore the next generation of teaching and learning", and noticed that the members of this blog post specific "how-to" information about what works in their classrooms.

If you are a teacher, school psychologist, related services provider, special educator, or anyone interested in the use of technology to support efforts such as Response to Intervention (RTI) and 21st Century Schools, take a look at what NextGen Teachers are writing about. It is a small group, created by Doug Belshaw, a high school history and ICT teacher in England.

I wonder if there are any NextGen Psychologists out there involved with innovative technologies. I know of a few, but I'm sure there must be more....

Thursday, July 31, 2008

Role of Psychology in Assistive Technology and Device Discontinuance: Article from the American Psychological Association

Susanne Croasdaile, from the VCU Assistive Technology Blog, posted a link to an APA article about the role of psychology in AT. According to the article, the input of psychology can lead to a better "goodness of fit" between the person and the technology, and decrease the phenomenon of device discontinuance.

It should be noted that the traditional terminology for device discontinuance is device abandonment. Unfortunately, this assigns blame for the discontinuance to the user of the technology or device, rather than to a mismatch between the device/technology and the user.

(For more information about the factors related to device discontinuance, see "ATOMS Project Technical Report - Factors in Assistive Technology Device Abandonment: Replacing “Abandonment” with “Discontinuance” by April Lauer, MS, OTR, Kathy Longenecker Rust, MS, OT, & Roger O. Smith, PhD, OT.)

From the APA article:

"Development of successful AT products requires a careful analysis of the goals, functional capacities, and physical and social environments of the intended users. Marcia Scherer, of the University of Rochester Medical School, notes that psychologists play a leading role in carrying out this work, by conducting studies of users’ judgments of whether and how particular technologies benefit them; how technologies fit within the users’ full range of activities and contribute to their sense of control; the perceptions and attitudes of users and others toward particular technologies; and the ways in which technologies actually increase users’ abilities to perform particular activities independently in daily life."

"Careful research on these questions as part of the technology development process can help ensure that the products are actually acquired and used. As Scherer points out, “We know that technology-person mismatches can have a series of repercussions including wasted resources, and people not performing at their functional best. On the service delivery level, device abandonment represents ineffective use of an assistive technology, all of which can be addressed through psychological science.”"

Thursday, July 24, 2008

Classroom 2.0 met the milestone of 10,000 members today!




I've mentioned previously that Classroom 2.0 is a social networking site for those interested in Web 2.0 and collaborative technologies in education. Classroom 2.0 offers members a personal page, space for uploading pictures and videos, and a Resources Wiki maintained by the membership.

Steve Hargadon is the founder of Classroom 2.0. He is the director of the K12 Open Technologies Initiative at the Consortium for School Networking (CoSN), and he maintains several blogs of his own.


Thank you, Steve!

When I joined Classroom 2.0 earlier this year, there were over 3,000 members. I started out by creating a thread on the Classroom 2.0 forum, Let's share links to our blogs! The thread is still active as new members join the group.

An outgrowth of this thread was the International Edublogger's Directory, which now represents 361 bloggers from about 45 countries, thanks to the efforts of Patricia Donaghy, who blogs at Using ICT in Further Education and Free Resources for Education.

Update from CoSN:
[07.15.08]"CoSN Receives MacArthur Grant: Exploring Policy & Leadership Barriers to Effective Use of Web 2.0 in Schools"
"CoSN has received a grant from the John D. and Catherine T. MacArthur Foundation, as part of the foundation’s digital media and learning initiative, which focuses on how digital media are changing the face of education, learning and students’ daily lives. The effort, titled Schools and Participatory Culture: Overcoming Organizational and Policy Barriers, will identify the organizational and policy barriers that impede the adoption of new media in schools, and develop an action plan with recommendations on how to overcome them."

CoSN's Data-Driven Decision Making website
CoSN's Cyber Security for the Digital District website

Tuesday, July 22, 2008

YouTube? TeacherTube? SchoolTube? An article in Converge asks if education is ready for YouTube.

A recent article in Converge, "Is education ready for YouTube?" by Sara Cardine, discusses the use of YouTube in education. Some teachers and professors regularly post summaries of lectures on YouTube. Others use YouTube to find interesting video clips that relate to what they are teaching. Similar sites include TeacherTube and SchoolTube. These are great resources for busy teachers looking for materials for visual learners.

The article goes on to quote Shelley Pasnik, of the Center for Children and Technology:

  • "The use of media in the classroom can help students engage in an exploration-based approach to learning, where questions are encouraged over rote memorization of theories and rules. Still, the proper infrastructure has to be in place to ensure best practice."

The article references the following YouTube channels established by organizations:

PBS
National Wildlife Federation
National Public Radio

Wednesday, July 09, 2008

Technology and autism spectrum disorders: Innovative techniques for early assessment and diagnosis

I recently came across two articles regarding the use of technology and video for early diagnosis of autism. The first is a Wall Street Journal article, "New Ways to Diagnose Autism Earlier: Detection at Younger Ages Leads to Greater Gains in Language and IQ; Predicting Risk with Eye-Movement Sensors".

The following research video, from Yale University, depicts an infant involved in an eye-tracking session:


The researchers at Yale's Autism Program work in an interdisciplinary environment, and focus on infants as young as three months, preschoolers, school-age children, and young adults.

According to the WSJ article, researchers from the Early Autism Study at McMaster University, in Canada, are engaged in similar research. They have developed a system that uses eye-movement sensors with babies as young as nine months of age.

Related research, involving the analysis of home videotapes, is currently underway through the MIT Media Lab's Human Speechome project.

A recent Orlando Sentinel article, "Is your baby autistic? UF researchers' book helps provide answers", reviews the work of Osnat and Philip Teitelbaum, of the University of Florida. The researchers analyzed movement patterns of a number of infants and young children from home videotapes. The children were later diagnosed with autism. According to this research, the movements of babies who developed autism were different than the movements of non-autistic children.

The use of video in autism research is not uncommon. For more information, see my previous post, TECHNOLOGY FOR DATA COLLECTION, ANALYSIS, AND PROGRESS MONITORING. Scroll down to the section about the work of Gregory Abowd and his colleagues at Georgia Tech.

Cognitive Bursts, Autism, & "Sense of Self": Digital media for intervention across all stages of development.

Reflections:
Over the past several years, I have worked closely with young people who have severe autism. At the same time, I have taken a variety of computer science, software information systems, and educational technology courses. Over time, I've integrated the use of technology, including digital photography and videography, into my work. In some ways, it is still largely uncharted territory.

Part of the reason that the use of technology for prevention and intervention has not been at the forefront of special education is that our current practices were informed by research that took place years before the Internet was a household word, before interactive whiteboards, and before concepts such as "Universal Design for Learning" were taught in special education teacher preparation courses.



Those of us in our 40s (or older) who did not take courses in MIS or computer science were most likely taught by professors who had minimal exposure to technology. Some of us might have had a "tech-savvy" professor who was familiar with the ins and outs of SPSS on a mainframe computer.


Teachers with 20+ years of seasoning were lucky to have witnessed a demonstration of the latest in educational software when they were university students - the Electronic Workbook! If you studied psychology, your professors were probably up-to-date on the ins and outs of the brain, but think about all that has been discovered since then!

Here are some of my observations:
I've noticed that many young people who are "on the spectrum" experience what I call "cognitive bursts", often around puberty, but also during the late teens and early 20s.


To an untrained eye, these bursts might go unnoticed, or even be minimized. Part of the reason is that the "bursts" are not demonstrated in ways that can be easily captured through traditional psychological or educational assessments. For example, one student might not be able to make a choice in response to a test item by pointing. Another student might not be able to respond to a test item because they do not speak. If the student has spent numerous years in their "own little world", they might not be accustomed to showing what they know, even if they have made significant cognitive gains, including gains in receptive language.

As a professional, I know that it is not appropriate to provide parents with false hope. I know that the tools we have for assessing cognitive growth among students with autism spectrum disorders are not adequate. For example, two students can have the same "IQ" at age 3, 5, 8- or any age, but function much differently at age 18 or 25.  This is especially true for young people who have attention problems, working memory deficits, and/or delays in language development relative to their non-verbal abilities.


My point is that we must take early cognitive assessment scores with a grain of salt, and ensure that there are multiple opportunities for meaningful assessment and significant intervention at other points in a young person's development.
 

In my opinion, the more severe the situation, the more intensive the intervention! 

My mantra earlier in my career was "early intervention, early intervention- the earlier, the better!". It has changed.

Over the past few years, I have come to the realization that the focus on early intervention is only a small part of the bigger picture, and for some students, not enough. The focus on the delivery of services, including technology-supported interventions, during a student's early years might minimize the opportunities - and funding - for significant intervention at other points of a young person's development.

By focusing primarily on early intervention, we might be missing the boat. We must do more across the young person's development through young adulthood (and of course, beyond). Each child is different, and each brain's course of development is different. One child may be ripe for growth at 30 months of age, or at age 3 or 4. Another might start talking and initiating interactions at age 14, or begin to make sense of print at age 16!
I know one severely autistic youth who was reading at an 8th grade level at age 22, something that probably would not have been predicted by those who worked with him during his early years.

From what I've observed in special education, cognitive bursts are often harnessed by a team of perceptive teachers, therapists, and support workers, to facilitate academic, communication, and at times, social interaction skills development. While this may not be the case for each student and in each school, it really does happen.

When a student experiences a "burst", no matter how insignificant it might look on the surface, we are given a golden opportunity to fashion an integrated approach to moving the young person forward. At the same time, we are provided the opportunity to  help the student develop a more solid sense of self. For students with severe autism, this might be a key to opening up their world.

Technology can help.
Because each young person develops differently, it is important that interventions designed to facilitate this sort of growth be available at all points of development, not limited to the intensive support that is recommended for the youngest of this group.

My mantra now is intervention, intervention, intervention, and INTENSIVE technology-supported intervention during periods of cognitive growth, across the developmental stages, as appropriate.


Here is what I've been doing:

I'm spending a higher percentage of my time observing students in a variety of settings, and using video and digital photography to capture my observations. I am using digital content during my assessment process, and I'm using digital content for creating intervention activities that assist in measuring a student's progress over time.

Most important, I think, is that I'm exploring ways that teens with autism can develop a sense of self, to help them build a sort of "anchor" within themselves.

One technique I'm exploring is the use of video cameras to record familiar activities and settings from the first-person point of view. To do this, I follow the student around in school, home, and/or community settings, and then tape the various scenes as if I were in the young person's shoes. My camera is a window to the student's world, as they see it. I supplement the video with digital photography of the same content, which can then be incorporated into an interactive PowerPoint or slide-show.

I also spend some time taking video-clips and pictures of familiar items and objects the student encounters throughout the day, such as teaching materials that the teachers put up on the walls, computer screen shots, video clips of favorite songs and scenes from the television that the student watches, screen shots of educational software that the student uses, and so forth.
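To make the idea concrete, here is a minimal sketch (mine, not a feature of Kidspiration, Inspiration, or UMAJIN) of how captured photos and captions could be packaged into a simple HTML slide-show page for reviewing with a student. The file names and captions are invented for illustration.

```python
# Hypothetical sketch: assemble captured photos and captions into a simple
# HTML slide-show page. File names and captions are made up for illustration.
import html
from pathlib import Path

def build_slideshow(slides, out_path):
    """slides: list of (image_filename, caption) tuples."""
    figures = "\n".join(
        f'<figure><img src="{html.escape(img)}" alt="{html.escape(cap)}">'
        f"<figcaption>{html.escape(cap)}</figcaption></figure>"
        for img, cap in slides
    )
    page = f"<!DOCTYPE html><html><body>\n{figures}\n</body></html>"
    Path(out_path).write_text(page, encoding="utf-8")
    return page

# Example: two scenes captured from the student's point of view.
page = build_slideshow(
    [("hallway.jpg", "Walking to the classroom"),
     ("locker.jpg", "Opening my locker")],
    "slideshow.html",
)
```

The resulting page can be opened in any browser and stepped through with the student, the same way an interactive PowerPoint would be used.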


I use Kidspiration and Inspiration for some of this work. These applications are user-friendly, designed for student use, and provide multi-modal output. There is a text-to-speech component that is great for pairing words with visual representations. At the end of the school year, I came across a great application called UMAJIN Creative, which I found to be quite useful. (I also use some of my own prototype applications, which are in various stages of development.)

How does this work? I usually sit beside a student in a comfortable, familiar spot, with my laptop placed where it can be accessed by both the student and myself. We look at the content together. For students who are used to using a switch, I have one available.

What I'm finding is that using strategies that incorporate digital media provides a means for the student to generate more language and communication.  This is often initiated by the student!

With students who have autism spectrum disorders, establishing a connection through digital photography and videography, focusing on familiar things, is especially important. Taking the time to capture the student's world, from their perspective, is mandatory, in my opinion. By doing this, we are providing specific information that might help to answer unspoken questions that the young person has, but lacks the skills to formulate or articulate - for example:



"Who am I, and what is my relationship to this physical world?"

By taking this approach, the adults - teachers, parents, assistants - who are involved with the student can work to build a solid scaffold for further learning and interaction. Bit by bit, digital content - pictures and video clips - can be built into the process to facilitate social awareness and social-emotional interaction skills. By learning about familiar people, how they "tick", and how one should go about interacting with these people, the student might gain a sense of self within a social context. We can help them answer the question we all have, at one time or another:

"Who am I, and what is my relationship to this social world?"

Note: I am actively searching for articles related to my topics. Please leave a comment, along with links and names of researchers, if you have any information about this! Personal observations are also welcome.

Update:




Minna, from SymTrend, left a comment on this post:
"SymTrend is PDA and web-based software for recording behavioral observations about children with lower functioning autism. Those who are higher functioning or Aspergers can use our system for self-monitoring and to get guidance when they are in situations that challenge them."
http://www.bricklin.com/log/symtrend.htm

Here is my previous comment about SymTrend:

"The beauty of SymTrend, in my opinion, is that it helps people develop self-monitoring skills by providing a means of analyzing data that is gathered frequently. From what I understand, through interaction with the software, the student/client establishes a better understanding of themselves, as well as an understanding of feelings, triggers, reactions, and coping strategies. A rich amount of data is collected that can be helpful to treatment providers and special educators."

For more information, see Minna's comment to this post. My previous post about SymTrend includes a video about SymTrend. Also, visit the SymTrend website.
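The kind of frequently gathered ratings described above can be summarized with something as simple as a moving average, which smooths out day-to-day noise so a trend becomes visible. This is only an illustrative sketch of the general idea, not SymTrend's actual method; the ratings below are invented.

```python
# Illustrative sketch (not SymTrend's actual method): smooth frequent
# behavioral ratings with a moving average so a trend becomes visible.
def moving_average(ratings, window=3):
    """Return the moving average of a list of numeric ratings."""
    if window < 1 or window > len(ratings):
        raise ValueError("window must be between 1 and len(ratings)")
    return [
        sum(ratings[i : i + window]) / window
        for i in range(len(ratings) - window + 1)
    ]

# Hypothetical week of daily 1-5 self-ratings of anxiety.
week = [4, 5, 3, 2, 2, 1, 2]
trend = moving_average(week, window=3)  # first value: (4 + 5 + 3) / 3 == 4.0
```

A falling trend line like this one is much easier for a student, or a treatment team, to interpret than the raw day-by-day numbers.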


Saturday, June 28, 2008

New Assistive Technologies for the Visually Impaired: WebAnywhere, Trinetra

I came across a couple of newer technologies suitable for children, teens, and adults who have vision impairments. The technologies might also be useful to a wide range of people, including those who do not have a disability.

WebAnywhere is a screen reading interface for the web, developed by Jeffrey P. Bigham, Craig M. Prince, Sangyun Hahn, and Richard E. Ladner of the University of Washington.

Below is information about WebAnywhere from the University of Washington News:

"This is for situations where someone who's blind can't use their own computer but still wants access to the Internet. At a museum, at a library, at a public kiosk, at a friend's house, at the airport," said Richard Ladner, a UW professor of computer science and engineering. The free program and both audio and video demonstrations are at http://webanywhere.cs.washington.edu.
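The core idea behind any web screen reader, WebAnywhere included, is to walk the page's HTML and collect the visible text in document order, which a speech engine then reads aloud. Here is a toy sketch of that idea (my own illustration, not the project's actual code):

```python
# Toy sketch of the core screen-reader idea: walk the HTML and collect the
# visible text in document order for a speech engine to read aloud.
from html.parser import HTMLParser

class TextCollector(HTMLParser):
    SKIP = {"script", "style"}  # content that should never be spoken

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skipping = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skipping += 1
        # An image's alt text is what a screen reader speaks for the image.
        if tag == "img":
            alt = dict(attrs).get("alt")
            if alt:
                self.chunks.append(alt)

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skipping:
            self._skipping -= 1

    def handle_data(self, data):
        if not self._skipping and data.strip():
            self.chunks.append(data.strip())

def spoken_text(html_source):
    parser = TextCollector()
    parser.feed(html_source)
    return " ".join(parser.chunks)

demo = spoken_text(
    "<html><head><style>p{}</style></head>"
    '<body><h1>Welcome</h1><p>Hello</p><img src="x.png" alt="a photo"></body></html>'
)
```

The real system does much more (navigation, caching, server-side speech synthesis), but this is the kernel of what makes a page "readable" from any computer.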

"Online service lets blind surf the Internet from any computer, anywhere"

WebAnywhere Site

WebAnywhere Paper
WebInSight Publications
WebAnywhere Alpha Release


From the Carnegie Mellon website:

Trinetra ("the third eye"): Smartphone-based assistive technologies




"Trinetra aims to develop cost-effective, smartphone-enabled assistive technologies to provide people with an enhanced quality of life in their daily activities. The broad objective is to harness the collective capability of diverse networked embedded devices to support location-aware and context-aware applications, including first-responder support, building navigation, retail shopping, smart transportation, etc."

"The project was originally conceived to enable greater independence for the blind and the visually impaired. To date, we have researched and developed a portable barcode-based solution involving an Internet- and Bluetooth-enabled smartphone to aid grocery shopping at the Carnegie Mellon campus convenience store, Entropy."

"We have also more recently extended this to assist both sighted and visually impaired commuters with their transportation and commute-planning needs, using a smart phone to convey notifications of arrivals, departures, etc. We have also developed a phone-based currency identifier for the visually impaired."
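The shopping flow described above boils down to: scan a barcode, look the code up, and speak the product name back to the shopper. A hypothetical sketch of that flow follows; the barcode numbers and product database are invented for illustration (the real system queries a server over the phone's network link):

```python
# Hypothetical sketch of a Trinetra-style flow: a scanned barcode is looked
# up and the product name is handed to a text-to-speech engine. The barcodes
# and product database below are invented for illustration.
PRODUCTS = {
    "012345678905": "Canned soup, tomato",
    "036000291452": "Whole wheat bread",
}

def announce(barcode, database=PRODUCTS):
    """Return the phrase a TTS engine would speak for a scanned barcode."""
    name = database.get(barcode)
    return name if name else "Unknown product, please try scanning again"

phrase = announce("036000291452")
```

The interesting engineering is in the pieces this sketch omits: reading the barcode with the phone's camera or a Bluetooth scanner, and keeping the product database current on a server rather than on the handset.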

Trinetra: Assistive Technologies for Grocery Shopping (PDF)

Assistive Embedded Technologies (PDF) - Priya Narasimhan

Related:

"Sight for the Blind and Speech for the Deaf: A professor turns cellphones into aides for the disabled"
-Catherine Rampell (Chronicle of Higher Education)

Wednesday, June 18, 2008

More Multi-touch: Link to Scientific American article.

Scientific American, June 2008: "Hands On Computing: How Multi-touch Screens Could Change The Way We Interact With Computers and Each Other" - "The iPhone and even wilder interfaces could improve collaboration without a mouse or keyboard."

"It is easy to imagine how photographers, graphic designers or architects—professionals who must manipulate lots of visual material and who often work in teams—would welcome this multi-touch computing. Yet the technology is already being applied in more far-flung situations in which anyone without any training can reach out during a brainstorming session and move or mark up objects and plans." -Stuart Brown

Comment:
In K-12 settings, this technology would be great for cooperative group learning, technology-supported project-based instructional activities, and group social skills training.




If you are looking for information about brain-computer interfaces, follow the link to my post about Emotiv Systems' neural interface on the Technology-Supported Human-World Interaction blog. It looks like it holds promise for cognitive rehabilitation applications and games.

Emotiv Systems' Neural Game Controller Headset: Human-Computer Interface of the Future?


Also see:

Game Interaction via Thoughts and Facial Expressions: EPOC - Emotiv Systems Neural Interface

Monday, June 16, 2008

Inclusive Music: Banana Keyboard SoundHouse Special Access Kit


For those of you looking for assistive technology for music, the Banana Keyboard and the SoundHouse special access kit might be the answer to your needs. According to the website, the kit is designed to support skill development in the following areas:

  • Switch use
  • Cause-and-effect
  • Switch timing
  • Choosing with a switch
  • Music


(Pictures from the Spectronics website.)

Features:
  • Sixteen keys, curved for easy access; fits well on a wheelchair or desktop.
  • Connect up to eight switches to the keyboard.
  • Play back words and speech, along with music.
  • Software handles MIDI and WAVE sound files.
  • Works with the Super Duper Music Looper software that allows children to use a paintbrush, an erase tool, and a mouse to create music.

Thanks, Gavin McLean, for the link!

Friday, June 13, 2008

Revisiting Interactive 3-D Brain Anatomy : The Secret Life of the Brain Website


I'm attending the first of three two-day institutes about neuropsychology, focusing on the assessment and intervention of traumatic brain injuries. It has been a while since I studied neuropsychology, so to brush up, I revisited Secret Life of the Brain, an on-line companion to the PBS series of the same name that aired in 2002. The materials cover the human brain from infancy through old age.

My favorite section of this website is the interactive 3-D Brain Anatomy tour. This on-line application allows for zooming in and out, 360 degree rotation, and exploration of the brain by area or function. When you roll over a brain part, you can find more information. The specific area of the brain becomes highlighted, and the rest of the brain becomes translucent.

Description from the website:
"THE SECRET LIFE OF THE BRAIN, a David Grubin Production, reveals the fascinating processes involved in brain development across a lifetime. The five-part series, which will premiere nationally on PBS in winter 2002, informs viewers of exciting new information in the brain sciences, introduces the foremost researchers in the field, and utilizes dynamic visual imagery and compelling human stories to help a general audience understand otherwise difficult scientific concepts."

Wednesday, June 11, 2008

MICOLE: Open-source multi-modal software supports cooperative learning among sighted and visually impaired children

According to an article on the ICT Results website, a project called "MICOLE" explores the ways multi-modal computing can support co-operative learning among sighted and visually impaired children by harnessing the sense of touch through haptic input devices, and providing a means to produce pictures that can be felt.

This is a quote from the article:
“Adding the sense of touch to information and communication technology is just getting to the point where it can be commercialised,” Raisamo continues. “The first people to benefit are people with disabilities, especially people who are blind or have visual impairment. The more senses you can use, the more multi-modal your computer interface, the more inclusive the technology can be.”

Students collaborate in hands-on learning the MICOLE way. Photo: © MICOLE project.

MICOLE stands for Multimodal Collaboration Environment for Inclusion of Visually Impaired Children.

MICOLE is an open-source project. You can download the software and SDK (MICOLELib) from the website. There also is an on-line support forum and a list of publications.

From the MICOLE project website:
"The work in the MICOLE project aims at developing a system that supports collaboration, data exploration, communication and creativity of visually impaired and sighted children. In addition to the immediate value as a tool the system will have societal implications through improved inclusion of the visually disabled in education, work, and society in general. While the main activity is the construction of the system, several other supporting activities are needed, especially empirical research of collaborative and cross-modal haptic interfaces for visually impaired children."

According to an article about MICOLE on the Axistive website:

"Among the interfaces and application prototypes that have been developed are an electronic browser, rhythm reproduction, Post-It notes with a haptic bar code, virtual maracas (percussion instruments), a tactile maze game, memory games, a haptic version of Pong and explorative learning of the internal layers of the earth."

Related:

Hands on Learning for the Visually Impaired

Multisensory User Interface