ARC app for iPhone and iPad


4.4 (3,344 ratings)
Education
Developer: Dianne Macdonald
Free
Current version: 1.1.12, last update: 4 years ago
First release: 25 Mar 2019
App size: 16.78 MB

The Theory
The intervention in this app is based on several theories of word decoding (word reading) and reading comprehension. The program targets word reading and reading comprehension by pairing written words with their corresponding images, in line with the Dual Coding Theory of Literacy (Sadoski, 2005). The Dual Coding Theory of Literacy explains why pairing a written word with its corresponding image is an effective way to learn sight words (Arlin, Scott & Webster, 2018). This method also accelerates learning of oral vocabulary (Ricketts et al., 2015). According to the Simple View of Reading (Hoover & Gough, 1990), reading comprehension is the product of word decoding and language comprehension, so both components are necessary for reading proficiency. Therefore, if a child has good word decoding but poor language comprehension (as is often the case with children with autism and hyperlexia), focusing on oral vocabulary acquisition should lead to improved reading comprehension. Similarly, if a child has not yet learned to decode words but has adequate language comprehension, focusing on word decoding (sight word reading) by pairing the written word with the image should lead to improved word decoding.

This intervention targets different areas of need depending on the profile of the child. It is therefore appropriate for young children (2-5 years old) with Autism Spectrum Disorder who show an early interest or ability in word reading (hyperlexia), and also for typically developing children who have an interest in learning to read. The efficacy of this intervention application is currently being studied at McGill University in Montreal.
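To make the Simple View of Reading concrete (the numbers below are hypothetical, for illustration only): if each component is scored from 0 (absent) to 1 (proficient), the model reads

Reading comprehension (RC) = Decoding (D) × Language comprehension (LC)

so a child who decodes well (D = 0.9) but has weak language comprehension (LC = 0.2) is predicted to comprehend poorly (RC = 0.9 × 0.2 = 0.18). Because the relation is a product rather than a sum, strengthening the weaker component yields the largest gain in comprehension, which is the logic behind targeting different skills for different profiles.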

The Intervention
This program initially focuses on establishing oral vocabulary. A written word appears on the screen accompanied by the spoken word, and the child is prompted to “Touch the word” to focus their attention on the written word. The child is then given a choice of two pictures and asked to match the picture to the word.

Once the oral vocabulary term has been learned, that same word is targeted for reading comprehension. At this stage, only the written word appears, and the child is again prompted to touch it. The child is again given two pictures and a written word, and the goal is to match the written word to its corresponding picture. If the child selects the correct picture, an arrow prompts them to go to the next page. If the child selects the incorrect picture, they are asked to “try again”. If they select the incorrect picture a second time, only the correct image remains on the screen and the child is prompted to “Touch the picture that goes with the word”. In this way, the child always ends by selecting the correct picture (a sketch of this trial flow appears at the end of this section).

Auditory instructions can be turned on or off; even when they are turned off, the child still hears a reinforcing sound after a correct response and an error sound after an incorrect one. A timer in the upper right corner counts down from 15 minutes, and a pause button allows the app to be paused. The app is designed to let the child control the pace, i.e., when to move to the next page.

At the end of the 15 minutes of play, a screen displays the vocabulary words (oral and written) that the child learned during that session. Parents are asked to use these words during other activities, such as reading (by focusing on stories that contain those words) or everyday events in the home; using the words outside of the app encourages carryover of learning to everyday activities.

At this time, a user name and password are required to use this app. If you live in the Montreal or Ottawa area and wish to participate in this study, please contact [email protected]
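For readers curious about the mechanics, here is a minimal Swift sketch of a single trial under the flow described above. It assumes a two-picture choice with the stated error-correction rule; all names are hypothetical, and this is an illustration, not the app's actual code.

import Foundation

// A sketch of one teaching trial, following the flow described above.
// All names are hypothetical; this illustrates the rule, not the app's code.

enum Feedback {
    case advance    // correct: reinforcing sound, arrow to the next page
    case tryAgain   // first error: "try again", both pictures remain
    case errorless  // second error: only the correct picture remains
}

struct Trial {
    let writtenWord: String
    let correctPicture: String
    let distractorPicture: String
    let spokenWordPlays: Bool  // true only in the oral-vocabulary stage

    // Error-correction rule: the child always finishes on the correct
    // picture, so every trial ends in success.
    func feedback(forSelection picture: String, attempt: Int) -> Feedback {
        if picture == correctPicture { return .advance }
        return attempt == 1 ? .tryAgain : .errorless
    }
}

let trial = Trial(writtenWord: "dog",
                  correctPicture: "dog.png",
                  distractorPicture: "cat.png",
                  spokenWordPlays: true)
print(trial.feedback(forSelection: "cat.png", attempt: 1))  // tryAgain
print(trial.feedback(forSelection: "dog.png", attempt: 2))  // advance

Because the second error removes the distractor, the final selection is always correct, which keeps each trial errorless by design.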
Funding for this application has been provided by the Organization for Autism Research (OAR) and the Centre for Research on Brain, Language and Music (CRBLM).