
1 Introduction

Images and diagrams are an integral part of many educational materials [9]. A tactile diagram is a simplified representation of an image that makes its content accessible by touch. Tactile diagrams are widely adopted in textbooks for visually impaired (VI) people. Several studies [4, 13, 31] have shown that tactile perception supports the comprehension of graphical images, and tactile diagrams have proved useful for VI students learning graphically intensive subjects. Apart from tactile textbooks, tactile diagrams are widely used in public spaces as maps and floor plans for guiding VI people.

Fig. 1. Deciphering tactile images: (A) exploring the tactile image, Braille keys and symbols with two hands; (B) using both hands to decipher the Braille legend on the consecutive page; (C) exploring the tactile image, Braille keys and symbols with two hands

Despite their wide acceptance, tactile diagrams are often limited by their spatial resolution and local perception range [24]. Traditional tactile graphics use Braille annotations as a type of markup for the discrete areas of a diagram. However, Tatham [37] states that the extensive use of Braille annotations can worsen the overall legibility of tactile graphics. While textures and tactile patterns are prominently used for marking areas, the reader still has to find the key and the corresponding description, which are often placed on other pages. The number of textures that can be clearly distinguished remains limited and varies with the tactile acuity of the user [38]. Additionally, the Braille legend of a diagram is often spread over multiple pages, which demands flipping pages to comprehend pictorial information (Fig. 1). This in turn complicates the interpretation of tactile images [16]. Another reason for excluding Braille annotations from tactile graphics is the limited adoption of Braille among the VI community: research [6] shows that only a minority of blind people can read Braille, and it can be estimated that an even smaller proportion can read Braille-labelled tactile graphics. A further argument for reducing Braille labels is to limit the tactile complexity of the graphics. A widely adopted alternative is to combine tactile graphics with interactive assistive technologies. Recent studies have shown that tactile diagrams complemented with interactive audio support are advantageous with respect to the usability design goals of ISO 9241 [7]. Various existing devices and approaches (discussed in Sect. 2) support audio-tactile graphics. However, factors pertaining to wearability, setup time, the effect of ambient lighting conditions, and scalability have not been fully investigated in the existing audio-tactile methodologies.

In this paper, we present the design of FingerTalkie, a finger-worn interactive device with an offset point-and-click method that can be used with existing tactile diagrams to obtain audio descriptions. Unlike existing interactive audio-tactile devices, FingerTalkie does not use camera-based methods or back-end image processing. Our concept leverages color tactile diagrams, which are gaining popularity, thus reducing the barrier to technology adoption. The FingerTalkie device was designed through an iterative user-centred design process involving 8 visually impaired users. Minimal, low-cost hardware enabled the design of a standalone and compact device. We conducted a controlled experiment with 12 blindfolded sighted users to evaluate the usability of the device. The results showed that the performance of pointing and clicking with FingerTalkie can be influenced by the size and complexity of the tactile shape. We further conducted a focus-group interview with 8 VI users. The qualitative results showed that, compared with existing audio-based assistive products on the market, the VI users appreciated FingerTalkie's ease of setup, support for two-handed exploration of tactile diagrams, and potential to improve the efficiency of comprehending tactile diagrams.

2 Related Work

We discuss prior work related to two areas of our system: (i) audio-/touch-based assistive devices for VI users and (ii) finger-based wearable interfaces.

2.1 Audio-/Touch-Based Assistive Technologies

Adding auditory information (e.g., speech, verbal landmarks, earcons, and recorded environmental sounds) to tactile diagrams has been considered an efficient way of improving the reading experience of VI users [7, 26]. Furthermore, it is intuitive for VI users to obtain such auditory information by touching the tactile diagrams or other tangible interfaces with their fingers. Early prototypes, such as KnowWhere [22], 3DFinger [32], and Tangible Newspaper [36], supported computer-vision-based tracking of a VI user's finger on 2D printed material (e.g., maps and newspapers) and retrieval of the corresponding speech information. Nanayakkara et al. [28] developed EyeRing, a finger-worn device with an embedded camera connected to an external micro-controller for converting printed text into speech output based on OCR and text-to-speech techniques. Later, the same research group developed FingerReader [35] and FingerReader 2.0 [5] to assist blind users in reading printed text on the go by harnessing computer vision and cloud-based object recognition. Shi et al. [33] developed Magic Touch, a computer-vision-based system that augments printed graphics with audio files associated with specific locations on the model. The system used an external webcam to track the user's finger on a 3D-printed object and retrieve the corresponding audio information. Later, Shi et al. [34] expanded the functionality of Magic Touch into Markit and Talkit, adding touch-based audio annotation on the 3D-printed object. Using the front camera of a smart tablet and a front-mounted mirror, the Tactile Graphics Helper [12] tracked a student's fingers as the user explored a tactile diagram, allowing the student to obtain clarifying audio information about the tactile graphic without sighted assistance. Several researchers have also developed hand-gesture interaction for 2D maps for the VI [8].

These works suggest that camera-based finger tracking can enable VI users to retrieve audio information by touching physical objects. However, camera-based technologies have major drawbacks, including the need for back-end processing hardware, the size of the camera and system, the requirement for ambient light, and difficulty with near focus distances. Furthermore, embedding a camera and setting up an external connection to the processing hardware is costly. Due to these limitations, such solutions may not be suitable for VI users in developing countries.

Besides computer-vision-based finger tracking, researchers have also investigated techniques based on embedded sensors, such as the Pen Friend [20], Near-Field Communication (NFC)/Radio-Frequency Identification (RFID) readers [40], and QR-code readers [1, 3], for retrieving audio from tactile diagrams. While these devices may avoid the high-resolution requirements of camera-based solutions, they often require users to hold the device, keeping at least one hand constantly occupied. As the distal phalanges of the index fingers (Fig. 2) are primarily used for exploring Braille and tactile diagrams, it is advised that VI users' hands should not be otherwise occupied [10]. Moreover, it is difficult to paste a Pen Friend label, RFID tag, or QR code on small regions and areas with irregular boundaries on a tactile diagram. In addition, QR-code detection demands an optimal amount of ambient light for the reader to operate, which makes it nearly unusable in low-light conditions [3]. The Talking Tactile Tablet (TTT) [23], in turn, supports reading the tactile diagram with both hands while receiving audio feedback; however, the size and weight of the device make it non-portable.

In this paper, we describe the design and implementation of FingerTalkie in a finger-wearable form factor, built on cheap, off-the-shelf, and robust color-sensing technology. It supports audio retrieval from color-printed tactile diagrams without any extra hardware embedded in the diagrams. Our technical experiments showed that FingerTalkie can retrieve correct audio information in low-light or even dark settings.

Fig. 2. (a) Parts of the fingers; (b) bending of the fingers during tactile reading

2.2 Finger-Based Wearable Interfaces

Wearable devices for the hand often focus on the fingers, as they are among the most sensitive parts of the body and are most often used for grasping and exploring the environment. The design of the interaction technique in FingerTalkie was largely inspired by existing wearable finger-based interaction for general purposes. Fukumoto and Tonomura's FingerRing [11], presented in 1994, is considered the first digital prototype exploring a finger-worn interface. It embedded an accelerometer in the form factor of a finger ring to detect gesture input in the form of taps performed with the fingertips. Since then, various technologies have been used to implement ring-shaped input devices. For instance, Nenya by Ashbrook et al. [2] detected finger rotation via magnetic tracking. Yang et al. introduced Magic Finger [43] with IR beacons to recognize surface textures. Ogata et al. [29] developed iRing, using infrared reflection to detect directional gesture swipes and finger bending. Jing et al. developed Magic Ring [18] with an accelerometer to detect motion gestures of the index finger. eRing [41] employed electric field sensing to detect multiple finger gestures. OctaRing [25] achieved multi-touch input by pressure sensing, and LightRing [21] fused infrared proximity sensing and a gyroscope to locate the fingertip on any surface for cursor pointing and target selection. All these finger-based input techniques embedded motion sensors in a ring-shaped form factor to achieve surface or mid-air gesture recognition. When designing finger-based interaction for VI users reading tactile diagrams, one should take into account the ease of input registration and the robustness of input detection. Motion sensors may lack robustness due to low sensor bandwidth. As discussed above, VI users often read tactile diagrams with both hands resting on and touching the diagram; performing complex gestures on the surface or in mid-air may therefore cause fatigue.

To ensure robust finger-based interaction, researchers have leveraged thumb-to-finger touch with buttons [14] and touch sensors [42]. Inspired by these configurations, we incorporated a button in the FingerTalkie device for VI users to register input; we chose buttons over sensors to further reduce the cost of the device. Unlike existing devices, which mostly place buttons on the side of the proximal phalanx, we investigated the placement of the button around the finger through an iterative design process and designed the one-finger offset-clicking input technique in our final prototype. Our quantitative and qualitative studies suggest that VI users can successfully explore tactile diagrams and retrieve the corresponding audio information using the offset-clicking technique with the button placed in front of the fingertip.

3 Our Solution - FingerTalkie

Based on the problems and challenges identified in the existing literature, we designed a device with a color sensor on the fingertip that obstructs neither the finger movements nor the touch-sensing area of the fingertip. The initial design of the FingerTalkie device is illustrated in Fig. 3. The color sensor at the tip of the finger reads colors printed on a tactile diagram. A user can click the button on the proximal phalanx to play the audio associated with the colored area via an external device (e.g., a laptop, smartphone, or smartwatch) that is connected wirelessly. The external device handles the computation and stores the database of colors and mapped audio files; a sketch of this color-matching step is shown below. In the following, we describe the rationale behind our design choices.
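To make the color-to-audio mapping concrete, the following C++ sketch illustrates one way the external device could classify a raw RGB reading into the nearest calibrated palette color. This is a minimal sketch under our assumptions: the palette values, distance threshold, and function names are illustrative, not the paper's exact implementation.

```cpp
#include <math.h>

// One calibrated diagram color.
struct PaletteColor {
  const char *name;  // the colored area it marks
  float r, g, b;     // calibrated, brightness-normalized channel ratios
};

// Hypothetical calibration data for a three-color palette.
static const PaletteColor kPalette[] = {
    {"blue", 0.20f, 0.30f, 0.50f},
    {"green", 0.25f, 0.50f, 0.25f},
    {"red", 0.55f, 0.25f, 0.20f},
};
static const int kPaletteSize = sizeof(kPalette) / sizeof(kPalette[0]);
static const float kMaxDistance = 0.15f;  // reject readings far from any color

// Returns the index of the nearest palette color, or -1 if the reading
// matches nothing (e.g., the white background of the diagram). The host
// then maps (diagram, color index) to an audio file and plays it.
int classifyColor(float r, float g, float b) {
  float sum = r + g + b;
  if (sum <= 0.0f) return -1;
  // Normalizing by the channel sum makes the match largely independent of
  // overall brightness; the sensor's own white LED dominates illumination.
  r /= sum; g /= sum; b /= sum;
  int best = -1;
  float bestDist = kMaxDistance;
  for (int i = 0; i < kPaletteSize; ++i) {
    float dr = r - kPalette[i].r;
    float dg = g - kPalette[i].g;
    float db = b - kPalette[i].b;
    float dist = sqrtf(dr * dr + dg * dg + db * db);
    if (dist < bestDist) { bestDist = dist; best = i; }
  }
  return best;
}
```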

3.1 Problems and Considerations

Several studies have investigated the haptic exploration styles of visually impaired and sighted people [15]. When using two hands to explore a tactile diagram and its accompanying Braille legend page, VI users may keep one hand as a stationary reference point (Fig. 1C) or move both hands simultaneously (Fig. 1B). Exploration strategies involve using either only the index finger or multiple fingers [15]. The precise nature of these exploratory modes and their relation to performance level remain obscure [39]. Nevertheless, a common problem with tactile diagrams is their labelling. Braille labelling becomes cumbersome, as it often becomes cluttered and illegible due to spatial constraints [37]. Moreover, placing the Braille legend on separate pages disrupts referencing and reduces the immediacy of the graphic, resulting in comprehension issues [16].

To address this issue, several existing studies associate auditory information with touch exploration to enhance the experience of VI users obtaining information through physical interfaces. Finger-worn devices with motion sensors and camera-based setups can be costly and difficult to calibrate and set up. These devices also require the user to aim a camera, which can be difficult for blind users [19], and to hold the camera with one hand, preventing the bimanual exploration of the diagram that can be necessary for good performance [27]. Based on the above factors and constraints, we formulated the following design considerations for developing a system that:

  1. allows users to use both hands to probe tactile boundaries without restricting the movement and tactile sensation of the fingertips;

  2. supports real-time audio feedback while exploring discrete areas of a tactile diagram, irrespective of the boundary conditions (irregular boundaries, 2.5D diagrams, textured diagrams, etc.);

  3. is portable, easy to set up, inexpensive, and easily adaptable to existing tactile graphics for VI users in developing countries.

Fig. 3. First prototype sketch

3.2 Design Rationale

Existing interactive technologies for audio-tactile diagrams embed physical buttons or capacitive touch sensors, use RGB cameras with QR codes or text recognition, or attach RFID tags to map audio to discrete areas. These technologies lack flexibility, as users have to point at particular spots within the tactile area to trigger the audio. Moreover, QR codes and RFID tags are difficult to use with tactile diagrams that have irregular boundary lines. In exploring a simpler sensing mechanism, we found that color tagging and sensing for audio-tactile diagrams may offer advantages over other methods for the following reasons:

  1. Contrasting colors are widely used in tactile diagrams to help low-vision and color-blind people recognize boundaries and distinct areas. The device can leverage existing colored tactile diagrams without requiring the fabrication of new ones.

  2. Non-colored tactile diagrams can be colored with stickers or easily painted.

  3. The color-sensing action is unaffected by ambient lighting, thanks to a sensor module with an embedded white LED.

  4. Color sensors are a low-cost, frugal technology with low power consumption and minimal background-processing requirements.

4 Iterative Design and Prototyping

Following the design considerations and the conceptual design, we adopted a multi-stage iterative design process in which 8 VI users evaluated 3 prototypes.

4.1 First Prototype

We followed existing work on finger-worn assistive devices [28] to design the first prototype of FingerTalkie. As shown in Fig. 4, it consisted of two wearable parts: (i) a straight 3D-printed case worn at the middle phalanx, with the color sensor (Flora TCS34725A) at the tip, and (ii) a push button sewn to a velcro ring worn at the finger base. A velcro strap was attached to the 3D-printed case to cater to different finger sizes.

For this prototype, we used an Arduino UNO with a laptop (MacBook Pro) as the external peripherals. The wearable part of the device was connected to the Arduino UNO with thin wires. We used the Arduino IDE with a standard audio package library to store color-to-audio profiles and perform the back-end processing. A minimal sketch of this sensing loop follows.
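The sketch below is a minimal reconstruction of the prototype's firmware loop, assuming the Adafruit_TCS34725 Arduino library; the pin assignment, sensor settings, and serial message format are our illustrative choices, not the exact firmware.

```cpp
#include <Wire.h>
#include <Adafruit_TCS34725.h>

const int BUTTON_PIN = 2;  // push button worn as a ring at the finger base

// 50 ms integration time and 4x gain are illustrative settings.
Adafruit_TCS34725 tcs(TCS34725_INTEGRATIONTIME_50MS, TCS34725_GAIN_4X);

void setup() {
  Serial.begin(9600);
  pinMode(BUTTON_PIN, INPUT_PULLUP);      // pressed = LOW
  if (!tcs.begin()) {
    Serial.println("ERR: TCS34725 not found");
    while (true) {}                       // halt: the sensor is required
  }
}

void loop() {
  if (digitalRead(BUTTON_PIN) == LOW) {   // user clicked
    uint16_t r, g, b, c;
    tcs.getRawData(&r, &g, &b, &c);       // blocks for the integration time
    // The laptop maps this reading to a color-to-audio profile entry
    // and plays the corresponding audio file.
    Serial.print("C ");
    Serial.print(r); Serial.print(' ');
    Serial.print(g); Serial.print(' ');
    Serial.println(b);
    delay(300);                           // crude debounce / repeat limiter
  }
}
```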

Fig. 4. First prototype

User Study 1 - Design

The main goal of testing the first prototype was to investigate the feasibility of the hardware setup, and collect user feedback on the early design and the prototype of FingerTalkie.

Participants. For the first pilot study, we recruited 4 congenitally blind male participants aged between 27 and 36 (Mean = 31.5, SD = 3.6). All the participants were familiar with using tactile diagrams.

Fig. 5. The tactile diagram used in the pilot studies. (Color figure online)

Apparatus. We tested the first prototype with a simple tactile diagram of two squares (blue and pink), as shown in Fig. 5. Pressing the button on the device while pointing within either square activated a different sound.

Task and Procedure. The participants were initially given a demo on how to wear the prototype and how to point and click on a designated area. They were then asked to wear the prototype on their own and adjust the velcro strap to their comfort. Next, the tactile diagram (Fig. 5) was given to them, and the participants were asked to explore and click within the tactile shapes to trigger different sounds played on the laptop speaker. Each participant could perform this action as many times as they wanted within 5 min. After all the participants had performed the task, a group interview was conducted. The participants were asked for subjective feedback on wearability, ease of use, the drawbacks and issues faced while using the device, and possibilities for improvement.

Study 1 - Feedback and Insights

All the participants responded positively and stated that it was a new experience for them. They did not face any difficulty in wearing the device. One participant accidentally pulled off the wires that connected the device to the Arduino while trying to wear the prototype. All the participants reported that the device was lightweight and that it was easy to get real-time audio feedback. Three participants reported that the device did not restrict the movements of their fingers during exploration of the diagram. For one participant, we noticed that the color sensor at the tip of the device intermittently touched the embossed lines on the tactile diagram. This was due to his particular exploration style, in which the angle of the fingers with respect to the diagram surface was higher than for the rest of the participants, causing unintended sensor contact with the tactile diagram during exploration. Moreover, the embossed elevations can vary with the type of tactile diagram, which could worsen the obstruction for the color sensor.

Fig. 6. Second prototype and the angular compensation at the tip

4.2 Second Prototype

To avoid unwanted touching of the color sensor while exploring a tactile diagram, we affixed the sensor at an angle with respect to the platform. We observed that the participants' fingers were at an angle of 45\(^{\circ }\) with respect to the tactile diagram; thus, we redesigned the tip of the device and fixed the color sensor at an angle of 45\(^{\circ }\), as shown in Fig. 6. The overall length of the finger-wearable platform was also reduced from 6 cm to 5 cm.

The second prototype was a wrist-worn standalone device, as shown in Fig. 6. It consisted of an Arduino Nano, a 7.2 V LiPo battery, a 5 V regulator IC, and an HC-05 Bluetooth module. All the components were integrated into a single PCB connected to the finger-worn part with flexible ribbon wires. This design solved the problem of excess tangled wires, as the device could now connect to the laptop wirelessly via Bluetooth; a sketch of this wireless link follows.
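The following sketch shows how the Arduino Nano could forward click events over the HC-05. The pin wiring, message format, and baud rate are assumptions based on typical HC-05 usage, not the exact firmware.

```cpp
#include <SoftwareSerial.h>

// HC-05 wired to a software serial port on the Arduino Nano.
// Pins are illustrative; the HC-05 RX line needs a 5 V-to-3.3 V divider.
SoftwareSerial bluetooth(10, 11);  // RX (from HC-05 TXD), TX (to HC-05 RXD)

void setup() {
  bluetooth.begin(9600);  // HC-05 factory-default baud rate
}

// Called by the sensing loop whenever a click is registered: forwards the
// raw color reading to the laptop, which resolves and plays the audio.
void sendClick(uint16_t r, uint16_t g, uint16_t b) {
  bluetooth.print("C ");
  bluetooth.print(r); bluetooth.print(' ');
  bluetooth.print(g); bluetooth.print(' ');
  bluetooth.println(b);
}

void loop() {
  // ... sensing loop as in the first prototype, calling sendClick() ...
}
```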

User Study 2 - Design

We evaluated the second prototype with another user study to assess the new design and gain insights for further improvement.

Participants. During the second pilot study, we ran a hands-on workshop with 4 visually impaired people (3 male, 1 female) aged between 22 and 36 years (Mean = 29, SD = 2.7). We used the second prototype and the tactile diagram of squares from the first pilot study.

Task and Procedure. The users were initially given a demo on how to wear the prototype and how to point and click on a designated area. They were then asked to wear the prototype on their own and adjust the velcro strap to their comfort. Next, they were asked to explore the tactile diagram and click within the tactile shapes. Whenever a participant pointed within the squares and pressed the button correctly, Tone A ('Glass' in the macOS sound effects) was played on the laptop speakers; when they made a wrong point-and-click (outside the squares), Tone B ('Basso') was played to denote the wrong pointing. Each participant was given 10 min for the entire task. Afterwards, the participants were individually asked for feedback regarding ease of use, the drawbacks and issues faced while using the device, and potential areas of improvement.

4.3 Study 2 - Feedback and Insights

We observed that, with the refined length and angle of contact of the device, the participants were able to explore the tactile diagrams more easily. However, two participants said that they found it difficult to simultaneously point at the diagram and press the button on the proximal phalanx. One participant said, "I feel that the area being pointed to by [my] finger shifts while simultaneously trying to press the button on the index finger." We found that these participants had relatively stubby thumbs, which might have increased the difficulty of clicking the button while pointing. This suggests that an activation button on the proximal phalanx may not be ergonomically suitable for all users. Another participant, who is partially sighted, asked about the maximum number of colors (or discrete areas) the sensor could detect and whether colors could be reused.

5 Final Prototype

Based on the findings from the two user studies, we devised a novel point-and-click technique and finalized the design with further hardware improvements, making it a complete standalone device.

5.1 Offset Point-and-Click Technique

We replaced the button at the proximal phalanx with a limit-switch button at the tip of the finger-worn device, as shown in Fig. 7, and attached the color sensor to the limit switch. The purpose of this design is to avoid compromising pointing accuracy when users simultaneously point the device and click a button on the proximal phalanx. With the new design, users click by simply tilting the finger forward and receive tactile click feedback at the fingertip; a sketch of this click-detection logic is shown below.
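A debounced reading of the limit switch is enough to implement the offset click. The sketch below shows one plausible implementation; the pin assignment and debounce interval are illustrative, not the exact firmware.

```cpp
const int LIMIT_SWITCH_PIN = 3;        // limit switch behind the color sensor
const unsigned long DEBOUNCE_MS = 40;  // window to absorb mechanical bounce

bool stablePressed = false;   // debounced switch state
bool lastRawPressed = false;  // most recent raw reading
unsigned long lastChangeMs = 0;

void setup() {
  Serial.begin(9600);
  pinMode(LIMIT_SWITCH_PIN, INPUT_PULLUP);  // switch closes to GND
}

// Returns true exactly once per tilt-forward press of the fingertip.
bool clickDetected() {
  bool rawPressed = (digitalRead(LIMIT_SWITCH_PIN) == LOW);
  if (rawPressed != lastRawPressed) {
    lastRawPressed = rawPressed;
    lastChangeMs = millis();              // raw state changed: restart timer
  }
  if (millis() - lastChangeMs > DEBOUNCE_MS && rawPressed != stablePressed) {
    stablePressed = rawPressed;
    return stablePressed;                 // report only the press edge
  }
  return false;
}

void loop() {
  if (clickDetected()) {
    Serial.println("CLICK");  // here the firmware reads the color sensor
  }
}
```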

Fig. 7. Left: final standalone prototype; center: internal hardware; right: exploring the tactile diagram with the final prototype

5.2 RFID Sensing for Color Reuse

To enable the reuse of colors across different tactile diagrams, we introduced a mechanism that supports multiple audio-color mapping profiles. This was achieved by embedding an RFID-reader coil in the FingerTalkie device. A unique RFID tag is attached to each tactile diagram; before reading the main content, the user scans the tag to load the color-audio mapping profile of the current diagram. A micro 125 kHz RFID module was mounted on top of the Arduino Nano. We sandwiched the Arduino Nano, a much smaller HC-05 Bluetooth chip, and the RFID chip into a compact stack of circuits on top of the finger-worn platform. An RFID coil with a diameter of 15 mm was placed on top of the limit switch to support the selection of the audio-color mapping profile through the offset pointing interaction; a sketch of this profile-selection logic follows.
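The profile mechanism reduces, in essence, to a lookup table keyed by tag ID. The sketch below illustrates the idea; the tag IDs, profile contents, and the assumption that the 125 kHz module reports tag IDs over a serial line are all ours.

```cpp
#include <SoftwareSerial.h>

SoftwareSerial rfid(8, 9);  // RX from the RFID module; TX unused

// A color-to-audio mapping profile for one tactile diagram.
struct Profile {
  unsigned long tagId;       // RFID tag attached to the diagram
  const char *title;         // announced when the tag is scanned
  int audioForColor[3];      // audio indices for blue, green, red areas
};

// Hypothetical profiles; in practice these would be authored and shared.
static const Profile kProfiles[] = {
    {0x001A2B3CUL, "Map of the school", {1, 2, 3}},
    {0x001A2B3DUL, "Cell diagram", {4, 5, 6}},
};
static const int kProfileCount = sizeof(kProfiles) / sizeof(kProfiles[0]);

const Profile *activeProfile = 0;  // none until a page tag is scanned

// Switches the active profile when a known page tag is read.
void selectProfile(unsigned long tagId) {
  for (int i = 0; i < kProfileCount; ++i) {
    if (kProfiles[i].tagId == tagId) {
      activeProfile = &kProfiles[i];
      // The host would announce activeProfile->title here.
      return;
    }
  }
}

void setup() { rfid.begin(9600); }

void loop() {
  // ... parse tag frames from `rfid` and call selectProfile(tagId) ...
}
```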

5.3 Interaction Flow

The user begins exploring a tactile diagram by hovering the FingerTalkie over the RFID tag, which is placed at the top-left corner of the diagram and marked by a small tactile dot. The page selection is confirmed by audio feedback announcing the page number or title of the diagram. The user can then move the finger over the rest of the diagram for further exploration. To retrieve audio information about a colored area, the user points at the area with an offset and tilts the finger to click. This flow can be summarized as a small state machine, sketched below.
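The sketch below is a schematic summary of this flow under our assumptions; the stubbed selectProfile() and Profile stand in for the profile-selection sketch above, and playAudio() stands in for sending an audio index to the host.

```cpp
// Stand-ins for the profile-selection sketch in Sect. 5.2 (hypothetical).
struct Profile { int audioForColor[3]; };
Profile demoProfile = {{1, 2, 3}};
Profile *activeProfile = 0;
void selectProfile(unsigned long tagId) { activeProfile = &demoProfile; }
void playAudio(int audioId) { /* forward audioId to the host via Bluetooth */ }

// Schematic state machine for the interaction flow described above.
enum State { WAIT_FOR_PAGE, EXPLORING };
State state = WAIT_FOR_PAGE;

// Called when the RFID coil reads the page tag at the tactile dot.
void onTagScanned(unsigned long tagId) {
  selectProfile(tagId);          // load the diagram's color-audio profile
  playAudio(0);                  // announce the page number or title
  state = EXPLORING;
}

// Called on each offset click; colorId is the classified color (-1 = none).
void onClick(int colorId) {
  if (state != EXPLORING || colorId < 0) return;  // ignore unknown areas
  playAudio(activeProfile->audioForColor[colorId]);
}
```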

6 Evaluating the Offset Point-and-Click Technique

We designed FingerTalkie with a new interaction technique that requires users to point at areas with an offset distance and tilt to click. Can users perform it efficiently and accurately? To answer this question, we conducted a controlled experiment to formally evaluate the performance of the new technique and the usability of the FingerTalkie device. Participants were asked to use the FingerTalkie to point and click within the tactile areas of predefined graphical shapes. We tested the following hypotheses:

  • H1: It is faster to select larger tactile areas than smaller ones.

  • H2: It is slower to perform a correct click for areas with sharper angles.

  • H3: It is more error-prone to select smaller tactile areas than larger ones.

  • H4: It is more error-prone to select shapes with sharper angles.

6.1 Design

We employed a \(4\times 3\) within-subject design with two independent factors: Shape (Circle, Square, Triangle, and Star) and Size (Small, Medium, Large). The tactile diagrams were made of flashcards measuring \(20\times 18\) cm. The tactile shapes were created by laser-cutting a thick paper board, which gave a 1.5 mm tactile elevation. We used four basic figures, circle, triangle, square, and star, chosen for their increasing number of edges and corners and decreasing angles between adjacent sides. We made three sizes (large, medium, and small) of each shape, as shown in Fig. 8. Each large shape was inscribed in a circle of 5 cm diameter; the medium size was set to 40% of the large size (2 cm) and the smallest to 20% (1 cm). According to tactile graphics guidelines [38], the minimum area that can be perceived on a tactile diagram is 25.4 mm \(\times \) 12.5 mm. We chose our smallest size slightly below this threshold to include the worst-case scenario.

All the elevated shapes were blue and the surrounding area white, as shown in Fig. 8. All the shapes were placed at the vertical center of the flashcard, with the bottom of each shape at a fixed distance from the bottom of the flashcard (Fig. 8). This was done to maintain consistency while exploring the shapes and to mitigate shape and size bias.

6.2 Participants

To eliminate biases caused by prior experience with tactile diagrams, we recruited 12 sighted users (5 female) and blindfolded them during the experiment. They were recruited from a local university and were aged between 25 and 35 years (Mean = 30, SD = 2.8). Eight of the 12 participants were right-handed. None of them had any prior experience with tactile diagrams.

6.3 Apparatus

The setup involved the finger-worn device connected to an Arduino Nano, which interfaced with a laptop. The testing table, shown in Fig. 9, had a fixed slot in which the flashcards could be removed and replaced manually by the moderator. A push button (Fig. 9) was placed beneath the flashcard slot to trigger the start command whenever the user was ready to explore the next diagram.

Fig. 8. Tactile flashcards (Color figure online)

Fig. 9. Testing setup

6.4 Task and Procedure

The experiment begins with a training session before the measured session. The participants are blindfolded and asked to wear the FingerTalkie prototype. During the training session, the participants are briefed about the purpose of the experiment and guided through the actions to be performed during the tests. A dummy tactile flashcard with a blue square (20 mm side) is used for the demo session. To avoid bias, the shape and position of the tactile image on the flashcard are not revealed. The participants are asked to explore the tactile flashcard and to point and click within the area of the tactile shape. When a click is received while pointing within the shape, Tone A ('Glass' sound file in the macOS sound effects) is played to notify the correct operation. When the point-and-click occurs outside the tactile boundary (the white area), Tone B ('Basso' sound file in the macOS sound effects) is played to denote the error. The participants are allowed to practice the clicks as many times as they want during the training session, which took about 5 min per participant.

During the measured session, the participants are asked to register correct clicks for the given tactile flashcards as quickly and accurately as possible. The moderator gives an audio cue to notify the participants every time a tactile flashcard is replaced. The participant then presses the start button at the bottom of the setup (Fig. 9), explores the flashcard, points within the boundary of the tactile area, and performs a click. Once a correct click is registered, the moderator replaces the flashcard and the participant starts the next trial, until all trials are finished. If the participant performs a wrong click, they can retry as many times as they want until the trial reaches the timeout (75 s). The order of trials in each condition is counterbalanced with a Latin square. This design results in 4 shapes \(\times \) 3 sizes \(\times \) 2 replications \(\times \) 12 participants = 288 measured trials.

Fig. 10. Mean task completion time of all shapes, classified by size

6.5 Data Collection

We collected: (1) Task Completion Time, measured from pressing the start button to achieving a correct click (a click within the boundary of the shape on the flashcard), and (2) Number of Errors, logged as the number of wrong clicks on each flashcard before the correct click was registered.

6.6 Results

We post-processed the collected data by removing four outliers that deviated from the mean by more than two standard deviations. A two-way repeated-measures ANOVA was then performed on Task Completion Time and Number of Errors, with Size and Shape as the independent variables. The mean time and mean number of errors for achieving a correct click for all shapes and sizes are shown in Fig. 10 and Fig. 11.

Fig. 11. Mean number of errors for the shapes, classified by size

H1-Task Completion Time (Size). There was a significant effect of Size on Task Completion Time [F(2,22) = 3.94, p < 0.05, \(\eta _{p}^{2}=0.264\)]. Post-hoc pairwise comparison showed a significant difference in Task Completion Time between the Large and Small tactile shapes (p < 0.005), with a mean time of 9.01 s (SD = 1.3) for Small and 5.1 s (SD = 0.662) for Large. The mean time for a correct click on the Medium size was 6.49 s (SD = 1.41), but there was no significant difference between Medium and Small or between Medium and Large. The mean task completion times across the small, medium, and large sizes show that the large versions of all shapes were easily identifiable. Hence, H1 is fully supported.

H2-Task Completion Time (Shape). H2 is partially supported. We found a significant effect of Shape on Task Completion Time [F(3,33) = 12.881, p < 0.05, \(\eta _{p}^{2}=0.539\)]. No significant interaction effect between Size and Shape was identified. Post-hoc pairwise comparison showed that, for the small size, the star took significantly longer than the triangle (p < 0.05), the circle (p < 0.05), and the square (p < 0.05), while the triangle took significantly longer than the square (p < 0.05). No significant difference was found between the square and the circle or between the triangle and the circle. For the medium size, significant differences in task completion time were found between the star and the triangle (p < 0.05), the star and the circle (p < 0.05), and the star and the square (p < 0.05), while there were no significant differences among the triangle, the circle, and the square. For the large size, there were no significant differences among the four shapes. The mean time for reaching a correct click for each shape at each size is shown in Fig. 10. We hypothesized (H2) that the sharper the angles of a shape, the longer it would take to achieve a correct click; as expected, smaller tactile areas were more sensitive to this effect. The results were as predicted, except that the circle performed worse than the square at all sizes, although this difference was not significant. We speculate that one major reason the square outperformed the circle is the rectangular shape of the color sensor, which aligns better with straight lines than with curves. While future investigation is needed, this points to the potential impact of the shape of the sensing area for any sensing technology used in this context.

H3-Number of Errors (Size). H3 is fully supported. We found a significant effect of Size on Number of Errors [F(2,22) = 9.82, p < 0.05, \(\eta _{p}^{2}=.472\)]. Post-hoc comparison showed that the small size yielded significantly more errors than the large size (p < 0.005) and the medium size (p < 0.05), while there was no significant difference between the medium and large sizes. The mean numbers of errors for the small, medium, and large shapes were 1.05 (SD = 0.225), 0.521 (SD = 0.235), and 0.26 (SD = 0.09), respectively. In general, the error rates were rather low: most trials were completed in one or two attempts, even at the smallest size.

H4-Number of Errors (Shape). H4 is partially supported, in a similar way to H2. There was a significant effect of Shape on Number of Errors [F(3,33) = 10.96, p < 0.001, \(\eta _{p}^{2}=0.499\)]. Post-hoc pairwise comparison showed that the star yielded significantly more errors than the square (small size: p < 0.005, medium size: p < 0.005, large size: p < 0.005), the triangle (small size: p < 0.05, medium size: p < 0.05, large size: p < 0.05), and the circle (small size: p < 0.005, medium size: p < 0.005, large size: p < 0.005). There was no significant difference between the square and the circle across sizes, whereas the square yielded significantly fewer errors than the triangle (small size: p < 0.05, medium size: p < 0.05, large size: p < 0.05). Figure 11 shows the detailed error counts across shapes and sizes. The error rates are consistent with the task completion times, which accords with our observation that failed attempts were a major cause of slower performance.

Overall, FingerTalkie was effective for selecting a wide range of tactile shapes. Participants could make a correct selection in one or two attempts in most cases, even when the size was smaller than the smallest tactile areas used in the real world. The effects of sharp angles appeared for smaller tactile shapes. The potential effect of the shape of the sensing area was uncovered and should be considered in the future development of similar technologies.

7 Focus-Group Interview with Blind Users

The aim of the focus-group interview was to obtain a deeper understanding of key factors such as wearability and form factor, novelty and usefulness of the device, difficulty of use, learnability, cost, audio data input, and the sharing interface.

7.1 Participants

The subjective feedback session was conducted with 8 congenitally blind participants: 1 adult male (Age = 35) and 7 children aged 11 to 14 (Mean = 13.0, SD = 1.0). All the participants were right-handed.

7.2 Apparatus

We used two tactile figures, squares of two different sizes (5 cm and 3 cm) placed side by side, to demonstrate the device. One square was filled with blue, the smaller one with red. Each color was annotated with a distinct audio clip played through the laptop speakers. The finger-worn device used for the evaluation was the standalone prototype, connected to an external battery pack via a USB cable.

7.3 Procedure

The hands-on session took place in an informal setup in which the participants were first briefed about the concept of a finger-wearable device and the nature of the problem it solves. The users were instructed to wear the device and guided to understand the position of the sensor at the tip. They were also instructed to touch the tip of the device to confirm its angle of tilt; in this way, they could get a clear sense of the distance of the sensor from the fingertip. The offset point-and-click mechanism was explained to each participant, and the whole process was administered by a sighted external helper. The participants were then asked to explore the tactile diagram and perform clicks freely within the boundaries of the squares. Tone A ('Glass' in the macOS sound effects) was played for correct clicks on the big and small squares, and Tone B ('Basso' in the macOS sound effects) was played for a wrong click outside the tactile boundary. Each participant used the device and performed clicks for approximately 10 min.

7.4 Results

After the exploratory hands-on session, all participants were asked to provide feedback regarding the following factors:

Usability of the Device. After wearing the device for about 5 min, all the users were impressed by its uniqueness. None of the participants had ever used a finger-wearable interactive device before. On the other hand, 3 of the 8 users had used or were familiar with the Pen Friend annotating pens [20] for audio-tactile markings. A Pen Friend user said, "Reusability of the colors is a really good feature as we don't have to worry about the tags running out." Another user said, "The best thing I like about the finger device [FingerTalkie] compared to Pen Friend is that I can use both my hands to explore the tactile diagrams." One user had prior experience with an image-processing-based audio-tactile system in which a smartphone camera is placed on a vertical stand above the tactile diagram; to use such a system, the user affixes a sticker to the index finger while exploring. This user stated, "Though this system enabled me to use both hands for tactile exploration, it was cumbersome to set up and calibrate the phone with the stand, and it sometimes didn't work as expected due to poor ambient lighting or improper positioning of the smartphone." While all the users agreed on the usefulness of the device for audio annotation of tactile graphics, some suggested further applications. A user stated, "I can use this device for annotating everyday objects like medicines and other personal artifacts identification. It will save me a lot of time in printing Braille and sticking it to the objects."

Learnability/Ease of Use/Adaptability. After wearing the device, the users were able to understand the relation of the sensor, its distance, and its angle to the tactile surface within a couple of minutes. Overall, the participants showed great interest in wearing it and exploring the different sounds while moving between the two tactile images. All the users stated that they could adapt to this clicking method by using it for a couple of hours. Asked about the ease of use, a participant stated, "This is like a magic device. I just have to tilt my (index) finger to get the audio description of the place being pointed to. Getting audio information from a tactile diagram has never been so easy." Another user said, "I have used a mobile phone application which can detect the boundaries of the tactile diagram using the camera and gives audio output corresponding to the area being pointed at and double-tapped. But for that, I need a stand on which the mobile phone must be fixed first, and I must also make sure that the room is well lit to get the best results. With this device, the advantage I find over the others is that it is lightweight and portable, and it works irrespective of the lighting conditions in the room."

Wearability. We observed that the finger-wearable device fit the index finger well for seven of the eight participants with only minor strap adjustments. The one exception was a case in which the device was extremely loose and tended to sway when the user tried to perform a click. One participant claimed, "I don't think it's complicated and I can wear it on my own. It is easy to wear and I can adjust it by myself." The device protruded beyond the index finger in half of the cases; however, this did not affect usability, and the users were still able to make the offset click without fail.

Need for a Mobile Application for User Data Input. The majority of users were eager to know the mechanism and software interface by which audio could be tagged to a specified color. The child participants were eager to know whether they would be able to do it on their own. Four out of five child participants insisted that a mobile or computer application should be made accessible to VI people so that they can do it without external assistance. A user said, "Being proficient in using smartphones, I am disappointed by the fact that most mobile applications are not designed with accessibility in mind, which renders them useless." One of the special educators said, "If the teachers could themselves make an audio-color profile for each diagram or chapter and then share it with the students, it would save a lot of time for both the students and the special educators."

In summary, the participants showed enthusiasm for using FingerTalkie in their daily and educational activities. Their feedback shows the promise of FingerTalkie for providing an intuitive and seamless user experience. Most participants expressed appreciation for the simple design of the device, and the offset point-and-click method appeared to be easy to learn and perform. Overall, the users liked the experience of FingerTalkie and suggested a sturdier design and an accessible back-end software system.

8 Limitations and Future Work

Though we addressed most of the usability and hardware drawbacks of FingerTalkie during the iterative process, the following factors could be improved in future designs.

During the entire design and evaluation process, we used only blue, green, and red in the tactile diagrams, chosen to achieve better detection accuracy. A better color sensor with noise-filtering algorithms and well-calibrated sensor positioning could help detect more colors reliably on a single tactile diagram.

Though the final prototype has a compact wearable form factor, it is still bulky because we used off-the-shelf hardware components. It could be further miniaturized with a custom PCB design and SMD electronic components. To achieve a comprehensive, ready-to-use system, an accessible and stable back-end PC software or mobile app should be developed in the near future; it should include features for creating and sharing audio-color mapping profiles. Last but not least, we will also explore other modalities of on-finger feedback (e.g., vibration [30], thermal [44], poking [17]) for VI users comprehending tactile diagrams.

9 Conclusion

In this paper, we introduced FingerTalkie, a novel finger-worn device with a new offset point-and-click technique that enables easy access to audio information on tactile diagrams. The design requirements and choices were established through an iterative user-centered design process. FingerTalkie is an easy-to-use, reliable, and inexpensive solution that can help VI users reduce the bulk of tactile textbooks by eliminating Braille legend pages. The offset point-and-click technique performs well even with the smallest tactile areas suggested by the tactile graphics guidelines. The subjective feedback from VI users shows high acceptance of FingerTalkie, particularly its support for two-handed exploration, compared with mainstream audio-tactile devices on the market. As high-contrast colored tactile diagrams gain popularity among people with low or partial vision, we aim to use the same printed colors as the color palette for FingerTalkie. In addition, we envision that FingerTalkie can be used not only by VI users but also by sighted users with special needs, such as the elderly and children, to annotate everyday physical objects such as medicine containers and textbooks. Due to the versatility of the point-and-click method, future researchers can adopt such techniques in other devices and systems where fingertips should not be occluded while performing touch input.