Electronic Navigation Project

Navigating around the university campus can be difficult for visitors and incoming staff and students, and is a particular challenge for those with vision impairments. The aim of this project is to develop a campus navigation system that can be accessed using any web-enabled smartphone and provides feedback suitable for both blind and sighted users.

Background

  • Large and increasing numbers of visitors and new students traverse UCC's campus each year, a significant percentage of whom are blind/visually-impaired.
  • Maps (downloadable from the university website) and directional signage are available, but these are not always convenient even for sighted users.
  • Blind/visually-impaired staff and students receive mobility training on key routes, but this is time-consuming and expensive to provide, and is not available to visitors.
  • Mobile and wireless technologies could be used to overcome these problems, delivering navigational information in both visual and non-visual forms (speech and sound, vibro-tactile feedback).

Phase 1: Feasibility Study

An initial study has been conducted and a report prepared, identifying the problems faced by people with visual impairments when navigating the UCC campus, and assessing potential solutions.

The report notes that there have been a number of recent attempts to develop efficient and cost-effective systems that enable blind and vision-impaired pedestrians to safely and independently navigate spaces with which they are not familiar. 

However, the report concludes that there is no off-the-shelf system that fully meets UCC's requirements. Most of the candidates fall short in one or more of the following respects:

  • accuracy or reliability of the localization information provided
  • ability to operate both indoors and outdoors
  • suitability of the feedback provided, e.g., listening to speech output makes it difficult to hear environmental sound
  • difficulty of use alongside other mobility aids, such as a long cane

Phase 2: Pilot Project

Based on recommendations made in the report, we are developing a pilot system that has two distinct layers:

  • a navigation layer that obtains data on position and orientation
  • an interface layer that makes this information available to the user.

Positioning technology is developing at a rapid pace: separating the navigation layer from the interface will make it easier to upgrade the system to take advantage of new technologies as they become available.
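This separation can be sketched in code. The sketch below is a minimal, hypothetical illustration (class and method names are our own, and the coordinates are placeholders): the interface layer depends only on an abstract position-provider contract, so the underlying positioning technology can be swapped without touching the interface code.

```python
from typing import Protocol, Tuple

class PositionProvider(Protocol):
    """Navigation-layer contract: any positioning technology implements this."""
    def position(self) -> Tuple[float, float]: ...
    def heading(self) -> float: ...

class BlePositionProvider:
    """Stub standing in for the Bluetooth 4.0 + WIMU navigation layer."""
    def position(self) -> Tuple[float, float]:
        return (51.8935, -8.4920)   # placeholder campus coordinate
    def heading(self) -> float:
        return 90.0                 # degrees clockwise from north

class InterfaceLayer:
    """Consumes any PositionProvider; upgrading the navigation layer
    requires no change here."""
    def __init__(self, provider: PositionProvider) -> None:
        self.provider = provider
    def describe(self) -> str:
        lat, lon = self.provider.position()
        return f"lat {lat:.4f}, lon {lon:.4f}, heading {self.provider.heading():.0f} deg"

ui = InterfaceLayer(BlePositionProvider())
print(ui.describe())
```

A future provider (e.g. one based on a newer positioning technology) would simply implement the same two methods.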

The Navigation Layer will initially be based on Ultra Low Power (ULP) Bluetooth 4.0 combined with Wireless Inertial Measurement Units (WIMUs).

Bluetooth 4.0

  • can be used to provide a low-cost location-based system
  • offers a low energy wireless solution 
  • offers multi-vendor interoperability. 
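One common way such beacons yield location data is by converting received signal strength (RSSI) into an approximate distance. The sketch below uses the standard log-distance path-loss model; the calibration value and path-loss exponent are illustrative assumptions, not measured parameters of this project.

```python
import math

def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -59.0,
                     path_loss_exponent: float = 2.0) -> float:
    """Log-distance path-loss model: d = 10^((TxPower - RSSI) / (10 n)).

    tx_power_dbm is the calibrated RSSI at 1 m; the exponent n depends
    on the environment (roughly 2 in free space, higher indoors).
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# A reading equal to the 1 m calibration value implies ~1 m.
print(round(rssi_to_distance(-59.0), 1))   # 1.0
# A 20 dB weaker reading implies roughly 10 m with n = 2.
print(round(rssi_to_distance(-79.0), 1))   # 10.0
```

In practice, distances estimated from several beacons would be combined (e.g. by trilateration) to obtain a position.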

WIMUs

  • provide data from accelerometer, gyroscope and magnetometer sensors
  • capture tilt, force and timing data, offering a wearable alternative to optical 3D motion capture.
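As a minimal illustration of the tilt data such units provide, the sketch below recovers pitch and roll from a static accelerometer reading, assuming the only acceleration acting on the sensor is gravity (in g units). This is a textbook formula, not this project's fusion algorithm.

```python
import math

def tilt_from_accel(ax: float, ay: float, az: float) -> tuple:
    """Pitch and roll (degrees) from a static accelerometer reading (g units)."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat: gravity entirely on the z axis -> pitch and roll both ~0.
print(tilt_from_accel(0.0, 0.0, 1.0))
```

A real WIMU pipeline would fuse these readings with gyroscope and magnetometer data to track orientation while the user is moving.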

User Interface Layer

The User Interface Layer has been designed with both guide-dog and long-cane users in mind.

It will incorporate three audio feedback options: simple audio, speech, and spatial audio.

Haptic feedback will be provided as pulsed vibrations of varying intensity, used to indicate key features and areas of the campus.
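One simple way to realise this is a lookup table mapping feature types to pulse patterns, with stronger patterns for more safety-critical features. The feature names and pattern values below are hypothetical placeholders, not the project's actual vocabulary.

```python
# Hypothetical mapping of campus feature types to vibration patterns.
# Each pattern is (pulse_count, pulse_duration_ms).
HAPTIC_PATTERNS = {
    "waypoint":    (1, 100),   # single short pulse
    "junction":    (2, 100),   # two short pulses
    "destination": (3, 200),   # three long pulses
    "hazard":      (5, 300),   # strong repeated pulses
}

def pattern_for(feature: str) -> tuple:
    """Return the vibration pattern for a feature, defaulting to a waypoint."""
    return HAPTIC_PATTERNS.get(feature, HAPTIC_PATTERNS["waypoint"])

print(pattern_for("junction"))   # (2, 100)
```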

Other key components of the user interface include:

  • A ‘You are Here’ facility that reports the user's current location on campus on request
  • A ‘Route’ facility that describes the best route from the user's current location, or from a particular building, to their required destination
  • A ‘Near me’ facility that locates the nearest instance of a required service on campus
  • An ‘Orientation’ facility, based on compass feedback, to help users who become disorientated while on campus
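The core of a ‘Near me’ lookup can be sketched as a nearest-neighbour search over known service locations. The services and their planar campus-grid coordinates (in metres) below are invented for illustration only.

```python
import math

# Hypothetical campus services with planar campus-grid coordinates (metres).
SERVICES = {
    "library":   (120.0, 80.0),
    "cafeteria": (40.0, 200.0),
    "restrooms": (60.0, 60.0),
}

def nearest_service(x: float, y: float) -> str:
    """Return the service closest to the user's current position."""
    return min(SERVICES, key=lambda name: math.hypot(SERVICES[name][0] - x,
                                                     SERVICES[name][1] - y))

print(nearest_service(50.0, 70.0))   # restrooms
```

A production version would use the navigation layer's position fix as the query point and a richer points-of-interest database.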

Map-based visual feedback will be provided for sighted users.

 
