
Owli-AI Research Project

AHRUS: Audible High Resolution Ultrasonic Sonar

AHRUS is an electronic guide-dog concept that supports echolocation by making ultrasound audible. The system aims to make obstacles and structures perceivable earlier, without replacing natural spatial perception with headphones.

Prototype


  • echolocation
  • ultrasound
  • assistive technology
  • spatial hearing
  • obstacle detection

Project description

Goal

The AHRUS project examines how ultrasound can serve as an additional orientation channel for blind and visually impaired people. The goal is practical support in daily life, especially for small structures and surfaces that are otherwise hard to perceive.

How it works (short)

A modulated, focused ultrasound beam is emitted and partially converted into audible sound by nonlinear acoustics in air (self-demodulation). Reflections from objects can then be heard with the user's own ears and located spatially.
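The self-demodulation step can be sketched numerically with Berktay's classic far-field approximation for parametric arrays, in which the audible pressure is proportional to the second time derivative of the squared envelope of the ultrasound carrier. The frequencies and modulation index below are illustrative assumptions, not AHRUS parameters:

```python
import numpy as np

# Berktay far-field approximation (sketch): the demodulated audible
# pressure is proportional to d^2/dt^2 of the squared AM envelope.
# All numbers here are illustrative, not the AHRUS design values.
fs = 1_000_000                     # simulation sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)     # 10 ms of signal
f_audio = 1_000                    # audible modulation tone, Hz
m = 0.8                            # modulation index

envelope = 1 + m * np.cos(2 * np.pi * f_audio * t)   # AM envelope
demod = np.gradient(np.gradient(envelope**2, t), t)  # ∝ audible pressure

# Locate the strongest spectral component of the demodulated signal
spectrum = np.abs(np.fft.rfft(demod * np.hanning(len(demod))))
freqs = np.fft.rfftfreq(len(demod), 1 / fs)
peak = freqs[np.argmax(spectrum)]
print(f"dominant demodulated component: {peak:.0f} Hz")
```

Squaring the envelope produces audible components at the modulation frequency and its second harmonic, which is why the demodulated tone lands back in the hearing range even though the carrier itself is ultrasonic.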

What is new compared to classic echolocation?

Classic active echolocation with tongue clicks uses longer wavelengths and is therefore less selective for fine structures. AHRUS uses short ultrasound wavelengths and enables narrowly directed scanning, which can improve the differentiation of structure boundaries and small obstacles in specific scenarios.
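The wavelength argument can be made concrete with a rough calculation: structures much smaller than one wavelength reflect sound poorly, so shorter wavelengths resolve finer detail. The frequencies used here (about 3 kHz for a tongue click, a 40 kHz ultrasound carrier) are illustrative assumptions, not AHRUS specifications:

```python
# Rough wavelength comparison: lambda = c / f
# Frequencies are illustrative assumptions, not AHRUS specs.
c = 343.0  # speed of sound in air at ~20 °C, m/s

for label, f in [("tongue click (~3 kHz)", 3_000),
                 ("ultrasound carrier (~40 kHz)", 40_000)]:
    wavelength_mm = c / f * 1000
    print(f"{label}: wavelength ≈ {wavelength_mm:.1f} mm")
```

Under these assumptions the click wavelength is on the order of 11 cm, while the ultrasound wavelength is under 1 cm, which is why a narrow ultrasound beam can separate structure boundaries that a broad click cannot.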

Current status (prototype and first tests)

A functional prototype exists. In an initial evaluation with four participants, distance, direction, width estimation, and boundary perception were analyzed and compared with classic flash sonar.

Outputs

Thirteen figures from the AHRUS paper.

  1. Schematic view of a sound source moving in azimuth and elevation.
    Fig. 1 Overall directional representation.
  2. Detailed azimuth view with lateral sound source movement.
    Fig. 1 Detail left (azimuth).
  3. Detailed elevation view with vertical sound source movement.
    Fig. 1 Detail right (elevation).
  4. Principle diagram of self-demodulation of modulated ultrasound in air.
    Fig. 2 Principle of ultrasound demodulation.
  5. Photo of the AHRUS prototype with housing and transducer array.
    Fig. 3 Prototype implementation.
  6. Detailed photo of the circular transducer array on the AHRUS prototype.
    Fig. 3 Transducer array detail.
  7. Block diagram with DSP, configuration, Bluetooth, and ultrasound transducers.
    Fig. 4 AHRUS system design overview.
  8. Bar chart on perception of distance, direction, and boundary zones in four participants.
    Fig. 5 Results for distance, direction, and boundary perception.
  9. Bar chart on distance threshold in obstacle detection for column and car.
    Fig. 6 Distance threshold in obstacle detection.
  10. Bar chart on deviation in obstacle width estimation.
    Fig. 7 Obstacle width estimation.
  11. Comparison of sound intensity and directivity of tongue click and AHRUS.
    Fig. 8 Overall comparison of directivity.
  12. Detail view of broad sound radiation of a tongue click in flash sonar.
    Fig. 8 Detail left (flash sonar).
  13. Detail view of focused sound radiation in the AHRUS system.
    Fig. 8 Detail right (AHRUS).