Computer Science > Robotics
[Submitted on 19 Jul 2023]
Title: AcousTac: Tactile sensing with acoustic resonance for electronics-free soft skin
Abstract: Sound is a rich information medium that transmits through air; people communicate through speech and can even discern material by tapping and listening. To capture frequencies in the human hearing range, commercial microphones typically have a sampling rate of over 40 kHz. These accessible acoustic technologies are not yet widely adopted for the explicit purpose of giving robots a sense of touch. Some researchers have used sound to sense tactile information, both by monitoring the ambient soundscape and by embedding speakers and microphones to measure sounds within structures. However, these options commonly do not provide a direct measure of steady-state force, or they require electronics integrated somewhere near the contact location. In this work, we present AcousTac, an acoustic tactile sensor for electronics-free, force-sensitive soft skin. Compliant silicone caps and plastic tubes compose resonant chambers that emit pneumatic-driven sound measurable with a conventional off-board microphone. The resulting frequency changes depend on the external loads on the compliant end caps. We can tune each AcousTac taxel to specific force and frequency ranges based on geometric parameters, including tube length and end-cap geometry, and thus uniquely sense each taxel simultaneously in an array. We demonstrate AcousTac's functionality on two robotic systems: a 4-taxel array and a 3-taxel astrictive gripper. AcousTac is a promising concept for force sensing on soft robotic surfaces, especially in situations where electronics near the contact are not suitable. Equipping robots with tactile sensing and soft skin provides them with a sense of touch and the ability to safely interact with their surroundings.
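The abstract states that each taxel's emitted frequency is set by its tube length and shifts under load on the end cap. A minimal sketch of that relationship, assuming a simple quarter-wave model for a tube closed at one end (the paper's exact acoustic model, including any end corrections or load-to-length mapping, is not given in the abstract):

```python
# Hedged sketch: model an AcousTac taxel as a quarter-wave resonator,
# i.e. a tube closed at one end by the compliant silicone cap.
# This is an assumption for illustration, not the authors' stated model.

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C

def quarter_wave_frequency(tube_length_m: float, c: float = SPEED_OF_SOUND) -> float:
    """Fundamental resonance (Hz) of a tube closed at one end: f = c / (4 L)."""
    return c / (4.0 * tube_length_m)

# A load that compresses the cap effectively shortens the resonant cavity,
# raising the emitted frequency -- consistent with the abstract's claim that
# frequency changes depend on external loads on the end caps.
unloaded = quarter_wave_frequency(0.020)           # 20 mm tube
loaded = quarter_wave_frequency(0.020 - 0.002)     # cap compressed by 2 mm (hypothetical)
```

Under this model, distinct tube lengths map to distinct frequency bands, which is one plausible reading of how an array of taxels could be sensed simultaneously with a single off-board microphone.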