Low-Power Keyword Spotting AI: The Energy-Efficient Power of Tsetlin Machines

Discover how Tsetlin Machines transform audio keyword spotting with superior efficiency and reduced complexity.

Authors include Tousif Rahman, Rishad Shafik, Adrian Wheeldon, Alex Yakovlev, and Ole-Christoffer Granmo

Published: 2021, Journal of Low Power Electronics and Applications

Summary

Tsetlin Machines introduce a paradigm shift in low-power audio keyword spotting, offering a markedly more efficient alternative to traditional neural network approaches.

This paper explores the deployment of Tsetlin Machines (TMs) to improve the efficiency of keyword spotting (KWS) systems, a domain traditionally dominated by neural networks (NNs). TMs combine propositional logic with learning automata to achieve high accuracy at much lower computational complexity. By significantly reducing the number of parameters and relying on logic-based processing rather than arithmetic, TMs reduce power consumption without compromising learning efficacy.
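To make the logic-based processing concrete, below is a minimal Python sketch of TM-style inference as commonly described in the TM literature: each clause is a conjunction (AND) of included Boolean literals, and a class score is a vote over clause outputs. This is an illustrative sketch only; the names `Clause`, `evaluate_clause`, and `class_score`, and the toy example at the end, are hypothetical and not taken from the paper.

```python
# Illustrative sketch of Tsetlin Machine inference (not the paper's code).
# A clause is an AND over included Boolean literals; the class score is the
# number of firing positive-polarity clauses minus firing negative ones.

from typing import List, Tuple

# (include_positive, include_negative): for feature k, include_positive[k]
# includes literal x_k, include_negative[k] includes literal NOT x_k.
Clause = Tuple[List[bool], List[bool]]


def evaluate_clause(clause: Clause, x: List[int]) -> int:
    """Return 1 if every included literal is satisfied by Boolean input x, else 0."""
    include_pos, include_neg = clause
    for k, bit in enumerate(x):
        if include_pos[k] and bit == 0:
            return 0
        if include_neg[k] and bit == 1:
            return 0
    return 1


def class_score(pos_clauses: List[Clause], neg_clauses: List[Clause], x: List[int]) -> int:
    """Positive clauses add evidence for the class, negative clauses subtract it."""
    return sum(evaluate_clause(c, x) for c in pos_clauses) - \
           sum(evaluate_clause(c, x) for c in neg_clauses)


# Toy example with two Boolean features.
pos = [([True, False], [False, True])]   # fires when x_0 AND NOT x_1
neg = [([False, False], [True, False])]  # fires when NOT x_0
print(class_score(pos, neg, [1, 0]))     # 1 (positive clause fires)
print(class_score(pos, neg, [0, 0]))     # -1 (negative clause fires)
```

Because inference reduces to comparisons and counting rather than multiply-accumulate operations, this style of computation maps naturally onto low-power hardware.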

A notable property of the TM is its ability to handle a growing set of keywords with only a small increase in energy usage. The research highlights how TMs, using purely logical operations, converge faster than NNs, a learning-rate advantage that matters for real-time applications. In comparative analyses, TMs not only show a lower memory footprint but also achieve competitive accuracy against conventional models such as SVMs and Random Forests.
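The learning mechanism behind this convergence behaviour is the Tsetlin Automaton, a finite-state machine that decides whether to include or exclude each literal in a clause. The sketch below is a simplified illustration under my own naming (`TsetlinAutomaton`, `n_states_per_action`), not the paper's implementation; it shows how learning needs only bounded increments and comparisons, no arithmetic-heavy updates.

```python
# Hedged sketch of a two-action Tsetlin Automaton (illustrative only).
# States 1..n choose "exclude", states n+1..2n choose "include"; rewards push
# the state deeper into the current action, penalties push it toward the
# opposite action.

class TsetlinAutomaton:
    def __init__(self, n_states_per_action: int = 100):
        self.n = n_states_per_action
        self.state = self.n  # start at the boundary, on the "exclude" side

    def action(self) -> str:
        return "include" if self.state > self.n else "exclude"

    def reward(self) -> None:
        # Reinforce the current action by moving away from the boundary.
        if self.action() == "include":
            self.state = min(self.state + 1, 2 * self.n)
        else:
            self.state = max(self.state - 1, 1)

    def penalize(self) -> None:
        # Weaken the current action by moving toward (and across) the boundary.
        if self.action() == "include":
            self.state -= 1
        else:
            self.state += 1


ta = TsetlinAutomaton(n_states_per_action=3)
print(ta.action())        # exclude
ta.penalize()             # pushed across the boundary
print(ta.action())        # include
ta.reward(); ta.reward()  # reinforced; state saturates at 2 * n
print(ta.state)           # 6
```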

The paper argues convincingly for the potential of Tsetlin Machines in revolutionizing low-power, on-chip KWS applications. With their ability to process complex pattern recognition tasks efficiently, TMs are poised to become an essential tool in the development of energy-efficient AI technologies.

Read more

The full paper, "Low-Power Audio Keyword Spotting using Tsetlin Machines", is available from the Journal of Low Power Electronics and Applications.

Article

First published by the Journal of Low Power Electronics and Applications on 27 January 2021.

Preprint: arXiv:2101.11336