Edge AI: REDRESS Methodology Enhances Tsetlin Machines with Superior Compression

Discover how REDRESS transforms edge AI with Tsetlin Machines, achieving unprecedented compression and efficiency.

Authors include Tousif Rahman, Rishad Shafik, Alex Yakovlev, and Ole-Christoffer Granmo.

Published: 2023, IEEE Transactions on Pattern Analysis and Machine Intelligence

Summary

The REDRESS methodology advances edge AI with Tsetlin Machines, delivering substantial gains in model compression and inference speed.

The REDRESS project enhances the Tsetlin Machine (TM) for edge inference, addressing challenges in memory footprint, energy consumption, and computational efficiency. The TM is a machine learning algorithm that uses teams of learning automata to learn propositional logic clauses, producing models that are substantially simpler and smaller than conventional deep models. By introducing an encoding strategy called include-encoding, which stores only the information essential for inference, REDRESS achieves over 99% compression. Crucially, TMs can then operate directly on the compressed data without decompression, enabling rapid multi-class classification on low-power devices such as microcontrollers.
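To make the include-encoding idea concrete, here is a minimal, hypothetical sketch (not the paper's actual implementation): a trained TM clause is a conjunction of binarised input literals, and after training most literals are excluded, so storing only the indices of the included literals compresses the clause while still allowing direct evaluation.

```python
# Hypothetical sketch of include-encoding for a single TM clause.
# A clause is an AND over "included" literals; after training, most
# literals are excluded, so storing only included indices is compact.

def compress_clause(include_mask):
    """Keep only the indices of included literals (the compression idea)."""
    return [i for i, inc in enumerate(include_mask) if inc]

def eval_clause(included_indices, literals):
    """A clause fires iff every included literal is 1 --
    evaluated directly on the compressed form, no decompression."""
    return all(literals[i] for i in included_indices)

# Toy example: 8 literals, only 2 included after training.
include_mask = [0, 1, 0, 0, 0, 0, 1, 0]
compressed = compress_clause(include_mask)   # -> [1, 6]

literals = [1, 1, 0, 1, 0, 0, 1, 1]
print(compressed, eval_clause(compressed, literals))
```

In a full multi-class TM, many such clauses vote for or against each class and the class with the highest vote sum wins; the sketch above only illustrates why the compressed form remains directly executable.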

The design is validated on benchmarks including MNIST and CIFAR2, where it outperforms Binary Neural Network (BNN) models in both speed and energy efficiency. The paper also introduces a novel training procedure, Tsetlin Automata Re-profiling, which increases model sparsity to further improve performance and reduce resource use.

REDRESS not only pushes the boundaries of what's possible in edge computing with Tsetlin Machines but also sets a new standard for deploying AI in power-sensitive applications, making it a game-changer for the future of embedded AI systems.

Read more

The full paper REDRESS: Generating Compressed Models for Edge Inference Using Tsetlin Machines is available from IEEE Transactions on Pattern Analysis and Machine Intelligence.

Article

First published by IEEE Transactions on Pattern Analysis and Machine Intelligence on 19 April 2023.

DOI: 10.1109/TPAMI.2023.3268415