Authors: Rishad Shafik, Alex Yakovlev, Ole-Christoffer Granmo
Published: 2023, International Symposium on the Tsetlin Machine
The presentation showcases a detailed study aimed at systematically searching for the optimal hyperparameters of the Tsetlin Machine when applied to the MNIST dataset. This research addresses hyperparameter tuning, a crucial step that significantly impacts the performance of Tsetlin Machines, particularly in image recognition tasks. The presenter details the methodology used to explore various hyperparameter combinations, including the number of clauses, the feedback threshold T, and the specificity s, which plays a learning-rate-like role in Tsetlin Machine training.
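To make these parameters concrete, the sketch below instantiates a multi-class Tsetlin Machine on booleanized MNIST. It is a minimal sketch, assuming the open-source pyTsetlinMachine package and Keras solely for loading the data; the hyperparameter values shown are illustrative defaults, not the settings reported in the presentation.

```python
# A minimal sketch, assuming the pyTsetlinMachine package and Keras
# (used here only to load MNIST). Hyperparameter values are illustrative,
# not the settings reported in the presentation.
import numpy as np
from keras.datasets import mnist
from pyTsetlinMachine.tm import MultiClassTsetlinMachine

# Tsetlin Machines operate on Boolean features, so each grey-scale pixel
# is thresholded to 0/1 before training.
(X_train, Y_train), (X_test, Y_test) = mnist.load_data()
X_train = np.where(X_train.reshape((X_train.shape[0], 28 * 28)) > 75, 1, 0)
X_test = np.where(X_test.reshape((X_test.shape[0], 28 * 28)) > 75, 1, 0)

# The three hyperparameters discussed above, passed positionally:
# number of clauses per class, feedback threshold T, and specificity s.
tm = MultiClassTsetlinMachine(2000, 50, 10.0)

tm.fit(X_train, Y_train, epochs=10)
accuracy = (tm.predict(X_test) == Y_test).mean()
print(f"Test accuracy: {accuracy:.4f}")
```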
The study employs a structured approach to evaluate the effects of these parameters on the accuracy and computational efficiency of the Tsetlin Machine. Through a series of experiments, it demonstrates how different settings affect the learning process and prediction accuracy, providing valuable insights into the configurations that yield the best results for handwritten digit recognition on the MNIST dataset.
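A hedged sketch of such a sweep is given below: a simple grid over clause count, threshold T, and specificity s, recording test accuracy and wall-clock training time per configuration. The grid values and epoch budget are placeholders, not the search space actually used in the study.

```python
# A sketch of a systematic hyperparameter sweep over (clauses, T, s),
# recording accuracy and wall-clock training time. Grid values and the
# epoch budget are placeholders, not the study's actual search space.
import time
from itertools import product

import numpy as np
from keras.datasets import mnist
from pyTsetlinMachine.tm import MultiClassTsetlinMachine

(X_train, Y_train), (X_test, Y_test) = mnist.load_data()
X_train = np.where(X_train.reshape((X_train.shape[0], 28 * 28)) > 75, 1, 0)
X_test = np.where(X_test.reshape((X_test.shape[0], 28 * 28)) > 75, 1, 0)

results = []
for clauses, T, s in product([500, 1000, 2000], [25, 50], [5.0, 10.0]):
    tm = MultiClassTsetlinMachine(clauses, T, s)
    start = time.time()
    tm.fit(X_train, Y_train, epochs=10)  # short budget per configuration
    elapsed = time.time() - start
    accuracy = (tm.predict(X_test) == Y_test).mean()
    results.append((clauses, T, s, accuracy, elapsed))
    print(f"clauses={clauses} T={T} s={s}: "
          f"accuracy={accuracy:.4f}, train time={elapsed:.1f}s")

# Rank by accuracy, breaking ties in favour of faster training,
# capturing the accuracy/efficiency trade-off discussed above.
best = max(results, key=lambda r: (r[3], -r[4]))
print("Best configuration (clauses, T, s, accuracy, seconds):", best)
```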
This presentation not only highlights the flexibility and potential of Tsetlin Machines in handling complex datasets like MNIST but also provides a roadmap for researchers and practitioners to fine-tune their machine learning models effectively. The systematic search for optimal hyperparameters opens up new possibilities for enhancing the performance of logical learning models in various applications.