ProtoNN: Compressed and Accurate kNN for Resource-scarce Devices

Gupta, C.; Suggala, A.S.; Gupta, A.; Simhadri, H.V.; Paranjape, B.; Kumar, A.; Goyal, S.; Udupa, M.; Jain, P.; Varma, M. (2017) ProtoNN: Compressed and Accurate kNN for Resource-scarce Devices. In: Proceedings of the 34th International Conference on Machine Learning, 2017, Sydney, Australia.

Full text not available from this repository.

Abstract

Several real-world applications require real-time prediction on resource-scarce devices such as an Internet of Things (IoT) sensor. Such applications demand prediction models with small storage and computational complexity that do not compromise significantly on accuracy. In this work, we propose ProtoNN, a novel algorithm that addresses the problem of real-time and accurate prediction on resource-scarce devices. ProtoNN is inspired by k-Nearest Neighbor (kNN) but has several orders of magnitude lower storage and prediction complexity. ProtoNN models can be deployed even on devices with very limited storage and computational power (e.g. an Arduino UNO with 2kB RAM) while retaining excellent prediction accuracy. ProtoNN derives its strength from three key ideas: a) learning a small number of prototypes to represent the entire training set, b) a sparse low-dimensional projection of the data, and c) joint discriminative learning of the projection and prototypes under an explicit model size constraint. We conduct a systematic empirical evaluation of ProtoNN on a variety of supervised learning tasks (binary, multi-class, and multi-label classification) and show that it gives nearly state-of-the-art prediction accuracy on resource-scarce devices while requiring several orders of magnitude less storage and minimal working memory.
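
The three ideas above combine into a very small prediction step at deployment time. The following is a minimal NumPy sketch of a ProtoNN-style scoring rule, assuming the usual formulation (an RBF similarity between the projected input and each learned prototype, aggregated through per-prototype label vectors); the function and variable names are illustrative, and the random parameters below are placeholders for the model that ProtoNN learns jointly under its size constraint.

    import numpy as np

    def protonn_predict(x, W, B, Z, gamma):
        """Score one example with a ProtoNN-style model (illustrative sketch).

        x     : (d,)        input feature vector
        W     : (d_hat, d)  sparse low-dimensional projection (learned)
        B     : (d_hat, m)  m prototypes in the projected space (learned)
        Z     : (L, m)      label vector attached to each prototype (learned)
        gamma : float       RBF kernel width
        """
        z = W @ x                                        # project into d_hat dimensions
        dists = np.sum((B - z[:, None]) ** 2, axis=0)    # squared distance to each prototype
        sim = np.exp(-(gamma ** 2) * dists)              # RBF similarity to each prototype
        scores = Z @ sim                                 # aggregate prototype label vectors
        return int(np.argmax(scores))                    # predicted class index

    # Toy usage with random placeholder parameters.
    rng = np.random.default_rng(0)
    d, d_hat, m, L = 32, 8, 10, 3
    W = rng.standard_normal((d_hat, d)) * (rng.random((d_hat, d)) < 0.2)  # sparse projection
    B = rng.standard_normal((d_hat, m))
    Z = rng.standard_normal((L, m))
    print(protonn_predict(rng.standard_normal(d), W, B, Z, gamma=1.0))

Because only W, B, and Z need to be stored and the projection is sparse, the deployed model can be kept within a tight, pre-specified memory budget, which is what the explicit model size constraint enforces during training.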

Item Type: Conference or Workshop Item (Paper)
Source: Copyright of this article belongs to Association for Computing Machinery.
ID Code: 119558
Deposited On: 14 Jun 2021 09:58
Last Modified: 14 Jun 2021 09:58
