Show simple item record

dc.contributor.author: Sinha, Kaushik
dc.contributor.author: Ram, Parikshit
dc.date.accessioned: 2021-10-18T02:12:09Z
dc.date.available: 2021-10-18T02:12:09Z
dc.date.issued: 2021-08-14
dc.identifier.citation: Sinha, K., & Ram, P. (2021). Fruit-fly inspired neighborhood encoding for classification. In Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2021), 1470-1480. doi:10.1145/3447548.3467246
dc.identifier.isbn: 978-1-4503-8332-5
dc.identifier.uri: https://doi.org/10.1145/3447548.3467246
dc.identifier.uri: https://soar.wichita.edu/handle/10057/22206
dc.description: Click on the DOI link to access this conference paper at the publisher's website (may not be free).
dc.description.abstract: Inspired by the fruit-fly olfactory circuit, the Fly Bloom Filter [4] can efficiently summarize the data in a single pass and has been used for novelty detection. We propose a new classifier that effectively encodes the different local neighborhoods for each class with a per-class Fly Bloom Filter. Inference on test data requires an efficient FlyHash [6] operation followed by a high-dimensional, but very sparse, dot product with the per-class Bloom filters. On the theoretical side, we establish conditions under which the predictions of our proposed classifier agree with the predictions of the nearest-neighbor classifier. We extensively evaluate our proposed scheme on 71 data sets of varied dimensionality to demonstrate that the predictive performance of our neuroscience-inspired classifier is competitive with nearest-neighbor classifiers and other single-pass classifiers.
dc.description.sponsorship: We plan to pursue theoretical guarantees for FBFC and FBFC★ in general ℝ^d by exploring data-dependent assumptions such as a doubling measure. While our theoretical results connect FBFC to 1NNC, thereby inheriting its generalization guarantees, in our empirical evaluations we also compared our proposed schemes to the more general NNC (which has better generalization guarantees). Our empirical evaluations indicate that FBFC★ significantly outperforms 1NNC, while matching NNC in most cases and at times outperforming it. This motivates us to study, in future work, the conditions under which FBFC/FBFC★ matches NNC. Additionally, utilizing the sparse and randomized nature of FBFC, we will investigate its differential-privacy-preserving properties as well as its robustness to benign and adversarial perturbations. Finally, we believe that FBFC can be adapted to handle concept drift and distribution shift when learning with data streams (online learning) by being able to forget past examples. Acknowledgement: KS gratefully acknowledges funding from NSF award FAIN 2019844.
dc.language.iso: en_US
dc.publisher: Association for Computing Machinery
dc.relation.ispartofseries: 27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2021;
dc.subject: Nearest-neighbor classification
dc.subject: Randomized algorithm
dc.subject: Bio-inspired
dc.title: Fruit-fly inspired neighborhood encoding for classification
dc.type: Conference paper
dc.rights.holder: © 2021 Association for Computing Machinery.
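
A minimal sketch of the pipeline the abstract describes may help make the moving parts concrete: FlyHash (here assumed to be a sparse binary random projection followed by winner-take-all selection of the top k activations) produces a high-dimensional sparse code, each class's training codes are folded into a per-class Fly Bloom Filter in a single pass, and a test point is classified via sparse dot products with those filters. The NumPy code below is an illustrative, simplified binary variant only; the class name FBFCSketch, the hyperparameters m, k, and proj_density, the hard bit-clearing update, and the lowest-novelty-score decision rule are assumptions made for this sketch, not the paper's exact FBFC/FBFC★ construction.

    import numpy as np

    rng = np.random.default_rng(0)

    def flyhash(X, M, k):
        # Assumed FlyHash-style encoding: project with a sparse binary random
        # matrix M, then keep only the k largest activations per point as 1s
        # (winner-take-all), giving a high-dimensional, very sparse binary code.
        A = X @ M
        H = np.zeros_like(A)
        top = np.argpartition(-A, k, axis=1)[:, :k]
        np.put_along_axis(H, top, 1.0, axis=1)
        return H

    class FBFCSketch:
        # Hypothetical, simplified per-class Fly Bloom Filter classifier.
        def __init__(self, d, m=2048, k=32, proj_density=0.1):
            self.k = k
            self.M = (rng.random((d, m)) < proj_density).astype(float)
            self.filters = {}  # class label -> m-dimensional filter

        def fit(self, X, y):
            # Single pass: for each class, clear the filter bits that any of its
            # training points activates ("familiarize" the filter with the class).
            y = np.asarray(y)
            H = flyhash(X, self.M, self.k)
            for c in np.unique(y):
                w = np.ones(self.M.shape[1])
                w[H[y == c].any(axis=0)] = 0.0
                self.filters[c] = w
            return self

        def predict(self, X):
            # Novelty score per class = sparse dot product of the test code with
            # that class's filter; predict the class where the point is least novel.
            H = flyhash(X, self.M, self.k)
            classes = list(self.filters)
            scores = np.stack([H @ self.filters[c] for c in classes], axis=1)
            return np.array(classes)[scores.argmin(axis=1)]

In this sketch, training touches each example once, and because clearing filter bits is order-independent the per-class summaries could also be accumulated over a data stream; prediction costs one FlyHash plus one sparse dot product per class.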


Files in this item


There are no files associated with this item.
