Improved nearest neighbor search using auxiliary information and priority functions

Authors
Keivani, Omid
Sinha, Kaushik
Issue Date
2018-05
Type
Conference paper
Keywords
Artificial intelligence, Forestry, Learning systems, Trees (mathematics)
Citation
Keivani, Omid; Sinha, Kaushik. 2018. Improved nearest neighbor search using auxiliary information and priority functions. 35th International Conference on Machine Learning, ICML 2018, vol. 6, pp. 4017-4029
Abstract

Nearest neighbor search using random projection trees has recently been shown to outperform locality sensitive hashing based methods, achieving better accuracy while retrieving fewer data points. However, to achieve acceptable nearest neighbor search accuracy in large scale applications, where the number of data points and/or the number of features can be very large, users must maintain, store, and search through a large number of independent random projection trees, which may be undesirable in many practical applications. To address this issue, in this paper we present different search strategies that improve the nearest neighbor search performance of a single random projection tree. Our approach exploits properties of single and multiple random projections, which allow us to store meaningful auxiliary information at the internal nodes of a random projection tree and to design priority functions that guide the search process, resulting in improved nearest neighbor search performance. Empirical results on multiple real world datasets show that our proposed method improves the search accuracy of a single tree compared to baseline methods.
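The paper defines its own auxiliary information and priority functions, which are not reproduced on this page. As a rough illustration only, the sketch below builds a random projection tree and guides the search of a single tree with one common choice of priority: the distance of the query's projection to a node's split threshold, so that branches whose splitting hyperplane passes close to the query are revisited first. All names and the specific priority function are illustrative assumptions, not the authors' exact method.

```python
import heapq
import math
import random

class Node:
    """A node of a random projection tree; leaves hold point indices."""
    def __init__(self, points=None):
        self.points = points      # leaf: list of indices into the dataset
        self.direction = None     # internal: random projection direction
        self.threshold = None     # internal: median of projected values
        self.left = None
        self.right = None

def build_rp_tree(data, indices, leaf_size=10, rng=None):
    """Recursively split points at the median of a random 1-D projection."""
    rng = rng or random.Random(0)
    if len(indices) <= leaf_size:
        return Node(points=list(indices))
    dim = len(data[0])
    direction = [rng.gauss(0.0, 1.0) for _ in range(dim)]
    proj = {i: sum(d * x for d, x in zip(direction, data[i])) for i in indices}
    threshold = sorted(proj.values())[len(indices) // 2]
    left = [i for i in indices if proj[i] < threshold]
    right = [i for i in indices if proj[i] >= threshold]
    if not left or not right:          # degenerate split: stop here
        return Node(points=list(indices))
    node = Node()
    node.direction, node.threshold = direction, threshold
    node.left = build_rp_tree(data, left, leaf_size, rng)
    node.right = build_rp_tree(data, right, leaf_size, rng)
    return node

def priority_search(root, data, query, budget=50):
    """Collect up to `budget` candidates from one tree, exploring branches
    in order of an assumed margin-based priority, then return the nearest."""
    heap = [(0.0, 0, root)]            # (priority, tiebreak, node)
    counter = 1
    candidates = []
    while heap and len(candidates) < budget:
        _, _, node = heapq.heappop(heap)
        while node.points is None:     # descend, queueing the far sibling
            p = sum(d * x for d, x in zip(node.direction, query))
            margin = abs(p - node.threshold)   # small margin = promising
            near, far = ((node.left, node.right) if p < node.threshold
                         else (node.right, node.left))
            heapq.heappush(heap, (margin, counter, far))
            counter += 1
            node = near
        candidates.extend(node.points)
    return min(candidates, key=lambda i: math.dist(query, data[i]))
```

With a budget equal to the dataset size the search degenerates to an exhaustive scan and returns the exact nearest neighbor; the interesting regime is a small budget, where the priority ordering determines how quickly the true neighbor is found.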

Description
Proceedings of the 35th International Conference on Machine Learning, Stockholm, Sweden, PMLR 80, 2018. Copyright 2018 by the author(s)
Publisher
International Machine Learning Society (IMLS)
Series
35th International Conference on Machine Learning, ICML 2018;v.6