Show simple item record

dc.contributor.author  Keivani, Omid
dc.contributor.author  Sinha, Kaushik
dc.contributor.author  Ram, Parikshit
dc.date.accessioned  2018-04-09T14:13:21Z
dc.date.available  2018-04-09T14:13:21Z
dc.date.issued  2017-05
dc.identifier.citation  Keivani, Omid; Sinha, Kaushik; Ram, Parikshit. 2017. Improved maximum inner product search with better theoretical guarantees. 2017 International Joint Conference on Neural Networks (IJCNN), pp. 2927-2934  en_US
dc.identifier.isbn  978-1-5090-6182-2
dc.identifier.issn  2161-4393
dc.identifier.other  WOS:000426968703024
dc.identifier.uri  http://dx.doi.org/10.1109/IJCNN.2017.7966218
dc.identifier.uri  http://hdl.handle.net/10057/14865
dc.description  Click on the DOI link to access the article (may not be free).  en_US
dc.description.abstract  Recent interest in the problem of maximum inner product search (MIPS) has sparked the development of new solutions. These solutions usually reduce MIPS to the well-studied problem of nearest-neighbour search (NNS). To escape the curse of dimensionality, the problem is relaxed to accept approximate solutions (that is, to accept anything that is approximately maximum), and locality sensitive hashing is the approximate NNS algorithm of choice. While extremely resourceful, these existing solutions have a couple of aspects that can be improved upon: (i) MIPS can be reduced to NNS in multiple ways, and there is a lack of understanding (mostly theoretical but also empirical) of when to choose which reduction for the best accuracy or efficiency; and (ii) when MIPS is solved via approximate NNS, translating this approximation to the MIPS solution is not straightforward. To overcome these usability issues, we propose the use of randomized partition trees (RPTs) for solving MIPS. We still reduce MIPS to NNS but utilize RPTs to solve the NNS problem. RPTs find the exact NNS solution, hence the exact MIPS solution (with high probability), avoiding the need for any translation of approximation. The theoretical properties of RPTs allow us to definitively choose the best MIPS-to-NNS reduction. The empirical properties of RPTs allow us to significantly outperform the state of the art while providing unique fine-grained control over the accuracy-efficiency tradeoff. For example, at 80% accuracy, RPTs are 2-5× more efficient than the state of the art.  en_US
dc.language.iso  en_US  en_US
dc.publisher  IEEE  en_US
dc.relation.ispartofseries  2017 International Joint Conference on Neural Networks (IJCNN);
dc.subject  Maximum inner product search  en_US
dc.subject  Nearest neighbor search  en_US
dc.subject  Locality sensitive hashing  en_US
dc.subject  Randomized partition trees  en_US
dc.title  Improved maximum inner product search with better theoretical guarantees  en_US
dc.type  Conference paper  en_US
dc.rights.holder  © 2017, IEEE  en_US
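The abstract notes that MIPS can be reduced to NNS in multiple ways. As an illustrative sketch (one standard Euclidean reduction from the literature, not necessarily the specific reduction analysed in this paper), each database point can be augmented with one extra coordinate so that nearest-neighbour order under Euclidean distance matches inner-product order:

```python
import numpy as np

def augment_database(X):
    """Map each database point x to (x, sqrt(M^2 - ||x||^2)),
    where M is the largest norm over the database."""
    norms = np.linalg.norm(X, axis=1)
    M = norms.max()
    extra = np.sqrt(np.maximum(M**2 - norms**2, 0.0))
    return np.hstack([X, extra[:, None]])

def augment_query(q):
    """Map the query q to (q, 0)."""
    return np.append(q, 0.0)

# With these maps, ||q' - x'||^2 = ||q||^2 + M^2 - 2<q, x>, a constant
# minus twice the inner product, so the Euclidean nearest neighbour of
# q' among the x' is exactly the point maximising <q, x>.
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 16))   # toy database
q = rng.standard_normal(16)           # toy query

Xa, qa = augment_database(X), augment_query(q)
nn = np.argmin(np.linalg.norm(Xa - qa, axis=1))  # exact NNS (brute force)
assert nn == np.argmax(X @ q)                    # same index as exact MIPS
```

Any exact NNS routine (such as the RPTs proposed in the paper) can then be run on the augmented points in place of the brute-force scan used here for verification.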


Files in this item


There are no files associated with this item.

