-
SISAP 2020: Improving Locality Sensitive Hashing by Efficiently Finding Projected Nearest Neighbors
Authors:
Omid Jafari, Parth Nagarkar and Jonathan Montano
published: 30 Oct 2020
-
BayLearn 2020: Batch Reinforcement Learning Through Continuation Method
published: 10 May 2021
-
4:59
BayLearn 2020: Hamming Space Locality Preserving Neural Hashing for Similarity Search
BayLearn 2020 Poster #61
Presenter: Daphna Idelson
Abstract:
We propose a novel method for learning to map a large-scale dataset from its feature representation space to binary hash codes in the Hamming space, for fast and efficient approximate nearest-neighbor (ANN) similarity search. Our method is composed of a simple neural network and a novel training scheme that aims to preserve the locality relations between the original data points. We achieve distance preservation from the original cosine space to the new Hamming space by introducing a loss function that translates the relational similarities in both spaces into probability distributions and optimizes the KL divergence between them. We also introduce a simple data sampling method that represents the database with randomly generated proxies, used as reference points for query points drawn from the training set. Experimenting with three publicly available standard ANN benchmarks, we demonstrate significant improvement over other binary hashing methods, with gains of 7% to 17%. Unlike other methods, ours performs well at both low (64-bit) and high (768-bit) code lengths, offering increased accuracy when resources are available and flexibility in the choice of ANN strategy.
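The abstract comes with no reference code; the following is a minimal PyTorch sketch of the kind of loss it describes, not the authors' implementation. Row-wise similarities between training points and random proxies are converted into probability distributions in both the cosine space and a relaxed Hamming space, and the KL divergence between the two is minimized. The function name and the temperature parameter are illustrative assumptions.

import torch
import torch.nn.functional as F

def locality_preserving_kl_loss(x, r, x_codes, r_codes, temperature=0.1):
    """Hypothetical sketch of a KL-based locality-preserving hashing loss.

    x, r:             real-valued embeddings of training points and random proxies
    x_codes, r_codes: relaxed binary codes in [-1, 1] (e.g. tanh of network outputs)
    temperature:      softmax sharpness; an assumption, not a value from the paper
    """
    # Row-wise cosine similarities in the original embedding space.
    cos_sim = F.normalize(x, dim=1) @ F.normalize(r, dim=1).T

    # Inner products of relaxed codes: for b-bit codes in {-1, +1},
    # <u, v> = b - 2 * hamming(u, v), so this is a Hamming-similarity proxy.
    bits = x_codes.shape[1]
    ham_sim = (x_codes @ r_codes.T) / bits

    # Translate each row of similarities into a probability distribution.
    p = F.softmax(cos_sim / temperature, dim=1)          # target: cosine space
    log_q = F.log_softmax(ham_sim / temperature, dim=1)  # model: Hamming space

    # KL(P || Q), averaged over the batch.
    return F.kl_div(log_q, p, reduction="batchmean")

# Usage with a hypothetical encoder f producing 64-bit relaxed codes:
# loss = locality_preserving_kl_loss(x, r, torch.tanh(f(x)), torch.tanh(f(r)))

At search time the relaxed codes are binarized with sign(), so comparisons reduce to XOR-and-popcount, which is what keeps 64- to 768-bit codes fast to scan.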
- published: 10 May 2021
- views: 338
14:31
ICLR14: J Masci: Sparse similarity-preserving hashing
ICLR 2014 Talk: "Sparse similarity-preserving hashing"
Jonathan Masci, Alex M. Bronstein, Michael M. Bronstein, Pablo Sprechmann, Guillermo Sapiro
Paper: http://openreview.net/document/69982ebb-b3e4-4685-9ca7-8b29a29b3a39
Slides: http://www.idsia.ch/~masci/slides/2014-ICLR-sparsehash.pdf
- published: 19 Apr 2014
- views: 549
1:05:25
Algorithmic Speedups via Locality Sensitive Hashing & Bio-Inspired Hashing - September 8, 2021
Anshuman Mishra talks about algorithmic speedups via locality sensitive hashing and reviews papers on bio-inspired hashing, specifically LSH inspired by fruit flies.
He first gives an overview of what algorithmic speedups are, why they are useful, and how we can use them. He then dives into a specific technique called locality sensitive hashing (LSH) and goes over the motivation for these hash algorithms and how they work. Lastly, Anshuman discusses the potential biological relevance of these hash mechanisms. He looks at the paper "A Neural Algorithm for a Fundamental Computing Problem", which outlines a version of LSH inspired by fruit flies that uses sparse projections, expands dimensionality, and applies a Winner-Takes-All mechanism (a minimal sketch of this scheme follows the timestamps below).
Paper reviewed: “A Neural Algorithm for a Fundamental Computing Problem” by Dasgupta et al. : https://www.science.org/doi/abs/10.1126/science.aam9868
0:00 Overview
1:11 Algorithmic Speedups
14:28 Locality Sensitive Hashing
45:54 Bio-inspired Hashing
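The fly-inspired scheme is compact enough to sketch in NumPy. This is a hypothetical illustration of the three ingredients named above (sparse random projection, dimensionality expansion, Winner-Takes-All), not code from the talk or the paper; the expansion factor, sampling rate, and top-k value are assumptions chosen for readability.

import numpy as np

def fly_hash(x, expansion=20, sampling_rate=0.1, top_k=16, seed=0):
    """Hypothetical fly-inspired LSH: project into a much higher-dimensional
    space with a sparse binary random matrix, then keep only the top_k
    strongest activations (Winner-Takes-All) as a sparse binary hash."""
    d = x.shape[-1]
    m = expansion * d  # expanded dimensionality (the fly's Kenyon cells)
    # A fixed seed stands in for the fly's fixed wiring: the same projection
    # must be reused for every item, or the hashes are not comparable.
    rng = np.random.default_rng(seed)
    proj = (rng.random((d, m)) < sampling_rate).astype(np.float32)

    activations = x @ proj
    code = np.zeros(m, dtype=np.uint8)
    code[np.argsort(activations)[-top_k:]] = 1
    return code

# Nearby inputs should share more active bits than unrelated ones.
a = np.random.default_rng(1).standard_normal(50).astype(np.float32)
b = a + 0.05 * np.random.default_rng(2).standard_normal(50).astype(np.float32)
print((fly_hash(a) & fly_hash(b)).sum())  # overlap of the two sparse codes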
- published: 14 Sep 2021
- views: 1103
9:10
Deep Learning - 016 Employing indexing structures for efficient retrieval of semantic neighbors
Deep learning gave a huge boost to the already rapidly developing field of computer vision. With deep learning, many new applications of computer vision techniques have been introduced and are now becoming part of our everyday lives, including face recognition and indexing, photo stylization, and machine vision in self-driving cars.
The goal of this course is to introduce students to computer vision, starting from the basics and then turning to more modern deep learning models. We will cover both image and video recognition, including image classification and annotation, object recognition and image search, various object detection techniques, motion estimation, object tracking in video, human action recognition, and finally image stylization, editing, and new image generation. In the course project, students will learn how to build a face recognition and manipulation system and understand the internal mechanics of this technology, probably the most renowned example of computer vision and AI, and the one most often shown in movies and TV shows.
- published: 28 Apr 2021
- views: 306
1:00
Central Similarity Quantization for Efficient Image and Video Retrieval
Authors: Li Yuan, Tao Wang, Xiaopeng Zhang, Francis EH Tay, Zequn Jie, Wei Liu, Jiashi Feng
Description: Existing data-dependent hashing methods usually learn hash functions from pairwise or triplet data relationships, which capture data similarity only locally and often suffer from low learning efficiency and low collision rate. In this work, we propose a new global similarity metric, termed central similarity, with which the hash codes of similar data pairs are encouraged to approach a common center and those of dissimilar pairs to converge to different centers, improving hash learning efficiency and retrieval accuracy. We formulate the computation of the proposed central similarity metric by introducing a new concept, the hash center, which refers to a set of points scattered in the Hamming space with sufficient mutual distance between each other. We then provide an efficient method to construct well-separated hash centers by leveraging the Hadamard matrix and Bernoulli distributions. Finally, we propose Central Similarity Quantization (CSQ), which optimizes the central similarity between data points w.r.t. their hash centers instead of optimizing the local similarity. CSQ is generic and applicable to both image and video hashing scenarios. Extensive experiments on large-scale image and video retrieval tasks demonstrate that CSQ generates cohesive hash codes for similar data pairs and dispersed hash codes for dissimilar pairs, achieving a noticeable boost in retrieval performance of 3%-20% in mAP over the previous state of the art.
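The Hadamard-plus-Bernoulli construction mentioned above can be sketched as follows; this is a hypothetical NumPy illustration of the idea, not the authors' code. Rows of a Hadamard matrix and their negations give up to 2 x bits centers at mutual Hamming distance of at least bits/2, and any further centers are filled in by Bernoulli(0.5) sampling, as the description suggests.

import numpy as np
from scipy.linalg import hadamard

def make_hash_centers(num_centers, bits, seed=0):
    """Hypothetical sketch of constructing well-separated hash centers.

    Distinct rows of a Hadamard matrix (and their negations) sit at
    Hamming distance at least bits/2 from one another; extra centers,
    if needed, are sampled i.i.d. from a Bernoulli(0.5) per bit."""
    assert bits & (bits - 1) == 0, "hadamard() requires a power-of-two size"
    h = hadamard(bits)               # entries in {-1, +1}
    candidates = np.vstack([h, -h])  # up to 2 * bits centers
    if num_centers <= len(candidates):
        centers = candidates[:num_centers]
    else:
        rng = np.random.default_rng(seed)
        extra = rng.integers(0, 2, size=(num_centers - len(candidates), bits)) * 2 - 1
        centers = np.vstack([candidates, extra])
    return (centers + 1) // 2        # map {-1, +1} to {0, 1} hash codes

centers = make_hash_centers(num_centers=10, bits=64)
print((centers[0] != centers[1]).sum())  # mutual Hamming distance: bits/2 = 32

With 64-bit codes this yields up to 128 Hadamard-derived centers; beyond that, the Bernoulli-sampled centers remain well separated with high probability.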
- published: 17 Jul 2020
- views: 110
3:11
Robust Property Preserving Hashes
Presentation by Elette Boyle, Rio LaVigne, Vinod Vaikuntanathan at Crypto 2018 Rump Session
- published: 05 Oct 2018
- views: 316
1:05:32
Alexandr Andoni - The Geometry of Similarity Search (May 19, 2017)
More details: https://www.simonsfoundation.org/event/the-geometry-of-similarity-search/
- published: 06 Feb 2019
- views: 324
8:51
Lecture 15.4 — Semantic Hashing — [ Deep Learning | Geoffrey Hinton | UofT ]
- published: 25 Sep 2017
- views: 2533