Highlights [Hashing Survey]

Structure Sensitive Quantization

Hashing has proven to be an attractive technique for fast nearest neighbor search over big data. Compared with embedding the data by simple linear projection, discovering the intrinsic structures underlying the data gives the hash codes stronger discriminative power for fast nearest neighbor search over large-scale datasets.
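As a point of reference for the structure-sensitive methods above, the baseline they improve on can be sketched in a few lines: data-independent random-hyperplane hashing, which ignores the data's structure entirely. This is a minimal illustration assuming NumPy; the matrix `W`, code length, and dataset here are arbitrary toy choices, not any published method's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy database: 1000 points in 32-d, hashed to 16-bit codes by
# taking the sign of random projections. Structure-sensitive
# quantization instead *learns* the projections/prototypes from
# the data distribution, which is what yields more discriminative codes.
X = rng.standard_normal((1000, 32))
W = rng.standard_normal((32, 16))          # random projection matrix
codes = (X @ W > 0).astype(np.uint8)       # 16-bit binary codes

def hamming_rank(query, codes, W, k=5):
    """Indices of the k database codes closest to the query in Hamming distance."""
    q = (query @ W > 0).astype(np.uint8)
    dist = np.count_nonzero(codes != q, axis=1)  # per-row Hamming distance
    return np.argsort(dist)[:k]

nearest = hamming_rank(X[0], codes, W)
```

Ranking by Hamming distance is what makes the search fast: comparing two codes is an XOR plus a popcount rather than a floating-point distance computation.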

Related papers
  • Yuqing Ma, Yue He, Fan Ding, Sheng Hu, Jun Li, Xianglong Liu*. Progressive Generative Hashing for Image Retrieval. IJCAI, 2018. [paper] [slides] [bibtex]
  • Ke Xia, Yuqing Ma, Xianglong Liu*, Yadong Mu, Li Liu. Temporal Binary Coding for Large-Scale Video Search. ACM MM, 2017. [paper] [poster] [bibtex]
  • Zhujin Li, Xianglong Liu*, Junjie Wu, Hao Su. Adaptive Binary Quantization for Fast Nearest Neighbor Search. ECAI, 2016. (Journal: Distributed Adaptive Binary Quantization for Fast Nearest Neighbor Search. IEEE TIP, 2017) [paper] [slides] [codes] [bibtex]
  • Xianglong Liu, Bowen Du, Cheng Deng, Ming Liu, Bo Lang. Structure Sensitive Hashing with Adaptive Product Quantization. IEEE Transactions on Cybernetics, 2016. [bibtex]
  • Xianglong Liu, Yadong Mu, Danchen Zhang, Bo Lang, Xuelong Li. Large-Scale Unsupervised Hashing with Shared Structure Learning. IEEE Transactions on Cybernetics, 2015. [codes] [bibtex]
  • Xianglong Liu, Junfeng He, Bo Lang, Shih-Fu Chang. Hash Bit Selection: A Unified Solution for Selection Problems in Hashing. IEEE CVPR, 2013. (Journal: Hash Bit Selection for Nearest Neighbor Search. IEEE TIP, 2017) [paper] [slides] [codes] [bibtex]

Complementary Multi-Index Hashing

Hashing has proven to be a promising technique for fast nearest neighbor search over massive databases. Many practical tasks build multiple hash tables to reach a desired level of recall. In the literature, LSH-based multi-table indexing is usually adopted, independently building a set of hash tables from LSH functions, which faithfully improves recall (Lv et al. 2007; Norouzi, Punjani, and Fleet 2012; Xia et al. 2013; Cheng et al. 2014). However, without eliminating the redundancy among tables, this approach often requires a huge number of them, at the cost of significantly sacrificing precision.
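The independent multi-table scheme that the complementary methods above target can be sketched as follows: each table buckets points by a short code from its own random projection, and a query takes the union of candidates across tables. This is a toy illustration assuming NumPy; the table count `T`, key length `b`, and data are hypothetical choices, not those of any cited paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 2000 points in 64-d. Each of T tables uses an independent
# random projection to a b-bit bucket key. More tables raise recall
# (a true neighbor lands in at least one probed bucket), but without
# reducing redundancy across tables the candidate union grows and
# precision drops -- the trade-off noted in the paragraph above.
X = rng.standard_normal((2000, 64))
T, b = 4, 12
projections = [rng.standard_normal((64, b)) for _ in range(T)]

def bucket_key(x, W):
    """Pack the b sign bits of x @ W into an integer bucket key."""
    bits = (x @ W > 0).astype(np.uint64)
    return int(bits @ (1 << np.arange(b, dtype=np.uint64)))

# Build T hash tables: bucket key -> list of point ids.
tables = []
for W in projections:
    table = {}
    for i, x in enumerate(X):
        table.setdefault(bucket_key(x, W), []).append(i)
    tables.append(table)

def query(q):
    """Union of candidates from the query's bucket in every table."""
    cand = set()
    for W, table in zip(projections, tables):
        cand.update(table.get(bucket_key(q, W), []))
    return cand

candidates = query(X[42])
```

Complementary table construction replaces the independent `projections` above with tables built jointly, so each new table covers neighbors the previous ones missed.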

Related papers
  • Qiang Fu, Xu Han, Xianglong Liu*, Jingkuan Song, Cheng Deng. Complementary Binary Quantization for Joint Multiple Indexing. IJCAI, 2018. [paper] [slides] [bibtex]
  • Xianglong Liu, Cheng Deng, Yadong Mu, Zhujin Li. Boosting Complementary Hash Tables for Fast Nearest Neighbor Search. AAAI, 2017. [paper] [slides] [bibtex]
  • Xianglong Liu, Lei Huang, Cheng Deng, Jiwen Lu, Bo Lang. Multi-View Complementary Hash Tables for Nearest Neighbor Search. AAAI, 2017. [paper] [slides] [bibtex]
  • Tianxu Ji, Xianglong Liu*, Cheng Deng, Lei Huang, Bo Lang. Query-Adaptive Hash Code Ranking for Fast Nearest Neighbor Search. ACM MM, 2014. (Journal: Query-Adaptive Hash Code Ranking for Large-Scale Multi-View Visual Search. IEEE TIP, 2016) [paper] [codes] [bibtex]
  • Xianglong Liu, Junfeng He, Bo Lang. Reciprocal Hash Tables for Nearest Neighbor Search. AAAI, 2013. (Journal: Query-Adaptive Reciprocal Hash Tables for Nearest Neighbor Search. IEEE TIP, 2016) [paper] [slides] [codes] [bibtex]

Hash-based Approximate Computing

Hashing has become an increasingly popular technique for fast approximate computing. Despite its success on the classic point-to-point search problem, there are few studies of its application to large-scale machine learning problems, including classification, detection, recommendation, and deep learning. In these settings, the key computation can be approximated using bit manipulations on binary codes, and thus significantly accelerated by hashing.
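The "key computation via bit manipulations" idea can be illustrated with one classical case: under sign-random-projection codes, the angle between two vectors is estimated from the Hamming distance of their codes, so a cosine similarity becomes an XOR plus a popcount. This is a generic textbook sketch assuming NumPy, not a description of the methods listed below; the dimensions and code length are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

# With b sign-random-projection bits, the collision probability of one
# bit is 1 - theta/pi for vectors at angle theta, so
#   theta ≈ pi * hamming / b  and  cos similarity ≈ cos(pi * hamming / b).
# The expensive float dot product is replaced by XOR + popcount.
b = 256
W = rng.standard_normal((128, b))   # shared random projection

def encode(x):
    """Binary code: sign pattern of the random projections."""
    return (x @ W > 0)

def approx_cosine(cx, cy):
    """Estimate cosine similarity from two binary codes."""
    h = np.count_nonzero(cx != cy)   # Hamming distance (popcount of XOR)
    return np.cos(np.pi * h / b)

x = rng.standard_normal(128)
y = rng.standard_normal(128)
est = approx_cosine(encode(x), encode(y))
```

Longer codes tighten the estimate (the Hamming count concentrates around its expectation), which is the lever that lets learning-time inner products be traded for cheap bitwise operations.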

Related papers
  • Binshuai Wang, Xianglong Liu*, Ke Xia, Kotagiri Ramamohanarao, Dacheng Tao. Random Angular Projection for Fast Nearest Subspace Search. PCM (Best Student Paper), 2018. [paper] [supplementary] [bibtex]
  • Lei Huang, Xianglong Liu, Bo Lang, Adams Wei Yu, Bo Li. Orthogonal Weight Normalization: Solution to Optimization over Multiple Dependent Stiefel Manifolds in Deep Neural Networks. AAAI, 2018. [paper] [codes] [bibtex]
  • Lei Huang, Xianglong Liu*, Yang Liu, Bo Lang, Dacheng Tao. Centered Weight Normalization in Accelerating Training of Deep Neural Networks. IEEE ICCV, 2017. [paper] [supp] [codes] [bibtex]
  • Xianglong Liu, Xinjie Fan, Cheng Deng, Zhujin Li, Hao Su, Dacheng Tao. Multilinear Hyperplane Hashing. IEEE CVPR (Young Researcher Support), 2016. [paper] [supp] [codes] [bibtex]
  • Xianglong Liu, Junfeng He, Cheng Deng, Bo Lang. Collaborative Hashing. IEEE CVPR (Young Researcher Support), 2014. [paper] [poster] [codes] [bibtex]