Optimized product quantization (OPQ)

Optimized product quantization (OPQ) is an orthogonal MCQ method that, in addition to the per-subspace codebooks of plain PQ, learns a rotation R of the data that reduces quantization error.
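The encode/decode idea can be sketched in a few lines of plain Julia. This is a toy illustration, not Rayuela's implementation: the rotation and codebooks below are random stand-ins for learned ones.

```julia
using LinearAlgebra, Random

Random.seed!(0)
d, n, m, h = 8, 5, 2, 4             # toy sizes; d must be divisible by m
X = randn(d, n)                     # d-by-n data
R = Matrix(qr(randn(d, d)).Q)       # a random orthogonal rotation (stand-in for a learned R)
C = [randn(d ÷ m, h) for _ in 1:m]  # one d/m-by-h codebook per subspace (stand-ins)

RX = R * X                          # rotate the data before quantizing
B = zeros(Int16, m, n)              # codes
s = d ÷ m                           # subspace dimension
for i in 1:m
    sub = RX[(i-1)*s+1 : i*s, :]
    for j in 1:n
        # nearest codeword of codebook i, as in plain PQ
        B[i, j] = argmin([norm(sub[:, j] - C[i][:, k]) for k in 1:h])
    end
end

# Reconstruction: stack the chosen codewords, then undo the rotation
X̂ = R' * vcat([C[i][:, B[i, :]] for i in 1:m]...)
println(norm(X - X̂) / norm(X))     # relative quantization error
```

Because R is orthogonal, rotating the data does not change distances, so OPQ can only match or improve on the PQ error for the same m and h.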

Rayuela.quantize_opq — Function.
quantize_opq(X, R, C, V=false) -> B

Given data and a set of PQ/OPQ codebooks, quantize the data.

Arguments

  • X::Matrix{T}: d-by-n data to quantize
  • R::Matrix{T}: d-by-d rotation to apply to the data before quantizing
  • C::Vector{Matrix{T}}: m-long vector with d/m-by-h matrix entries. Each matrix is a (O)PQ codebook.
  • V::Bool: Whether to print progress

Returns

  • B::Matrix{Int16}: m-by-n matrix with the codes that approximate X
source
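A usage sketch: the data, rotation, and codebooks below are random stand-ins with the documented shapes; in practice `R` and `C` would come from `train_opq`.

```julia
using LinearAlgebra
using Rayuela

d, n, m, h = 64, 1_000, 8, 256
X = rand(Float32, d, n)                     # illustrative d-by-n data
R = Matrix{Float32}(I, d, d)                # identity stand-in for a learned rotation
C = [rand(Float32, d ÷ m, h) for _ in 1:m]  # stand-ins for learned d/m-by-h codebooks

B = Rayuela.quantize_opq(X, R, C, true)     # m-by-n Matrix{Int16} of codes
```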
Rayuela.train_opq — Function.
train_opq(X, m, h, niter, init, V=false) -> C, B, R, obj

Trains an optimized product quantizer.

Arguments

  • X::Matrix{T}: d-by-n data to quantize
  • m::Integer: Number of codebooks
  • h::Integer: Number of entries in each codebook (typically 256)
  • niter::Integer: Number of iterations to use
  • init::String: Method used to initialize R, either "natural" (identity) or "random".
  • V::Bool: Whether to print progress

Returns

  • C::Vector{Matrix{T}}: m-long vector with d/m-by-h matrix entries. Each matrix is a learned codebook.
  • B::Matrix{Int16}: m-by-n matrix with the codes that approximate X
  • R::Matrix{T}: d-by-d learned rotation for the data
  • obj::Vector{T}: The quantization error after each iteration
source
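A training sketch, assuming Rayuela is installed. The sizes and data are illustrative; m = 8 codebooks of h = 256 entries give 8-byte codes per vector.

```julia
using Rayuela

d, n, m, h = 128, 10_000, 8, 256
niter = 25
X = rand(Float32, d, n)   # illustrative data; substitute your own d-by-n matrix

C, B, R, obj = Rayuela.train_opq(X, m, h, niter, "natural", true)
# obj[end] is the quantization error at the last iteration;
# B and R can then be passed to quantize_opq for new data.
```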

Reference

The main ideas were published at the same time by two independent groups:

An extended version was later published in a computer vision journal: