Motivated by applications to efficient secure computation, we consider the following problem of encrypted matrix-vector product (EMVP). Let F be a finite field. In an offline phase, a client uploads an encryption of a matrix M ∈ F^(m×ℓ) to a server, keeping only a short secret key. The server stores the encrypted matrix M̂. In the online phase, the client may repeatedly send encryptions q̂_i of query vectors q_i ∈ F^ℓ, enabling the client and the server to locally compute compact shares of the matrix-vector product Mq_i. The server learns nothing about M or q_i. The shared output can either be revealed to the client or processed by another protocol.
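The sketch below illustrates the data flow just described: a one-time offline upload of an encrypted matrix, followed by online queries that yield additive shares of Mq_i over F. All names are hypothetical and the "encryption" is a toy one-time-pad placeholder, not the paper's LPN/LSN-based scheme (in particular, a real scheme keeps only a short key and also hides the query).

```python
# Toy sketch of the EMVP workflow (illustrative only; not the paper's construction).
import numpy as np

P = 2**16 + 1                                # a prime modulus; arithmetic is over F = Z_P
rng = np.random.default_rng(0)

# --- Offline phase: client encrypts M, uploads M_hat, and keeps a secret key.
M = rng.integers(0, P, size=(4, 6))          # client's matrix in F^(m x l)
R = rng.integers(0, P, size=M.shape)         # toy pad (a real scheme keeps a SHORT key instead)
M_hat = (M + R) % P                          # "encrypted" matrix stored by the server

# --- Online phase: client sends an encrypted query; both sides compute local shares of M q.
q = rng.integers(0, P, size=6)               # query vector in F^l
q_hat = q                                    # toy placeholder: a real scheme also hides q

server_share = (M_hat @ q_hat) % P           # server works only on encrypted data
client_share = (-(R @ q)) % P                # client works only with its key and q

# Correctness: the two shares add up to the matrix-vector product M q over F.
assert np.array_equal((server_share + client_share) % P, (M @ q) % P)
```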
We present efficient EMVP protocols based on variants of the learning parity with noise (LPN) assumption and the related learning subspace with noise (LSN) assumption. Our EMVP protocols are field-agnostic in the sense that the parties only perform arithmetic operations over F, and are close to optimal with respect to both communication and computation. In fact, for sufficiently large ℓ (typically a few hundred), the online computation and communication costs of our LSN-based EMVP can be less than twice the costs of computing Mq_i in the clear.
Combined with suitable secure post-processing protocols on the secret-shared output, our EMVP protocols are useful for a variety of secure computation tasks, including encrypted fuzzy search and secure ML.
Our technical approach builds on recent techniques for private information retrieval in the secret-key setting. The core idea is to encode the matrix M and the queries q_i using a pair of secret dual linear codes, while defeating algebraic attacks by adding noise.
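As a rough illustration of the dual-code mechanism (a toy of ours, not the paper's actual encoding), note that noise drawn from a secret linear code C is orthogonal to every vector of the dual code C^⊥, so an encoding masked with such noise still yields exact inner products against dual-code vectors. This is the kind of algebraic cancellation that lets noisy encodings produce noise-free shared outputs.

```python
# Toy illustration of dual-code cancellation over a small prime field (assumptions, not the paper's scheme).
import numpy as np

P = 97                                         # toy prime field F = Z_P
n, k = 8, 3                                    # code length and dimension of C
rng = np.random.default_rng(1)

# Secret code C in systematic form: G = [I_k | A]; its dual C-perp has generator H = [-A^T | I_{n-k}].
A = rng.integers(0, P, size=(k, n - k))
G = np.concatenate([np.eye(k, dtype=np.int64), A], axis=1) % P
H = np.concatenate([(-A.T) % P, np.eye(n - k, dtype=np.int64)], axis=1) % P
assert np.all((G @ H.T) % P == 0)              # duality: every row of G is orthogonal to every row of H

# Hide a vector (think: an encoded matrix row) by adding a random codeword of C as noise.
row = rng.integers(0, P, size=n)
noise = (rng.integers(0, P, size=k) @ G) % P   # noise drawn from the secret code C
row_hat = (row + noise) % P                    # what the server would store

# A query built from the dual code "sees through" the noise: the inner product is exact.
query = (rng.integers(0, P, size=n - k) @ H) % P
assert (row_hat @ query) % P == (row @ query) % P
```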
Encrypted matrix-vector products from secret dual codes
2025