
CASC2018

   

Sponsored by Maplesoft

   

Proceedings

   
Invited Talks

 

Maurice R. Kibler (Lyon)

Institut de Physique Nucléaire de Lyon, CNRS/IN2P3, Université Claude Bernard Lyon 1

 

Quantum information and quantum computing: an overview and some mathematical aspects

 

The aim of the present paper is twofold. First, to present the main ideas behind quantum computing and quantum information, a field based on quantum-mechanical phenomena. To this end, a short review is devoted to (i) quantum bits, or qubits (and more generally qudits), the analogues of the usual bits 0 and 1 of classical information theory, and (ii) two characteristic features of quantum mechanics, namely linearity (which manifests itself through the superposition of qubits and the action of unitary operators on qubits) and the entanglement of certain multi-qubit states (a resource specific to quantum mechanics). Second, to focus on some mathematical problems related to the so-called mutually unbiased bases used in quantum computing and quantum information processing. In this direction, the construction of mutually unbiased bases is presented via two distinct approaches: one based on the group SU(2) and the other on Galois fields and Galois rings.
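To make the notion of mutually unbiased bases concrete, here is a minimal sketch (not taken from the talk): two orthonormal bases {a_i} and {b_j} of C^d are mutually unbiased when |⟨a_i|b_j⟩|² = 1/d for all i, j. In dimension d = 2, the eigenbases of the three Pauli operators Z, X and Y form such a triple, which the code below checks numerically.

```python
import math

# Sketch for d = 2: the Pauli Z, X and Y eigenbases are pairwise
# mutually unbiased, i.e. |<a|b>|^2 = 1/d for a, b in different bases.
s = 1 / math.sqrt(2)
Z = [(1, 0), (0, 1)]                 # computational basis |0>, |1>
X = [(s, s), (s, -s)]                # Hadamard basis |+>, |->
Y = [(s, s * 1j), (s, -s * 1j)]      # circular (Y-eigenvector) basis

def inner(a, b):
    """Hermitian inner product <a|b> (conjugation on the first vector)."""
    return sum(x.conjugate() * y for x, y in zip(a, b))

def mutually_unbiased(A, B, d=2):
    """True if |<a|b>|^2 == 1/d for every a in A and b in B."""
    return all(abs(abs(inner(a, b)) ** 2 - 1 / d) < 1e-12
               for a in A for b in B)

for A, B in [(Z, X), (Z, Y), (X, Y)]:
    print(mutually_unbiased(A, B))   # True, True, True
```

For d = 2 one can find at most d + 1 = 3 such bases, which is exactly the triple above; the Galois-field and Galois-ring constructions mentioned in the abstract generalize this to higher dimensions.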


 

Jean-Guillaume Dumas (Grenoble)

Laboratoire Jean Kuntzmann, Grenoble, France

 

Prover efficient verifiable computing:

 

In an emerging computing paradigm, computational capabilities, from processing power to storage capacities, are offered to users over communication networks as a service. Amazon Web Services (through the Elastic Compute Cloud), Google Compute Engine, or IBM Platform Computing, for instance, provide decentralized high-performance computing solutions. The idea is to outsource demanding computations in order to limit infrastructure costs. None of these platforms, however, offers any guarantee whatsoever on the calculation: no guarantee that the result is correct, nor even that the computation has actually been performed.

The idea of verifiable computing is to associate with the result a data structure that allows a verification algorithm to prove the validity of the result faster than by recomputing it. It is thus possible to outsource computations to any "cloud" and to verify afterwards that they have been performed correctly. One thus speaks of a Prover (the server performing the computations) and a Verifier. Goldwasser, Kalai and Rothblum gave in 2008 a generic method to verify any parallelizable computation in almost linear time in the size of the inputs and the result. However, the extra cost of the computations for the Prover (and therefore the extra cost to the customer), although only a nearly constant factor of the overall work, is nonetheless prohibitive in practice.

In contrast, we will present ad hoc procedures, e.g. for exact linear algebra computations, that are Prover-efficient, that is, that incur much lower financial overhead.
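A classic illustration of such verification in exact linear algebra (a textbook example, not the specific protocols of the talk) is Freivalds' check for matrix multiplication: to verify a claimed product C = A·B, the Verifier picks random 0/1 vectors r and compares A(Br) with Cr, which costs O(n²) operations per round instead of the O(n³) cost of recomputing the product, and catches a wrong C with probability at least 1/2 per round.

```python
import random

def freivalds(A, B, C, rounds=20):
    """Probabilistically verify C == A*B in O(n^2) work per round,
    rather than recomputing the O(n^3) product."""
    n = len(A)
    for _ in range(rounds):
        r = [random.randint(0, 1) for _ in range(n)]
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if ABr != Cr:
            return False           # a mismatch proves C is wrong
    return True                    # correct with probability >= 1 - 2**(-rounds)

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = [[19, 22], [43, 50]]           # the true product A*B
print(freivalds(A, B, C))          # True
print(freivalds(A, B, [[19, 22], [43, 51]]))  # False, with overwhelming probability
```

Note that here the Verifier needs no extra data structure from the Prover; the Prover-efficient schemes discussed in the talk extend this idea to settings where a certificate accompanies the result.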