Abstract:
We show how dynamical decoupling (DD) and quantum error correction (QEC) can be optimally combined in the setting of fault-tolerant quantum computing. To this end we identify the optimal generator set of DD sequences designed to protect quantum information encoded into stabilizer subspace or subsystem codes. This generator set, comprising the stabilizers and logical operators of the code, minimizes a natural cost function associated with the length of DD sequences. We prove that with the optimal generator set the restrictive local-bath assumption used in earlier work on hybrid DD-QEC schemes can be significantly relaxed, thus bringing hybrid DD-QEC schemes, and their potentially considerable advantages, closer to realization.

Abstract:
It is not widely appreciated that measurement-free quantum error correction protocols can be designed to achieve fault-tolerant quantum computing. Despite their potential advantages in relaxing the accuracy, speed, and addressing requirements on the measurement process, such protocols have usually been overlooked because they are expected to yield a much worse threshold than error correction protocols that use measurements. Here we show that this is not the case. We design fault-tolerant circuits for the 9-qubit Bacon-Shor code and find a threshold for gates and preparation of $p_{(p,g)\,\mathrm{thresh}} = 3.76 \times 10^{-5}$ (30% of the best known result for the same code using measurement-based error correction), while admitting error rates of up to 1/3 for measurements and placing no constraints on measurement speed. We further show that, for gate error rates sufficiently below the threshold, the preparation threshold can be improved to $p_{(p)\,\mathrm{thresh}} = 1/3$. We also show how these techniques can be adapted to other Calderbank-Shor-Steane codes.

Abstract:
We introduce a scheme to perform universal quantum computation in quantum cellular automata (QCA) fashion for arbitrary subsystem dimension (not necessarily finite). The scheme is developed on a one-dimensional $N$-element array, requiring only mirror-symmetric logical encoding and global pulses. A mechanism that uses ancillary degrees of freedom for subsystem-specific measurement is also presented.

Abstract:
We present a fault-tolerant semi-global control strategy for universal quantum computers. We show that an $N$-dimensional array of qubits with only $(N-1)$-dimensional addressing resolution is compatible with fault-tolerant universal quantum computation. Moreover, we show that measurements and individual control of qubits are required only at the boundaries of the fault-tolerant computer, i.e., holographic fault-tolerant quantum computation. Our model alleviates the demanding physical requirements that addressability imposes on current qubit candidates and offers a route to improving their scalability.

Abstract:
Based on the idea of measuring the factorizability of a given density matrix, we propose a pairwise analysis strategy for quantifying and understanding multipartite entanglement. The methodology proves very effective, as it immediately guarantees, in addition to the usual entanglement properties, additivity and strong superadditivity. We give a specific set of quantities that fulfill the protocol and that, according to our numerical calculations, make the entanglement measure an LOCC non-increasing function. The strategy allows a redefinition of the structural concept of global entanglement.

Abstract:
We present a general strategy that allows a more flexible construction of fully additive multipartite entanglement monotones than those reported so far in the literature on axiomatic entanglement measures. Within this framework we give a proof of a conjecture with outstanding implications for information theory: the full additivity of the Entanglement of Formation.

Abstract:
Within the framework of constructions for quantifying entanglement, we build a natural scenario for the assembly of multipartite entanglement measures based on Hopf-bundle-like mappings obtained through Clifford algebra representations. Then, based on the non-factorizability of an arbitrary two-qubit density matrix, we define a quantity that allows the construction of two types of entanglement measures from its arithmetic and geometric averages over all pairs of qubits in a register of size $N$, thus fully characterizing the register's degree and type of entanglement. We find that the arithmetic average is both additive and strongly superadditive.

Abstract:
We generalize the strategy presented in Refs. [1, 2] and propose general conditions for a measure of total correlations to be an entanglement monotone via its pure (and mixed) convex-roof extension. In doing so, we derive key theorems and propose a concrete candidate for a total-correlations measure that is a fully additive entanglement monotone.

Abstract:
It is well known that the quantum Zeno effect can protect specific quantum states from decoherence by using projective measurements. Here we combine the theory of weak measurements with stabilizer quantum error correction and detection codes. We derive rigorous performance bounds which demonstrate that the Zeno effect can be used to protect appropriately encoded arbitrary states to arbitrary accuracy, while at the same time allowing for universal quantum computation or quantum control.

Abstract:
Within quantum information, many methods have been proposed to avoid or correct the deleterious effects of the environment on a system of interest. In this work, expanding on our earlier paper [G. A. Paz-Silva et al., Phys. Rev. Lett. 108, 080501 (2012), arXiv:1104.5507], we evaluate the applicability of the quantum Zeno effect as one such method. Using the algebraic structure of stabilizer quantum error correction codes as a unifying framework, we describe two open-loop protocols that involve frequent non-projective (i.e., weak) measurement of either the full stabilizer group or a minimal generating set thereof. The effectiveness of the protocols is measured by the distance between the final state under the protocol and the final state of an idealized evolution in which system and environment do not interact. Rigorous bounds on this metric are derived which demonstrate that, under certain assumptions, a Zeno effect may be realized with arbitrarily weak measurements, and that this effect can protect an arbitrary, unknown encoded state against the environment arbitrarily well.