Abstract:
After a brief introduction to the principles and promise of quantum information processing, the requirements for the physical implementation of quantum computation are discussed. These five requirements, plus two relating to the communication of quantum information, are extensively explored and related to the many schemes in atomic physics, quantum optics, nuclear and electron magnetic resonance spectroscopy, superconducting electronics, and quantum-dot physics, for achieving quantum computing.

Abstract:
Recent developments in quantum computation have made it clear that there is a lot more to computation than the conventional Boolean algebra. Is quantum computation the most general framework for processing information? Having gathered the courage to go beyond the traditional definitions, we are now in a position to answer: Certainly not. The meaning of a message being "a collection of building blocks" can be explored in a variety of situations. A generalised framework is proposed based on group theory, and it is illustrated with well-known physical examples. A systematic information theoretical approach is yet to be developed in many of these situations. Some directions for future development are pointed out.

Abstract:
A framework is proposed in which matter relates to energy in the way structure relates to process and information relates to computation. In this scheme matter corresponds to a structure, which corresponds to information. Energy corresponds to the ability to carry out a process, which corresponds to computation. The relationship between the two complementary parts of each dichotomous pair (matter/energy, structure/process, information/computation) is analogous to the relationship between being and becoming, where being is the persistence of an existing structure while becoming is the emergence of a new structure through the process of interactions. This approach presents a unified view built on two fundamental ontological categories: Information and computation. Conceptualizing the physical world as an intricate tapestry of protoinformation networks evolving through processes of natural computation helps to make more coherent models of nature, connecting non-living and living worlds. It presents a suitable basis for incorporating current developments in the understanding of biological/cognitive/social systems as generated by complexification of physicochemical processes through self-organization of molecules into dynamic adaptive complex systems by morphogenesis, adaptation and learning—all of which are understood as information processing.

Abstract:
Quantum computation is a novel way of information processing which allows, for certain classes of problems, exponential speedups over classical computation. Various models of quantum computation exist, such as the adiabatic, circuit and measurement-based models. They have been proven equivalent in their computational power, but operate very differently. As such, they may be suitable for realization in different physical systems, and also offer different perspectives on open questions such as the precise origin of the quantum speedup. Here, we give an introduction to the one-way quantum computer, a scheme of measurement-based quantum computation. In this model, the computation is driven by local measurements on a carefully chosen, highly entangled state. We discuss various aspects of this computational scheme, such as the role of entanglement and quantum correlations. We also give examples for ground states of simple Hamiltonians which enable universal quantum computation by local measurements.
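The elementary step of this measurement-driven scheme can be sketched with plain NumPy: an input qubit is entangled with a |+⟩ ancilla by a controlled-Z gate (the smallest cluster), and measuring the first qubit in a rotated basis leaves the second qubit carrying a rotated version of the input, up to a known byproduct operator. The angle `theta`, the input state, and the post-selected outcome s = 0 are illustrative choices, not values from the text.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
plus = (ket0 + ket1) / np.sqrt(2)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def Rz(theta):
    # rotation about z: diag(e^{-i theta/2}, e^{+i theta/2})
    return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

CZ = np.diag([1, 1, 1, -1]).astype(complex)   # controlled-Z on two qubits

theta = 0.7
psi = np.array([0.6, 0.8], dtype=complex)     # arbitrary normalized input qubit

# Entangle the input with a |+> ancilla: the two-qubit cluster resource
state = CZ @ np.kron(psi, plus)

# Measure qubit 1 in the basis (|0> + e^{i theta}|1>)/sqrt(2); post-select outcome s = 0
bra = np.conj(np.array([1, np.exp(1j * theta)], dtype=complex) / np.sqrt(2))
out = bra[0] * state[0:2] + bra[1] * state[2:4]
out /= np.linalg.norm(out)

# The surviving qubit carries H Rz(-theta)|psi>, up to a global phase
expected = H @ Rz(-theta) @ psi
phase = out[0] / expected[0]
assert np.isclose(abs(phase), 1)
assert np.allclose(out, phase * expected)
```

For the other outcome s = 1, the output picks up a Pauli-X byproduct, which is what makes the scheme deterministic once later measurement bases are adapted to earlier outcomes.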

Abstract:
We review our recent work addressing various theoretical issues in spin-based quantum dot quantum computation and quantum information processing. In particular, we summarize our calculation of electron exchange interaction in two-electron double quantum dots and multi-electron double dots, and discuss the physical implication of our results. We also discuss possible errors and how they can be corrected in spin-based quantum dot quantum computation. We critically assess the constraints and conditions required for building spin-based solid state quantum dot quantum computers.

Abstract:
This article is a short introduction to and review of the cluster-state model of quantum computation, in which coherent quantum information processing is accomplished via a sequence of single-qubit measurements applied to a fixed quantum state known as a cluster state. We also discuss a few novel properties of the model, including a proof that the cluster state cannot occur as the exact ground state of any naturally occurring physical system, and a proof that measurements on any quantum state which is linearly prepared in one dimension can be efficiently simulated on a classical computer; such states are thus not candidates for use as a substrate for quantum computation.
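As a concrete picture of the fixed resource state the model uses, the sketch below builds a small linear cluster state from |+⟩ states and controlled-Z gates, then checks its defining stabilizer conditions K_a = X_a ∏_b Z_b (b ranging over the neighbours of a), under which K_a|C⟩ = |C⟩. The chain length n = 4 is an arbitrary choice for illustration.

```python
import numpy as np
from functools import reduce

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.diag([1, -1]).astype(complex)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

n = 4
state = reduce(np.kron, [plus] * n)           # |+>^{tensor n}

def cz(state, i, j, n):
    # controlled-Z between qubits i and j: flip the sign of amplitudes with both = 1
    s = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[i] = 1
    idx[j] = 1
    s[tuple(idx)] *= -1
    return s.reshape(-1)

for i in range(n - 1):                        # entangle nearest neighbours on a line
    state = cz(state, i, i + 1, n)

def op_on(op, k, n):
    # embed a single-qubit operator at site k of an n-qubit register
    mats = [I] * n
    mats[k] = op
    return reduce(np.kron, mats)

# Verify the stabilizer conditions K_a |C> = |C>
for a in range(n):
    K = op_on(X, a, n)
    for b in (a - 1, a + 1):
        if 0 <= b < n:
            K = K @ op_on(Z, b, n)
    assert np.allclose(K @ state, state)
```

The same construction on a two-dimensional lattice yields the universal resource; the one-dimensional chain shown here is exactly the kind of state the abstract's simulability result concerns.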

Abstract:
The Hopfield neural networks and the holographic neural networks are models that have been successfully simulated on conventional computers. Starting with these models, an analogous fundamental quantum information processing system is developed in this article. Neuro-quantum interaction can regulate the "collapse"-readout of quantum computation results. This paper is a comprehensive introduction to associative processing and memory-storage in a quantum-physical framework.
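For readers unfamiliar with the classical starting point, a minimal Hopfield associative memory can be sketched in a few lines: patterns are stored in a Hebbian weight matrix, and a corrupted cue is driven back to the nearest stored pattern by iterated sign updates. The two stored patterns and the single-bit corruption below are illustrative.

```python
import numpy as np

def train(patterns):
    # Hebbian rule: W = (1/n) * sum of outer products, with zero self-connections
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0)
    return W

def recall(W, x, steps=10):
    # synchronous sign updates until a fixed point is reached
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1,  1, 1,  1, -1, -1, -1, -1]])
W = train(patterns)

noisy = patterns[0].astype(float)
noisy[0] *= -1                        # corrupt one bit of the first pattern
restored = recall(W, noisy)           # converges back to the stored pattern
assert np.array_equal(restored, patterns[0])
```

The quantum analogue discussed in the abstract replaces these classical attractor dynamics with unitary evolution and measurement, which is where the "collapse"-readout enters.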

Abstract:
Computers are physical systems: what they can and cannot do is dictated by the laws of physics. In particular, the speed with which a physical device can process information is limited by its energy and the amount of information that it can process is limited by the number of degrees of freedom it possesses. This paper explores the physical limits of computation as determined by the speed of light $c$, the quantum scale $\hbar$ and the gravitational constant $G$. As an example, quantitative bounds are put to the computational power of an `ultimate laptop' with a mass of one kilogram confined to a volume of one liter.
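The energy limit on processing speed invoked here is the Margolus-Levitin bound: a system with average energy E can perform at most 2E/(πℏ) elementary operations per second. Evaluating it for the one-kilogram laptop, with the full rest-mass energy E = mc² taken as available for computation, reproduces the paper's order of magnitude of roughly 5 × 10⁵⁰ operations per second:

```python
import math

c = 2.99792458e8        # speed of light, m/s
hbar = 1.0545718e-34    # reduced Planck constant, J*s

m = 1.0                 # one-kilogram "ultimate laptop"
E = m * c**2            # rest-mass energy, ~9e16 J
ops_per_sec = 2 * E / (math.pi * hbar)   # Margolus-Levitin rate bound

print(f"{ops_per_sec:.2e} ops/s")        # ~5.4e50 operations per second
```

The corresponding memory bound comes from the entropy of the same system confined to one liter, which is a separate calculation not sketched here.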

Abstract:
We show that when we endow the action principle with the overlooked possibility of endpoint selection, it gains an enormous additional power, which, perhaps surprisingly, directly corresponds to biological behavior. The biological version of the least action principle is the most action principle. We formulate here, for the first time, this first principle of biology in mathematical form and present some of its most important applications.

Abstract:
Given that there is as yet no theoretical frame for complex engineered systems (CES), this paper claims that bio-inspired engineering can help provide one. Within CES, bio-inspired systems play a key role; what bio-inspired systems and biological computation disclose, however, has not been sufficiently worked out. Biological computation is to be understood as the processing of information by living systems, carried out in polynomial time, i.e., efficiently; current science and research, however, grasp such processing as an intractable problem (for instance, the protein folding problem). A remark is needed here: P versus NP problems must be well defined and delimited, but biological computation problems are not. The shift from conventional engineering to bio-inspired engineering needs to bring the subject (or problem) of computability to a new level. Within the frame of computation, the prevailing paradigm so far is still the Church-Turing thesis. In other words, conventional engineering is still ruled by the Church-Turing thesis (CTt), and CES is ruled by CTt as well. Contrary to this, we argue here that biological computation demands more careful thinking, which leads us toward hypercomputation. Bio-inspired engineering, and thereafter CES, must turn toward biological computation. Thus, biological computation can and should be taken as the ground for engineering complex non-linear systems. Biological systems do, indeed, compute in terms of hypercomputation. If so, then the focus is not algorithmic or computational complexity but computation beyond the Church-Turing barrier. We claim that a new computational theory is needed, one that encompasses biological processes and within which the Church-Turing thesis is but a particular case.