Abstract:
In this study, we prepared a peptide nucleic acid (PNA) against the gene coding for the human alpha 1 chain of type I collagen (COL1A1). This PNA was incorporated into normal human fibroblasts by electroporation, leading to a decrease in the mRNA level of the gene. The mRNA for the alpha 2 chain of type I collagen was also reduced, and the production of collagen protein exhibited a profile similar to the changes in mRNA. These results indicate that a PNA targeting COL1A1 is effective as an antigene reagent and open the possibility of future clinical applications in fibroproliferative disorders.

Abstract:
In this paper we consider the separate coding problem for $L+1$ correlated Gaussian memoryless sources. We deal with the case where the $L$ separately encoded data of the sources serve as side information at the decoder for the reconstruction of the remaining source. Determining the rate distortion region for this system is the so-called many-help-one problem, which is known to be highly challenging. The author previously determined the rate distortion region in the case where the $L$ sources serving as partial side information are conditionally independent given the remaining source we wish to reconstruct. This condition on the correlation is called the CI condition. In this paper we extend the author's previous result to the case where the $L+1$ sources satisfy a kind of tree structure on their correlation. We call this structure the TS condition; it contains the CI condition as a special case. We derive an explicit outer bound of the rate distortion region when the information sources satisfy the TS condition, and further derive an explicit sufficient condition for this outer bound to be tight. In particular, we determine the sum rate part of the rate distortion region for the case where the information sources satisfy the TS condition. For a class of Gaussian sources satisfying the TS condition, we derive an explicit recursive formula for this sum rate part.
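As a purely illustrative numerical sketch (not part of the paper), the CI condition above can be realized for Gaussian sources by letting each helper be a noisy linear observation of the remaining source, $Y_i = a_i X + W_i$ with independent noises $W_i$; the coupling coefficients below are arbitrary example values. For jointly Gaussian variables, zero residual cross-covariance after regressing out $X$ is equivalent to conditional independence given $X$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, L = 200_000, 3
a = np.array([1.0, 0.7, -0.5])    # hypothetical coupling coefficients

X = rng.standard_normal(n)        # the source to be reconstructed
W = rng.standard_normal((n, L))   # independent observation noises
Y = X[:, None] * a + W            # helper sources Y_i = a_i * X + W_i

# Regress out X from each helper; for jointly Gaussian variables,
# vanishing residual cross-covariance is equivalent to the helpers
# being conditionally independent given X (the CI condition).
beta = (Y.T @ X) / (X @ X)
residual = Y - X[:, None] * beta
C = np.cov(residual, rowvar=False)
off_diag = C - np.diag(np.diag(C))
print(np.max(np.abs(off_diag)))   # close to zero
```

A tree-structured (TS) correlation would generalize this by arranging the sources on a tree rather than a single hub.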

Abstract:
We consider the distributed source coding system of $L$ correlated Gaussian sources $Y_i, i=1,2,...,L$, which are noisy observations of correlated Gaussian remote sources $X_k, k=1,2,...,K$. We assume that $Y^L={}^{\rm t}(Y_1,Y_2,...,Y_L)$ is an observation of the source vector $X^K={}^{\rm t}(X_1,X_2,...,X_K)$, having the form $Y^L=AX^K+N^L$, where $A$ is an $L\times K$ matrix and $N^L={}^{\rm t}(N_1,N_2,...,N_L)$ is a vector of $L$ independent Gaussian random variables also independent of $X^K$. In this system the $L$ correlated Gaussian observations are separately compressed by $L$ encoders and sent to an information processing center. We study the remote source coding problem, where the decoder at the center attempts to reconstruct the remote source $X^K$. We consider three distortion criteria based on the covariance matrix of the estimation error on $X^K$. For each of these three criteria we derive explicit inner and outer bounds of the rate distortion region. Next, in the case of $K=L$ and $A=I_L$, we study the multiterminal source coding problem, where the decoder wishes to reconstruct the observation $Y^L=X^L+N^L$. To investigate this problem we establish a result that provides a strong connection between the remote source coding problem and the multiterminal source coding problem. Using this result, we derive several new partial solutions to the multiterminal source coding problem.
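The observation model $Y^L = AX^K + N^L$ can be simulated directly; the following minimal sketch (with an arbitrary example matrix $A$, source covariance, and unit-variance noises, none of which come from the paper) checks that the empirical covariance of $Y^L$ approaches $A\Sigma_X A^{\rm t} + I_L$:

```python
import numpy as np

rng = np.random.default_rng(0)

L, K = 3, 2                       # numbers of observations / remote sources
A = rng.standard_normal((L, K))   # example L x K observation matrix (arbitrary)

# Example covariance of the correlated remote sources X^K (arbitrary choice).
Sigma_X = np.array([[1.0, 0.5],
                    [0.5, 1.0]])

n = 200_000                                                # number of samples
X = rng.multivariate_normal(np.zeros(K), Sigma_X, size=n)  # remote sources
N = rng.standard_normal((n, L))          # independent unit-variance noises
Y = X @ A.T + N                          # observations Y^L = A X^K + N^L

# Empirical covariance of Y should approach A Sigma_X A^t + I_L.
emp = np.cov(Y, rowvar=False)
theory = A @ Sigma_X @ A.T + np.eye(L)
print(np.max(np.abs(emp - theory)))      # small for large n
```

Setting `K = L` and `A = np.eye(L)` recovers the multiterminal case $Y^L = X^L + N^L$ discussed above.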

Abstract:
In this paper we consider identification (ID) via multiple access channels (MACs). For a general MAC the ID capacity region includes the ordinary transmission (TR) capacity region. In this paper we discuss the converse coding theorem. We estimate two types of error probabilities of identification for rates outside the capacity region, deriving a function that serves as a lower bound on the sum of the two error probabilities of identification. This function tends to zero as $n\to \infty$ for noisy channels satisfying the strong converse property. Using this property, we establish that the TR capacity region is equal to the ID capacity region for MACs satisfying the strong converse property. To derive this result we introduce a new resolvability problem on the output from the MAC. We further develop a new method of converting the direct coding theorem for the above MAC resolvability problem into the converse coding theorem for ID via MACs.

Abstract:
We consider the distributed source coding system for $L$ correlated Gaussian observations $Y_i, i=1,2,...,L$. Let $X_i, i=1,2,...,L$ be $L$ correlated Gaussian random variables and $N_i, i=1,2,...,L$ be independent additive Gaussian noises also independent of $X_i, i=1,2,...,L$. We consider the case where for each $i=1,2,...,L$, $Y_i$ is a noisy observation of $X_i$, that is, $Y_i=X_i+N_i$. For this coding system the determination of the rate distortion region remains open. In this paper we derive explicit outer and inner bounds of the rate distortion region and find an explicit sufficient condition for these two bounds to match. We also study the sum rate part of the rate distortion region when the correlation has a certain symmetry and derive a new lower bound of the sum rate part, together with a sufficient condition for this lower bound to be tight. The derived sufficient condition depends only on the correlation property of the sources and their observations.

Abstract:
We consider a distributed source coding problem for $L$ correlated Gaussian observations $Y_i, i=1,2,...,L$. We assume that the random vector $Y^L={}^{\rm t}(Y_1,Y_2,...,Y_L)$ is an observation of the Gaussian random vector $X^K={}^{\rm t}(X_1,X_2,...,X_K)$, having the form $Y^L=AX^K+N^L$, where $A$ is an $L\times K$ matrix and $N^L={}^{\rm t}(N_1,N_2,...,N_L)$ is a vector of $L$ independent Gaussian random variables also independent of $X^K$. The estimation error on $X^K$ is measured by the distortion covariance matrix. The rate distortion region is defined as the set of all rate vectors for which the estimation error is upper bounded by an arbitrary prescribed covariance matrix in the sense of the positive semidefinite ordering. In this paper we derive explicit outer and inner bounds of the rate distortion region. This result provides a useful tool for studying the direct and indirect source coding problems on this Gaussian distributed source coding system, which remain open in general.
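The distortion criterion above compares covariance matrices in the positive semidefinite ordering: an error covariance $\Sigma_e$ meets a prescribed distortion matrix $D$ when $D - \Sigma_e$ is positive semidefinite. A minimal sketch of this check (the matrices are arbitrary illustrative values, not from the paper):

```python
import numpy as np

def meets_distortion(Sigma_err, D, tol=1e-10):
    """Check Sigma_err <= D in the positive semidefinite ordering,
    i.e. whether D - Sigma_err is positive semidefinite."""
    diff = D - Sigma_err
    # Symmetrize for numerical safety; PSD iff all eigenvalues >= 0.
    eigvals = np.linalg.eigvalsh((diff + diff.T) / 2)
    return bool(np.all(eigvals >= -tol))

Sigma_err = np.array([[0.2, 0.1],
                      [0.1, 0.3]])
D_loose = np.array([[0.5, 0.0],
                    [0.0, 0.5]])   # permissive distortion constraint
D_tight = np.array([[0.1, 0.0],
                    [0.0, 0.1]])   # constraint the error fails to meet
print(meets_distortion(Sigma_err, D_loose))  # True
print(meets_distortion(Sigma_err, D_tight))  # False
```

The rate distortion region then collects all rate vectors whose achievable error covariance passes this test for the prescribed $D$.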

Abstract:
We consider a relay channel where a relay helps the transmission of messages from one sender to one receiver. The relay is regarded not only as a sender that helps the message transmission but also as a wire-tapper who can obtain some knowledge about the transmitted messages. In this paper we study the coding problem of the relay channel under the situation that some of the transmitted messages are confidential to the relay. The security of such confidential messages is measured by the conditional entropy. The rate region is defined as the set of transmission rates for which messages are reliably transmitted and the security of the confidential messages exceeds a prescribed level. We give two definitions of the rate region: one for the case of a deterministic encoder, called the deterministic rate region, and one for the case of a stochastic encoder, called the stochastic rate region. We derive explicit inner and outer bounds for these two rate regions and present a class of relay channels where the two bounds match. Furthermore, we show that a stochastic encoder can enlarge the rate region. We also evaluate the deterministic rate region of the Gaussian relay channel with confidential messages.

Abstract:
We consider the discrete memoryless degraded broadcast channels with feedback. We prove that the error probability of decoding tends to one exponentially for rates outside the capacity region and derive an explicit lower bound on this exponent function. We demonstrate that the information spectrum approach is quite useful for investigating this problem.

Abstract:
We consider the discrete memoryless degraded broadcast channels. We prove that the error probability of decoding tends to one exponentially for rates outside the capacity region and derive an explicit lower bound on this exponent function. We demonstrate that the information spectrum approach is quite useful for investigating this problem.

Abstract:
We consider the one-helper source coding problem posed and investigated by Ahlswede, K\"orner and Wyner. In this system, for rates outside the achievable rate region, the error probability of decoding goes to one as the source block length $n$ goes to infinity; this is the strong converse theorem for the one-helper source coding problem. In this paper we provide a much stronger version of this theorem: we prove that the error probability of decoding tends to one exponentially and derive an explicit lower bound on this exponent function.