Abstract:
We study whether a violation of the null energy condition necessarily implies the presence of instabilities. We prove that this is the case in a large class of situations, including isotropic solids and fluids relevant for cosmology. On the other hand, we present several counter-examples of consistent effective field theories possessing a stable background where the null energy condition is violated. Two necessary features of these counter-examples are the lack of isotropy of the background and the presence of superluminal modes. We argue that many of the properties of massive gravity can be understood by associating it with a solid at the edge of violating the null energy condition. We briefly analyze the difficulties of mimicking $\dot H>0$ in scalar-tensor theories of gravity.
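For orientation, the null energy condition (NEC) and its standard cosmological consequence can be stated as follows (these are the textbook definitions, not conventions specific to this paper):

```latex
% Null energy condition: for every null vector n^\mu,
T_{\mu\nu}\, n^\mu n^\nu \ge 0 .
% For a spatially flat FRW background with H = \dot a / a,
% the Friedmann equations give
\dot H = -\frac{1}{2 M_{\rm Pl}^2}\,(\rho + p) ,
% so NEC violation (\rho + p < 0) is equivalent to \dot H > 0.
```

This is why the abstract's closing remark about mimicking $\dot H>0$ is the cosmological face of NEC violation.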

Abstract:
We study the impact of a running index $\alpha_t$ on the spectrum of relic gravitational waves (RGWs) over the whole range of frequency $(10^{-18}\sim 10^{10})$ Hz and reveal its implications in RGWs detections and in cosmology. Analytical calculations show that, although the spectrum of RGWs on low frequencies is less affected by $\alpha_t\ne 0$, but, on high frequencies, the spectrum is modified substantially. Investigations are made toward potential detections of the $\alpha_t$-modified RGWs for several kinds of current and planned detectors. The Advanced LIGO will likely be able to detect RGWs with $\alpha_t\ge 0$ for inflationary models with the inflation index $\beta=-1.956$ and the tensor-scalar ratio $r= 0.55$. The future LISA can detect RGWs for a much broader range of ($\alpha_t$, $\beta$, $r$), and will have a better chance to break a degeneracy between them. Constraints on $\alpha_t$ are estimated from several detections and cosmological observations. Among them, the most stringent one is from the bound of the Big Bang nucleosynthesis (BBN), and requires $\alpha_t < 0.008$ rather conservatively for any reasonable ($\beta$, $r$), preferring a nearly power-law spectrum of RGWs. In light of this result, one would expect the scalar running index $\alpha_s$ to be of the same magnitude as $\alpha_t$, if both RGWs and scalar perturbations are generated by the same scalar inflation.
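The effect of a running index can be sketched numerically. The snippet below uses the standard parametrization of a power spectrum with running (analogous to the scalar-sector convention in the CMB literature); the amplitude, pivot scale, and fiducial values are illustrative assumptions, not numbers from this paper:

```python
import math

def tensor_spectrum(k, A_t=1.0, n_t=-0.012, alpha_t=0.0, k0=1.0):
    """Primordial tensor power spectrum with a running index alpha_t.

    Standard parametrization:
        P_t(k) = A_t * (k/k0)**(n_t + 0.5*alpha_t*ln(k/k0)),
    so that alpha_t = d n_t / d ln k evaluated at the pivot k = k0.
    All parameter values here are illustrative placeholders.
    """
    x = math.log(k / k0)
    return A_t * math.exp((n_t + 0.5 * alpha_t * x) * x)
```

At the pivot scale $k=k_0$ the running drops out entirely, while far above the pivot even a small $\alpha_t$ rescales the spectrum by a large exponential factor; this mirrors the abstract's point that low frequencies are only weakly affected while high frequencies are modified substantially.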

Abstract:
We report on the main results presented at the workshop on The Physics of Relic Neutrinos. The study of relic neutrinos involves a broad spectrum of problems in particle physics, astrophysics and cosmology. Features of baryogenesis and leptogenesis could be imprinted in the properties of the relic neutrino sea. Relic neutrinos played a crucial role in big bang nucleosynthesis. As the hot component of dark matter, they participated in structure formation in the universe. Although direct detection of the sea seems impossible at this stage, there could be various indirect manifestations of these neutrinos which would allow us to study the properties of the sea both in the past and at the present epoch.

Abstract:
We look at what may occur if Boltzmann equations, as presented by Murayama at Les Houches in 2007, are applied to the graviton density in a pre-Planckian universe setting. Three restrictions are in order. First, we assume a graviton mass on the order of $10^{-62}$ grams, as if the pre-Planckian regime does not change the low end of the graviton mass range. Second, we assume that a comparatively low-temperature regime (far below the Planck temperature) exists. Third, we leave unsaid what may happen if gravitational waves enter the Planck regime of ultra-high temperature. With those three considerations, we proceed to examine a graviton density value resulting from a perturbation from low to higher temperatures. In the end, an ultra-hot pre-big-bang cosmology will yield essentially no early-universe information-transfer crossovers to our present cosmological system. This is not affected by whether we have a single repeating universe or a multiverse. A cold pre-inflationary state yields a very different situation. Initial graviton frequencies, though, as outlined, may differ in the multiverse case as opposed to the single-repeating-universe case. We close with comments on BICEP2 and how this document provides material on how to avoid the BICEP2 disaster, and on choosing between the possibility of massless scalar-tensor gravity as the correct theory of gravitation and conventional GR.

Abstract:
Here we present a description of a modified Expectation-Maximization algorithm, as well as its implementation (NullHap), which allows one to effectively overcome these limitations. As an example of application, we used NullHap to reanalyze published data on the distribution of KIR genotypes in Polish psoriasis patients and controls, showing that the KIR2DS4/1D locus may be a marker of KIR2DS1 haplotypes with different effects on disease susceptibility. The developed application can estimate haplotype frequencies for every type of polymorphism and can effectively be used in genetic research, as illustrated by a novel finding regarding the genetic susceptibility to psoriasis.

Laboratory techniques used to determine haplotypes [1] are often too expensive for large-scale studies. The lack of phase information provided by the popular typing methods can be overcome using likelihood-based calculations [2], which estimate haplotype frequencies in a population and reconstruct the haplotype pair in each individual. This approach is more cost-effective and powerful than linkage analysis [3], and gives more information than single-marker-based methods [4]. Haplotype estimation procedures typically use a maximum-likelihood approach. The most popular algorithm, implemented for example in Arlequin [5], is the Expectation-Maximization (EM) algorithm [6], but other methods have also been proposed: a Bayesian method using a pseudo-Gibbs sampler [7], partition-ligation [8], Monte Carlo [9] and Hidden Markov Models [10].

A frequent shortcoming of available software packages [5,7] is the lack of any possibility to analyze loci where null variants occur with an appreciable frequency. In a diploid organism, a null allele is a variant which is not detected in genotyping, because of a deletion of the entire locus or because of a mutation interfering with analysis. This makes it impossible to distinguish between some heterozygous and homozygous genotypes [11]. For example, if there is only one alternative allele A1 besides
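The core EM iteration for haplotype frequencies can be sketched in a few lines. The snippet below is a bare-bones version of the classic Excoffier-Slatkin EM step for two biallelic loci; it is an illustrative sketch, not the NullHap algorithm itself, which additionally models null alleles:

```python
from itertools import product

def em_haplotypes(genotypes, n_iter=200):
    """Minimal EM estimator of haplotype frequencies for two biallelic loci.

    Each genotype is a pair (g1, g2) with g in {0, 1, 2} counting copies of
    allele '1' at that locus; haplotypes are tuples like (0, 1).  Phase is
    ambiguous only for double heterozygotes (1, 1), which EM resolves
    probabilistically from the current frequency estimates.
    """
    haps = list(product((0, 1), repeat=2))
    freq = {h: 1.0 / len(haps) for h in haps}  # start from uniform frequencies

    def pairs(g):
        """All ordered haplotype pairs consistent with genotype g."""
        out = []
        for h1 in haps:
            h2 = tuple(gi - hi for gi, hi in zip(g, h1))
            if all(a in (0, 1) for a in h2):
                out.append((h1, h2))
        return out

    for _ in range(n_iter):
        counts = {h: 0.0 for h in haps}
        for g in genotypes:
            compat = pairs(g)
            weights = [freq[h1] * freq[h2] for h1, h2 in compat]
            total = sum(weights) or 1.0
            for (h1, h2), w in zip(compat, weights):
                counts[h1] += w / total  # E-step: expected haplotype counts
                counts[h2] += w / total
        norm = sum(counts.values())
        freq = {h: c / norm for h, c in counts.items()}  # M-step: renormalize
    return freq
```

For example, a sample containing both double homozygotes and double heterozygotes lets the homozygotes anchor the frequencies, and EM then phases the heterozygotes in their favor. The null-allele extension described in the text amounts to enlarging the set of haplotype pairs compatible with each observed genotype.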

Abstract:
We consider a wide class of static spherically symmetric black holes of arbitrary dimension with a photon sphere (a hypersurface on which a massless particle can orbit the black hole on unstable circular null geodesics). This class includes various spacetimes of physical interest such as Schwarzschild, Schwarzschild-Tangherlini and Reissner-Nordstr\"om black holes, the canonical acoustic black hole and the Schwarzschild-de Sitter black hole. For this class of black holes, we provide general analytical expressions for the Regge poles of the $S$-matrix associated with a massless scalar field theory. This is achieved by using third-order WKB approximations to solve the associated radial wave equation. These results permit us to obtain analytically the nonlinear dispersion relation and the damping of the "surface waves" lying close to the photon sphere as well as, from Bohr-Sommerfeld-type resonance conditions, formulas beyond the leading-order terms for the complex frequencies corresponding to the weakly damped quasinormal modes.
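For orientation, the leading-order eikonal relation connecting the photon sphere to the weakly damped quasinormal modes reads (this is the standard result that the third-order WKB analysis refines):

```latex
\omega_{\ell n} \;\simeq\; \Omega_c \left(\ell + \tfrac{1}{2}\right)
 \;-\; i \left(n + \tfrac{1}{2}\right) \lvert \Lambda \rvert ,
```

where $\Omega_c$ is the coordinate angular velocity on the unstable circular null geodesic and $\Lambda$ is the Lyapunov exponent governing the instability timescale of perturbations of that orbit.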

Abstract:
Phenomenology of relic neutralinos is analyzed in an effective supersymmetric scheme at the electroweak scale. It is shown that current direct experiments for WIMPs, when interpreted in terms of relic neutralinos, are indeed probing regions of supersymmetric parameter space compatible with all present experimental bounds.

Abstract:
It has been pointed out that for some types of measurement the Heisenberg uncertainty relation seems to be violated. To remedy this situation, a new uncertainty relation was proposed by Ozawa. Here we introduce revised definitions of error and disturbance that take into account the gain associated with generalized measurement interactions. With these new definitions, the validity of the Heisenberg inequality is recovered for continuous linear measurement interactions. We also examine the changes in distribution functions caused by the general measurement interaction and clarify the physical meanings of infinitely large errors and disturbances.
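For context, the Ozawa relation referred to above can be written in the usual notation, with $\epsilon(A)$ the measurement error, $\eta(B)$ the disturbance, and $\sigma$ the pre-measurement standard deviations:

```latex
\epsilon(A)\,\eta(B) \;+\; \epsilon(A)\,\sigma(B) \;+\; \sigma(A)\,\eta(B)
 \;\ge\; \tfrac{1}{2}\,\bigl\lvert \langle [A,B] \rangle \bigr\rvert ,
```

whereas the naive Heisenberg-type error-disturbance relation $\epsilon(A)\,\eta(B) \ge \tfrac{1}{2}\lvert\langle [A,B]\rangle\rvert$ can fail for certain measurement interactions; the abstract's revised error and disturbance definitions are aimed at restoring an inequality of the latter form.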

Abstract:
This paper briefly recalls the theory of Bell's inequality and some experimental measurements. Measurements are then processed, on the one hand, according to a property of the wave function and, on the other hand, according to the sum definition. The results of the two processing methods are apparently not the same, so Bell's inequality would not be violated. It is the use of the wave function that implies the violation of the inequality, as can be seen in the final flowcharts.
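For reference, the standard CHSH form of Bell's inequality, which the experiments alluded to above test with correlation functions $E(a,b)$ at detector settings $a, a', b, b'$, is:

```latex
\bigl\lvert E(a,b) - E(a,b') \bigr\rvert + \bigl\lvert E(a',b) + E(a',b') \bigr\rvert \;\le\; 2 ,
```

a bound obeyed by any local hidden-variable model, while quantum mechanics predicts values up to $2\sqrt{2}$ for suitably chosen settings.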