Abstract:
The paper questions the robustness of the average-case time complexity of the fast and popular quicksort algorithm. Among the six standard probability distributions examined in the paper, only the continuous uniform, exponential, and standard normal support it, whereas the others support the worst-case complexity measure. To the question of why we obtain the worst-case complexity measure each time the average-case measure is discredited, one logical answer is that average-case complexity under the universal distribution equals worst-case complexity. This answer, which is hard to challenge, nevertheless gives no idea as to which of the standard probability distributions come under the umbrella of universality. The moral is that average-case complexity measures, in cases where they differ from the worst-case ones, should be deemed robust only if they are supported by at least the standard probability distributions, both discrete and continuous. Regretfully, this is not the case with quicksort.
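The kind of empirical check the abstract describes can be sketched as follows: count the comparisons a basic quicksort makes on inputs drawn from different distributions and compare the averages. This is a minimal illustration only, not the paper's experimental protocol; the first-element pivot rule, sample sizes, and trial counts are assumptions.

```python
import random

def quicksort_comparisons(arr):
    """Count comparisons made by a basic quicksort (first element as pivot)."""
    count = 0
    def qs(a):
        nonlocal count
        if len(a) <= 1:
            return a
        pivot, rest = a[0], a[1:]
        count += len(rest)  # pivot is compared against every other element
        left = [x for x in rest if x < pivot]
        right = [x for x in rest if x >= pivot]
        return qs(left) + [pivot] + qs(right)
    qs(arr)
    return count

def mean_comparisons(sampler, n=200, trials=50):
    """Average comparison count over repeated random inputs from `sampler`."""
    return sum(quicksort_comparisons([sampler() for _ in range(n)])
               for _ in range(trials)) / trials

random.seed(1)
for name, sampler in [("uniform", random.random),
                      ("exponential", lambda: random.expovariate(1.0)),
                      ("normal", lambda: random.gauss(0.0, 1.0))]:
    print(name, mean_comparisons(sampler))
```

On an already-sorted input this pivot rule degenerates to the worst case of n(n-1)/2 comparisons, which is what makes the contrast with distribution-driven averages visible.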

Abstract:
A lot has been done on the randomness of the decimal expansion of Pi, with extensive tests of randomness, of the kind used to distinguish good from not-so-good random number generators, applied to the decimal digits of Pi. Pi seems to pass these tests as well as some of the best random number generators (RNGs) and could well serve as an RNG provided that the digits of Pi could be easily and quickly produced in the computer [Mar06]. We make an interesting study in the same context, in which random substrings of arbitrary length are extracted from arbitrary positions a large number of times and each sample is tested for randomness. Our results confirm the randomness of Pi, and a recent claim that “Pi is less random than we thought” [TF05] stands refuted. George Marsaglia [Mar06] has also independently refuted the claim, but in Marsaglia’s work the randomness is established on the whole for the first 960 million digits of Pi. Our study confirms the randomness for arbitrary subsequences as well. Finally, the investigation of some functions of Pi, rather than Pi itself, is proposed.
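The substring-sampling procedure can be sketched with a single goodness-of-fit test. This is an illustration only: the actual study would use millions of digits and a full battery of randomness tests, whereas here we use just the first 50 decimal digits of Pi and a chi-square test of digit uniformity.

```python
import random

# First 50 decimal digits of Pi (a real study would use millions).
PI_DIGITS = "14159265358979323846264338327950288419716939937510"

def chi_square_digits(s):
    """Chi-square goodness-of-fit statistic for uniformity of digits 0-9."""
    expected = len(s) / 10
    counts = [s.count(str(d)) for d in range(10)]
    return sum((c - expected) ** 2 / expected for c in counts)

def random_substring_test(digits, length, trials, seed=0):
    """Extract random substrings from arbitrary positions and report the
    fraction that pass the uniformity test at the 5% level."""
    rng = random.Random(seed)
    critical = 16.919  # chi-square critical value, df = 9, alpha = 0.05
    passes = 0
    for _ in range(trials):
        start = rng.randrange(len(digits) - length + 1)
        if chi_square_digits(digits[start:start + length]) < critical:
            passes += 1
    return passes / trials
```

Under the randomness hypothesis, roughly 95% of sampled substrings should pass; a markedly lower pass rate over many long substrings would support the refuted claim.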

Abstract:
Under the umbrella of statistical algorithmic complexity (which some authors call stochastic arithmetic), it makes sense to talk about statistical bounds (asymptotic) and their empirical estimates over a finite range (a computer experiment cannot be run for infinite input size!), the so-called empirical O. These were informally introduced in Chakraborty and Sourabh, where it was shown that they make average complexity more meaningful. The present study shows that these concepts can be used effectively in worst and best cases as well as in average cases, with a case study on an efficient determinant algorithm.
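The idea of an empirical O, a complexity class estimated from observations over a finite range of input sizes, can be sketched as a least-squares fit of timings against candidate growth functions. The candidate set and fitting criterion here are assumptions for illustration, not the estimator used in the study.

```python
import math

def empirical_o(sizes, times):
    """Fit observed times to candidate complexity classes by least squares
    and return the best-fitting class (the 'empirical O')."""
    candidates = {
        "O(n)": lambda n: n,
        "O(n log n)": lambda n: n * math.log(n),
        "O(n^2)": lambda n: n * n,
        "O(n^3)": lambda n: n ** 3,
    }
    best, best_err = None, float("inf")
    for name, f in candidates.items():
        xs = [f(n) for n in sizes]
        # least-squares scale constant c minimizing sum (t - c*x)^2
        c = sum(t * x for t, x in zip(times, xs)) / sum(x * x for x in xs)
        err = sum((t - c * x) ** 2 for t, x in zip(times, xs))
        err /= sum(t * t for t in times)  # normalize by data magnitude
        if err < best_err:
            best, best_err = name, err
    return best
```

For a determinant algorithm one would time it at several matrix orders and feed the (size, time) pairs to such a fit; cubic growth, for instance, would be reported as O(n^3).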

Abstract:
Recently, there has been an upsurge in the number of articles on spatio-temporal modeling in statistical journals. Many of them focus on building good nonstationary spatio-temporal models. In this article, we introduce a state-space based nonparametric nonstationary model for the analysis of spatio-temporal data. We consider that there are some fixed spatial locations (generally called the monitoring sites) and that the data have been observed at those locations over a period of time. To model the data, we assume that the data generating process is driven by some latent spatio-temporal process, which is itself evolving with time in some unknown way. We model this evolutionary transformation via compositions of a Gaussian process, and we also model the unknown functional dependence between the data generating process and the latent spatio-temporal process (the observational transformation) by another Gaussian process. We investigate this model in detail, explore the covariance structure, and formulate a fully Bayesian method for inference and prediction. Finally, we apply our nonparametric model to two simulated data sets and a real data set and establish its effectiveness.
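The Gaussian-process building block of such models can be illustrated by drawing one realization of a zero-mean GP at a set of spatial locations. The squared-exponential covariance used here is a common default and purely an assumption for illustration; the article's actual covariance structure may differ.

```python
import math
import random

def sq_exp_cov(s1, s2, sigma2=1.0, length=1.0):
    """Squared-exponential covariance, a common default GP kernel."""
    d2 = sum((a - b) ** 2 for a, b in zip(s1, s2))
    return sigma2 * math.exp(-d2 / (2 * length ** 2))

def sample_gp(locations, cov=sq_exp_cov, seed=0):
    """Draw one realization of a zero-mean GP at the given locations
    via a Cholesky factorization of the (jittered) covariance matrix."""
    n = len(locations)
    K = [[cov(locations[i], locations[j]) + (1e-9 if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    L = [[0.0] * n for _ in range(n)]  # lower-triangular Cholesky factor
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(K[i][i] - s) if i == j else (K[i][j] - s) / L[j][j]
    rng = random.Random(seed)
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(L[i][k] * z[k] for k in range(n)) for i in range(n)]
```

In the model described above, one GP of this kind would play the role of the evolutionary transformation and another that of the observational transformation.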

Abstract:
Heat exchangers have their major applications in automobiles, air conditioning, refrigerators, power plants, and many other systems. The heat transfer characteristics and performance of a copper spiral heat exchanger operated with a nanofluid are investigated and compared with pure water. Nanofluids can enhance thermophysical properties. Experiments are carried out for a water-based SiO_{2} nanofluid with 15 nm average-sized nanoparticles at varying air velocity and fluid mass flow rate to investigate the effect on the heat transfer coefficient. From the experimental data, a closed-form solution for the Nusselt number has been calculated using the ε-NTU method. A new correlation has been proposed as a function of the Reynolds number and the Prandtl number. The heat transfer rate and effectiveness are significantly higher compared to pure water and increase with increasing volume fraction of nanoparticles.
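The ε-NTU method mentioned above relates exchanger effectiveness to the number of transfer units and the capacity ratio. The following sketch uses the standard textbook ε-NTU relations for counterflow and parallel-flow arrangements; the flow arrangement and any inputs are illustrative assumptions, not the paper's measured values.

```python
import math

def effectiveness_ntu(ntu, c_ratio, flow="counterflow"):
    """Effectiveness from NTU and capacity ratio Cr = C_min / C_max,
    using the standard epsilon-NTU relations."""
    if c_ratio == 0:  # one fluid at constant temperature (phase change)
        return 1 - math.exp(-ntu)
    if flow == "counterflow":
        if abs(c_ratio - 1) < 1e-12:
            return ntu / (1 + ntu)
        e = math.exp(-ntu * (1 - c_ratio))
        return (1 - e) / (1 - c_ratio * e)
    if flow == "parallel":
        return (1 - math.exp(-ntu * (1 + c_ratio))) / (1 + c_ratio)
    raise ValueError("unknown flow arrangement")

def heat_transfer_rate(eff, c_min, t_hot_in, t_cold_in):
    """q = eff * C_min * (T_hot,in - T_cold,in)."""
    return eff * c_min * (t_hot_in - t_cold_in)
```

With measured inlet temperatures and flow rates, such relations let the heat transfer coefficient (and hence the Nusselt number) be backed out from the observed effectiveness.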

Abstract:
Microarrays have been a popular method for representing biological data. Microarray technology allows biologists to monitor genome-wide patterns of gene expression in a high-throughput fashion. Clustering the biological sequences according to their components may reveal the biological functionality among the sequences. Data cluster analysis is an important task in microarray data analysis. There is no clustering algorithm that can be universally used to solve all problems. Therefore, in this paper, a comparative study of data cluster analysis for microarray data is presented. The most popular clustering algorithms that can be applied to microarray data are discussed. The uncertainty of the data, optimization, and density estimation are considered for comparison.
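As one concrete instance of the algorithms such comparisons typically cover, a minimal k-means (Lloyd's algorithm) over gene expression vectors can be sketched as follows. This is a generic illustration of one commonly compared method, not a specific algorithm from the paper.

```python
import random

def kmeans(points, k, n_iter=50, seed=0):
    """Plain k-means (Lloyd's algorithm): alternate assigning each point
    to its nearest center and recomputing centers as cluster means."""
    rng = random.Random(seed)
    centers = [list(p) for p in rng.sample(points, k)]
    assign = [0] * len(points)
    for _ in range(n_iter):
        # assignment step: nearest center by squared Euclidean distance
        for i, p in enumerate(points):
            assign[i] = min(range(k),
                            key=lambda c: sum((a - b) ** 2
                                              for a, b in zip(p, centers[c])))
        # update step: move each center to the mean of its members
        for c in range(k):
            members = [points[i] for i in range(len(points)) if assign[i] == c]
            if members:
                centers[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign, centers
```

In a microarray setting each point would be a gene's expression profile across conditions, and the resulting clusters are candidates for shared biological function.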

Abstract:
Recently Dutta and Bhattacharya (2013) introduced a novel Markov Chain Monte Carlo methodology that can simultaneously update all the components of high dimensional parameters using simple deterministic transformations of a one-dimensional random variable drawn from any arbitrary distribution defined on a relevant support. The methodology, which the authors refer to as Transformation-based Markov Chain Monte Carlo (TMCMC), greatly enhances computational speed and acceptance rate in high-dimensional problems. Two significant transformations associated with TMCMC are additive and multiplicative transformations. Combinations of additive and multiplicative transformations are also of much interest. In this work we investigate geometric ergodicity associated with additive and multiplicative TMCMC, along with their combinations, and illustrate their efficiency in practice with simulation studies.
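The additive transformation mentioned above can be sketched as follows: a single scalar draw plus independent random signs update all coordinates at once, and for the additive move the Metropolis acceptance ratio reduces to the ratio of target densities. The |N(0,1)| draw and per-coordinate scales here are illustrative assumptions.

```python
import math
import random

def additive_tmcmc(log_target, x0, scales, n_iter, seed=0):
    """Additive TMCMC sketch: every coordinate is updated with one scalar
    epsilon = |N(0,1)| and independent random signs b_i in {-1, +1}.
    For the additive transformation the acceptance probability is
    min(1, pi(x') / pi(x))."""
    rng = random.Random(seed)
    x = list(x0)
    chain, accepted = [], 0
    for _ in range(n_iter):
        eps = abs(rng.gauss(0.0, 1.0))  # single one-dimensional draw
        prop = [xi + rng.choice((-1, 1)) * s * eps
                for xi, s in zip(x, scales)]
        if rng.random() < math.exp(min(0.0, log_target(prop) - log_target(x))):
            x, accepted = prop, accepted + 1
        chain.append(list(x))
    return chain, accepted / n_iter
```

A multiplicative move would instead rescale coordinates by the scalar draw, which introduces a Jacobian term in the acceptance ratio.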

Abstract:
The study of diffusion limits of the Metropolis-Hastings algorithm in high dimensions yields useful quantification of the scaling of the underlying proposal distribution in terms of the dimensionality. Here we consider the recently introduced Transformation-based Markov Chain Monte Carlo (TMCMC) (Dutta and Bhattacharya (2013a)), a methodology designed to update all the parameters simultaneously using some simple deterministic transformation of a one-dimensional random variable drawn from some arbitrary distribution on a relevant support. The additive transformation based TMCMC is similar in spirit to random walk Metropolis, except that, unlike the latter, additive TMCMC uses a single draw from a one-dimensional proposal distribution to update the high-dimensional parameter. In this paper, we study the diffusion limits of additive TMCMC under various set-ups ranging from the product structure of the target density to the case where the target is absolutely continuous with respect to a Gaussian measure; we also consider the additive TMCMC within Gibbs approach for all the above set-ups. These investigations lead to appropriate scaling of the one-dimensional proposal density. We also show that the optimal acceptance rate of additive TMCMC is 0.439 under all the aforementioned set-ups, in contrast with the well-established 0.234 acceptance rate associated with optimal random walk Metropolis algorithms under the same set-ups. We also elucidate the ramifications of our results and the advantages of additive TMCMC over random walk Metropolis with ample simulation studies. In all the set-ups we considered, namely the iid component set-up, independently scaled components, a wide range of dependent set-ups, and a real spatial data application, TMCMC performed much better and converged to the target density much faster than the RWMH algorithm.

Abstract:
This study was carried out to investigate the therapeutic role of the ethanolic extract of Pleurotus cornucopiae on sodium arsenite induced nephrotoxicity in rats. Sodium arsenite at a dose of 8 mg•kg^{–1} body weight orally caused renal damage in rats, as manifested by the significant rise in serum urea, uric acid, and creatinine levels compared with the control. The ethanolic extract of P. cornucopiae (400 mg•kg^{–1} body weight per day) was administered orally for 30 days to sodium arsenite pre-treated rats. The results show a significant decrease in the serum urea, uric acid, and creatinine levels in comparison to the arsenic-treated group, denoting the nephroprotective effect of P. cornucopiae against sodium arsenite induced toxicity. Furthermore, it also possesses an antioxidant effect, as lipid peroxidation (MDA) levels decreased in the P. cornucopiae treated group in comparison to the arsenic-treated group. Thus, the present study reveals that P. cornucopiae possesses nephroprotective as well as antioxidant properties against arsenic induced toxicity.

Abstract:
Historical research can also mean gathering data from situations that have already occurred and performing statistical analysis on this data, just as we would in a traditional experiment. The one key difference between this type of research and the type described earlier concerns the manipulation of data. Since historical research relies on data from the past, there is no way to manipulate it. Studying the grades of older and younger students, for example, may provide some insight into the differences between these two groups, but manipulating their experience is impossible. Therefore, historical research can often lead to present-day experiments that attempt to further explore what has occurred in the past. The historical method comprises the techniques and guidelines by which historians use primary sources and other evidence to research and then to write histories in the form of accounts of the past. The question of the nature, and even the possibility, of a sound historical method is raised in the philosophy of history.