Abstract:
Background: Bleaching desiccates enamel, making it more susceptible to stain absorption. This study assesses whether surface treatment of freshly bleached enamel with fluoride or casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) reduces stain absorption. Aims: The study aims to evaluate tea stain absorption on the freshly bleached enamel surfaces of extracted human teeth after different surface treatments. Stain absorption was evaluated at one hour and at 24 hours post bleaching. Materials and Methods: Forty extracted human permanent maxillary central incisors were bleached with 10% carbamide peroxide for eight days and divided into four groups of 10 each. Group I served as the control. Group II was immersed in tea solution without surface treatment, while Groups III and IV were immersed in tea solution after surface treatment with topical fluoride and CPP-ACP, respectively. A spectrophotometer was used for color analysis. Results: Surface treatment of the freshly bleached enamel surface with CPP-ACP or topical fluoride significantly reduced stain absorption. Conclusion: Remineralizing agents reduce stain absorption after tooth bleaching.

Abstract:
Many statistical problems involve data from thousands of parallel cases. Each case has some associated effect size, and most cases will have no effect. It is often important to estimate the effect size and the local or tail-area false discovery rate for each case. Most current methods do this separately, and most are designed for normal data. This paper uses an empirical Bayes mixture model approach to estimate both quantities together for exponential family data. The proposed method yields simple, interpretable models that can still be used nonparametrically. It can also estimate an empirical null and incorporate it fully into the model. The method outperforms existing effect size and false discovery rate estimation procedures in normal data simulations; it nearly achieves the Bayes error for effect size estimation. The method is implemented in an R package (mixfdr), freely available from CRAN.
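The two-group empirical Bayes idea behind such procedures can be sketched in a few lines. The sketch below is an illustration of the general approach only (a normal mixture fit by EM, with the local false discovery rate and posterior effect size read off from the fitted model), not the mixfdr implementation; the simulation settings (90% nulls, non-null effects centered near 3) are assumed for illustration.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulate z-values: 90% nulls (mu = 0), 10% non-nulls (mu ~ N(3, 1)).
n = 5000
is_null = rng.random(n) < 0.9
mu = np.where(is_null, 0.0, rng.normal(3.0, 1.0, n))
z = rng.normal(mu, 1.0)

# Two-group mixture: f(z) = pi0 * N(0, 1) + (1 - pi0) * N(m1, v1),
# with the null density fixed and (pi0, m1, v1) fit by EM.
pi0, m1, v1 = 0.5, 1.0, 2.0
for _ in range(200):
    f0 = norm.pdf(z, 0.0, 1.0)
    f1 = norm.pdf(z, m1, np.sqrt(v1))
    r = pi0 * f0 / (pi0 * f0 + (1 - pi0) * f1)  # P(null | z): the local fdr
    pi0 = r.mean()
    w = 1.0 - r
    m1 = np.sum(w * z) / np.sum(w)
    v1 = np.sum(w * (z - m1) ** 2) / np.sum(w)

# Local fdr and posterior effect size from the fitted mixture.
f0 = norm.pdf(z, 0.0, 1.0)
f1 = norm.pdf(z, m1, np.sqrt(v1))
fdr = pi0 * f0 / (pi0 * f0 + (1 - pi0) * f1)
tau2 = max(v1 - 1.0, 1e-6)                # prior variance of non-null mu
post_nonnull = (tau2 * z + m1) / (tau2 + 1.0)  # E(mu | z, non-null)
effect_est = (1.0 - fdr) * post_nonnull        # E(mu | z)
```

Both quantities come from the one fitted model: the local fdr is the posterior null probability, and the effect size estimate shrinks z toward zero by exactly that probability.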

Abstract:
Developmental and growth rates are known to vary in response to genetic, developmental, physiological and environmental factors. However, the developmental variation that exists within a cohort under constant rearing conditions is not well investigated. A few prominent polymorphisms of this kind have been studied, but not the subtle ones. The current study investigates the presence of such varying rates of development, slow and fast, within cohorts reared under constant conditions in two ladybirds, Cheilomenes sexmaculata and Propylea dissecta. Our results reveal slow and fast developers in the cohorts of each species, and the ratio of slow to fast developers was similar in both. Slow developers showed a female-biased sex ratio. The two developmental variants differed significantly in juvenile duration only in the first instar and the pupal stage, though variation in developmental time was observed in all stages. Fecundity was higher in slow developers, but developmental rate did not affect egg viability. The similar ratio in both ladybirds suggests either a constant ratio across species or an effect of the shared rearing environment.

Abstract:
Estimator algorithms in learning automata are useful tools for adaptive, real-time optimization in computer science and engineering applications. This paper investigates theoretical convergence properties for a special case of estimator algorithms: the pursuit learning algorithm. In this note, we identify and fill a gap in existing proofs of probabilistic convergence for pursuit learning. It is traditional to keep the pursuit learning tuning parameter fixed in practical applications, but our proof sheds light on the importance of a vanishing sequence of tuning parameters in a theoretical convergence analysis.
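A minimal pursuit learning sketch makes the role of the tuning parameter concrete. Everything below is assumed for illustration (a two-action Bernoulli environment, a particular vanishing schedule, and an initial sampling phase); it is not the proof's construction, only the standard pursuit update with a decaying step size in place of a fixed one.

```python
import numpy as np

rng = np.random.default_rng(1)
true_p = np.array([0.3, 0.8])   # assumed Bernoulli reward probabilities
k = len(true_p)
p = np.full(k, 1.0 / k)         # action-selection probability vector
est = np.zeros(k)               # running reward estimates
counts = np.zeros(k)

# Initialize the estimates by sampling each action a few times.
for a in range(k):
    for _ in range(10):
        counts[a] += 1
        est[a] += (float(rng.random() < true_p[a]) - est[a]) / counts[a]

for t in range(1, 5001):
    lam = 1.0 / (10 + t) ** 0.6          # vanishing tuning parameter
    a = rng.choice(k, p=p)               # act according to p
    counts[a] += 1
    est[a] += (float(rng.random() < true_p[a]) - est[a]) / counts[a]
    e = np.zeros(k)
    e[np.argmax(est)] = 1.0              # unit vector at the current best estimate
    p = (1 - lam) * p + lam * e          # "pursue" the best-looking action
```

With a fixed lambda the probability vector can lock onto a wrong action with positive probability; a vanishing schedule keeps early exploration alive while still letting p concentrate on the best action.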

Abstract:
Many large-scale machine learning problems involve estimating an unknown parameter $\theta_{i}$ for each of many items. For example, a key problem in sponsored search is to estimate the click through rate (CTR) of each of billions of query-ad pairs. Most common methods, though, only give a point estimate of each $\theta_{i}$. A posterior distribution for each $\theta_{i}$ is usually more useful but harder to get. We present a simple post-processing technique that takes point estimates or scores $t_{i}$ (from any method) and estimates an approximate posterior for each $\theta_{i}$. We build on the idea of calibration, a common post-processing technique that estimates $\mathrm{E}(\theta_{i} \mid t_{i})$. Our method, second order calibration, uses empirical Bayes methods to estimate the distribution of $\theta_{i} \mid t_{i}$ and uses the estimated distribution as an approximation to the posterior distribution of $\theta_{i}$. We show that this can yield improved point estimates and useful accuracy estimates. The method scales to large problems: our motivating example is a CTR estimation problem involving tens of billions of query-ad pairs.
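An illustrative beta-binomial sketch of the idea follows. This is not the paper's exact procedure: for simplicity the score $t_i$ is the raw click rate and a single prior is fit by method of moments (the empirical Bayes step) rather than one per score bucket, and the data-generating settings are assumed.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000
theta = rng.beta(2, 50, n)                 # true per-item CTRs (assumed)
impressions = rng.poisson(200, n) + 1
clicks = rng.binomial(impressions, theta)

# Score t_i: the raw click rate (a point estimate of theta_i).
t = clicks / impressions

# Empirical Bayes step: fit a Beta prior for theta by method of moments,
# subtracting the average binomial noise from the observed variance of t.
m, v = t.mean(), t.var()
noise = np.mean(t * (1 - t) / impressions)
v_theta = max(v - noise, 1e-8)             # moment estimate of Var(theta)
s = m * (1 - m) / v_theta - 1              # Beta "prior sample size" a + b
a, b = m * s, (1 - m) * s

# Approximate posterior for each item: Beta(a + clicks, b + misses).
post_mean = (a + clicks) / (s + impressions)
post_var = post_mean * (1 - post_mean) / (s + impressions + 1)
```

Unlike first order calibration, which only adjusts the point estimate, the fitted distribution also yields a per-item variance, i.e. an accuracy estimate for each $\theta_i$.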

Abstract:
Aim: To evaluate the radiopacity of fish bones
from a number of species using digital radiography in order to establish
whether advances in acquisition and interpretative techniques have affected the
radiologist’s ability to detect impacted fish bones. Methods: The bones from
six species of fish commonly consumed in the United Kingdom were radiographed
using a soft tissue neck phantom by means of a digital radiographic X-ray
tube. The images were reviewed by 15 radiology consultants and registrars, who
determined whether the bones were visible using General Electric (GE)
PACS workstations. Results: The radiographed bones from all six species of
fish were visible to all 15 (100%) radiology registrars and consultants.
Conclusion: Digital radiography and modern PACS workstations have meant that
fish bones can be visualized irrespective of species. The lateral neck radiograph therefore may still have an important role in the investigation of
impacted fish bones in the aerodigestive tract.

Abstract:
Calibration is a basic property for prediction systems, and algorithms for achieving it are well-studied in both statistics and machine learning. In many applications, however, the predictions are used to make decisions that select which observations are made. This makes calibration difficult, as adjusting predictions to achieve calibration changes future data. We focus on click-through-rate (CTR) prediction for search ad auctions. Here, CTR predictions are used by an auction that determines which ads are shown, and we want to maximize the value generated by the auction. We show that certain natural notions of calibration can be impossible to achieve, depending on the details of the auction. We also show that it can be impossible to maximize auction efficiency while using calibrated predictions. Finally, we give conditions under which calibration is achievable and simultaneously maximizes auction efficiency: roughly speaking, bids and queries must not contain information about CTRs that is not already captured by the predictions.
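The feedback problem can be seen in a small simulation. The setup below is assumed for illustration and is not taken from the paper: predictions are calibrated over the full population of ads, but bids carry information about CTR that the predictions miss, so the subpopulation the auction selects is no longer calibrated.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
pred = rng.uniform(0.02, 0.2, n)                      # CTR predictions
theta = np.clip(pred + rng.normal(0, 0.02, n), 0, 1)  # true CTRs: E[theta | pred] ~ pred

# Bids leak information about theta that the predictions do not capture.
bid = 1.0 + 8.0 * (theta - pred) + rng.normal(0, 0.05, n)

# The auction shows an ad when its expected value pred * bid clears a reserve.
shown = pred * bid > 0.12

gap_all = theta.mean() - pred.mean()                   # ~0: calibrated pre-auction
gap_shown = theta[shown].mean() - pred[shown].mean()   # > 0: selection bias
```

Among shown ads, realized CTR systematically exceeds the prediction; recalibrating on the shown traffic would shift the predictions, change which ads win, and reintroduce the gap, which is the difficulty the abstract describes.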