Abstract:
Objective: Here, we use pattern classification to investigate diagnostic information for multiple sclerosis (MS; relapsing-remitting type) in lesioned areas, areas of normal-appearing grey matter (NAGM), and normal-appearing white matter (NAWM) as measured by standard MR techniques. Methods: Lesion mapping was carried out by an experienced neurologist on Turbo Inversion Recovery Magnitude (TIRM) images of individual subjects. Combining this mapping with templates from a neuroanatomic atlas, the TIRM images were segmented after spatial standardization into three areas of homogeneous tissue type (lesions, NAGM, and NAWM). For each area, a linear Support Vector Machine algorithm was used in multiple local classification analyses to determine the diagnostic accuracy in separating MS patients from healthy controls based on voxel tissue intensity patterns extracted from small spherical subregions of these larger areas. To control for covariates, we also excluded group-specific biases in deformation fields as a potential source of information. Results: Among regions containing lesions, a posterior parietal WM area was maximally informative about the clinical status (96% accuracy, p<10⁻¹³). Cerebellar regions were maximally informative among NAGM areas (84% accuracy, p<10⁻⁷). A posterior brain region was maximally informative among NAWM areas (91% accuracy, p<10⁻¹⁰). Interpretation: We identified regions indicative of MS not only in lesioned areas but also in NAGM and NAWM. This complements the current perception that standard MR techniques mainly capture macroscopic tissue variations due to focal lesion processes. Compared to current diagnostic guidelines for MS, which define areas of diagnostic information with moderate spatial specificity, we identified hotspots of MS-associated tissue alterations with high specificity, defined on a millimeter scale.
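The local (searchlight-style) classification described above can be sketched as follows. This is an illustrative reconstruction on synthetic data, not the study's pipeline; the sample sizes, voxel counts, and intensity shift are all made up.

```python
# Minimal sketch of one local linear-SVM classification analysis for a
# single spherical subregion; all data here are synthetic placeholders,
# not the study's MR intensities.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# 40 subjects (20 patients, 20 controls), 33 voxels in one sphere.
n_per_group, n_voxels = 20, 33
controls = rng.normal(0.0, 1.0, size=(n_per_group, n_voxels))
patients = rng.normal(0.6, 1.0, size=(n_per_group, n_voxels))  # shifted intensities
X = np.vstack([controls, patients])
y = np.array([0] * n_per_group + [1] * n_per_group)

# Linear SVM; cross-validated accuracy estimates the diagnostic
# information carried by this sphere's intensity pattern.
clf = LinearSVC(C=1.0, dual=False)
scores = cross_val_score(clf, X, y, cv=5)
accuracy = scores.mean()
print(round(accuracy, 2))
```

Repeating this for spheres centred on every voxel of a tissue area yields a local accuracy map, from which the most informative regions can be read off.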

Abstract:
We conducted a cross-sectional survey in a random sample of adults ≥40 years of age living in Pampas de San Juan de Miraflores, Lima, Peru. We asked participants to respond to a survey that included questions on sociodemographics, tobacco use, and dependence.

We enrolled 316 participants. Average monthly household income was ≤400 USD, and nearly all homes had running water, sewage, and electricity. Most individuals had not completed high school. Smoking prevalence was 16% overall, yet daily smoking prevalence was 1.9%. Former daily smokers comprised 3.8% of current nonsmokers and 9.1% of current occasional smokers. Average scores on the Fagerstrom Test for Nicotine Dependence for daily smokers and occasional smokers were 1.5 and 0, respectively.

Daily use of tobacco is uncommon among adults in peri-urban communities of Lima, Peru, unlike their counterparts in Lima and other Latin American capital cities. Tobacco dependence is also low. Hence, efforts aimed at primary prevention are of utmost importance in these communities. This study provides an accurate baseline using an internationally recognized assessment tool (Global Adult Tobacco Survey), allowing for accurate assessment of tobacco control interventions over time.

Tobacco use is the leading cause of preventable death in the world, resulting in millions of deaths annually, more than HIV/AIDS, tuberculosis, and malaria [1]. Tobacco smoking is an important public health concern worldwide, leading to pulmonary disease; various cancers, including those of the respiratory, digestive, and genitourinary systems; certain forms of leukemia; and premature death [2]. Tobacco smoking causes over half of all avoidable deaths worldwide [3]. It accounted for an estimated four to five million deaths per year by 2000 [4] and contributed to an estimated 4.1% of years of life lost [5]. Low- and middle-income countries comprise 82% of the world smoking population, consume 74% of the total number of inhaled tobacco products consumed ea

Abstract:
The yield map is generated by fitting the yield surface to yield monitor data, mainly using paraboloid cones on floating neighborhoods. Each yield map value is determined by the fit of such a cone on an elliptical neighborhood that is wider across the harvest tracks than along them. The regression coefficients for modeling the paraboloid cones and the scale parameter are estimated using robust weighted M-estimators, where the weights decrease quadratically from 1 at the middle to zero at the border of the selected neighborhood. This robust way of estimating the model parameters makes a separate outlier-detection procedure unnecessary. For a given neighborhood shape, this yield mapping method is implemented by the Fortran program paraboloidmapping.exe, which can be downloaded from the web. The size of the selected neighborhood is considered appropriate if the variance of the yield map values equals the variance of the true yields, i.e., the difference between the variance of the raw yield data and the error variance of the yield monitor. The error variance is estimated using a robust variogram on data that have not had the trend removed.
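A single local fit of this kind can be sketched as follows. The implementation is Fortran (paraboloidmapping.exe); this Python version is only illustrative, with made-up data, and uses a Huber-type M-estimator via iteratively reweighted least squares as one plausible realization of the robust weighting.

```python
# Rough sketch of one local paraboloid fit with spatial weights times
# robust (Huber-type) M-estimation weights; synthetic yield data.
import numpy as np

rng = np.random.default_rng(1)

# Points around a map node at (0, 0); the elliptical neighborhood is
# wider across the harvest track (x) than along it (y).
a, b = 6.0, 3.0                      # ellipse semi-axes (m), assumed values
x = rng.uniform(-a, a, 400)
y = rng.uniform(-b, b, 400)
inside = (x / a) ** 2 + (y / b) ** 2 <= 1.0
x, y = x[inside], y[inside]
z = 8.0 - 0.05 * x**2 - 0.02 * y**2 + rng.normal(0, 0.3, x.size)
z[:5] += 4.0                          # a few gross outliers

# Spatial weights: 1 at the centre, decreasing quadratically to 0
# at the border of the ellipse.
w_spatial = 1.0 - ((x / a) ** 2 + (y / b) ** 2)

# Design matrix for the paraboloid z = c0 + c1*x + c2*y + c3*x^2 + c4*y^2.
A = np.column_stack([np.ones_like(x), x, y, x**2, y**2])

beta = np.zeros(5)
for _ in range(20):                   # IRLS for the Huber M-estimator
    r = z - A @ beta
    s = 1.4826 * np.median(np.abs(r - np.median(r)))  # robust scale (MAD)
    k = 1.345 * s
    w_rob = np.minimum(1.0, k / np.maximum(np.abs(r), 1e-12))
    w = w_spatial * w_rob
    W = A * w[:, None]                # weighted design: diag(w) @ A
    beta = np.linalg.solve(A.T @ W, A.T @ (w * z))

map_value = beta[0]                   # fitted yield at the node (0, 0)
print(round(map_value, 2))
```

Despite the outliers, the downweighting keeps the fitted node value close to the true local surface level, which is the point of replacing explicit outlier detection with robust estimation.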

Abstract:
Nitrogen rate trials are often performed to determine the economically optimum N application rate. For this purpose, yield is modeled as a function of the N application. The regression analysis provides an estimate of the modeled function and thus also an estimate of the economic optimum, N_{opt}. Quantifying the accuracy of such estimates by confidence intervals for N_{opt} is subject to the model assumptions, and the dependence on these assumptions is itself a further source of inaccuracy. The N_{opt} estimate also depends strongly on the N level design, i.e., the range of N rates over which the model is fitted. A narrow range around the supposed N_{opt} diminishes the dependence on the model assumptions but lengthens the confidence interval. The investigation of the impact of these sources on the inaccuracy of the N_{opt} estimate relies on N rate trials on the experimental field Sieblerfeld (Bavaria). The models applied are the quadratic and the linear-plus-plateau yield regression models.
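For the quadratic model, the route from fitted coefficients to N_{opt} can be sketched as follows. The data, prices, and N levels below are invented for illustration and are not the Sieblerfeld trial values.

```python
# Illustrative sketch: fit a quadratic yield response Y = b0 + b1*N + b2*N^2
# and derive the economic optimum N rate, where the marginal yield value
# equals the N price. All numbers are assumed, not trial data.
import numpy as np

rng = np.random.default_rng(2)

# N levels (kg/ha), 4 replicates each, with a noisy quadratic response (t/ha).
N = np.repeat([0, 60, 120, 180, 240], 4).astype(float)
yield_t = 4.0 + 0.030 * N - 0.00008 * N**2 + rng.normal(0, 0.15, N.size)

# Ordinary least squares; polyfit returns coefficients highest degree first.
b2, b1, b0 = np.polyfit(N, yield_t, 2)

# Economic optimum: dY/dN = b1 + 2*b2*N equals the price ratio
# (price of N per kg divided by price of yield per t), assumed here.
price_ratio = 0.005                   # t of yield per kg of N
N_opt = (price_ratio - b1) / (2.0 * b2)
print(round(N_opt, 1))
```

The sensitivity discussed in the abstract is visible here: N_opt is a ratio of estimated coefficients, so small changes in b1 and b2 (driven by the noise and by which N levels are included in the fit) move the estimate and widen its confidence interval.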

Abstract:
The House of Asterion is a short story by Jorge Luis Borges that retells the classical myth of the Cretan Minotaur from an alternate perspective. The House of Asterion features the Minotaur, also known as Asterion, who waits for “redemption” in his labyrinth. Many literary critics have suggested that the Borgesian labyrinth is a metaphor for human existence and the universe itself. Others have correctly interpreted Asterion’s ironic death at the hands of Theseus as his eagerly awaited redemption. Borges’ subversion of the reader’s expectations becomes the departure point for a systemic functional stylistic analysis of the story in one of its English translations, revealing how deeper-level meanings in the text are construed through its lexicogrammatical structure. A systemic functional stylistic reading suggests that on a higher level of reality, Asterion’s redemption is not only the freedom that death affords, but also a transformation that transcends his fictional universe. Asterion’s twofold redemption is brought about not only by the archetypal hero Theseus but also by the reader, who through the process of reading enables Asterion’s emancipation from the labyrinth.

Abstract:
Many biodiversity researchers have responded to the financial constraints faced by policy makers by developing models based upon the “Noah’s Ark” metaphor, implying that society can save only a limited amount of biodiversity. Unfortunately, as Herman Daly (Land Economics, 1991) pointed out, such microeconomic rules can allow an ark to sink, albeit in some optimal fashion. So, I step back to look at the macroeconomic question: how big should the ark be? I start with Norgaard’s (Ecological Economics, 2010) framework, which is based upon the concept of a production possibility frontier combined with a sustainability criterion. I develop a model from that starting point by shifting to an isoquant framework while maintaining the strong sustainability criterion. I demonstrate how this model allows for identifying and addressing the key biodiversity protection policy criteria at the macroeconomic level. One key conclusion from this modeling is that Daly’s analysis remains remarkably prescient.

Abstract:
The Navier-Stokes equations for incompressible fluid flows with an impervious boundary and a free surface are analyzed by means of a perturbation procedure involving dimensionless variables and a dimensionless perturbation parameter composed of the kinematic viscosity of the fluid, the acceleration of gravity, and a characteristic length. The new dimensionless variables are introduced into the equation system. In addition, the perturbation parameter is introduced into the terms for deriving approximation systems of different orders. Such systems are obtained by equating coefficients of like powers of the perturbation parameter in the series. In these systems, several terms are analyzed with regard to size and significance. Based on these systems, suitable solutions of the NS equations can be found for different boundary conditions. For example, a relation for stationary channel flow is obtained as a lowest-order approximation to the NS equations after transformation back to dimensional variables.
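As a sketch of the setup, one dimensionally consistent choice of the perturbation parameter built from the stated quantities is the following; this particular definition is an assumption here, and the paper's exact form may differ:

```latex
% Perturbation parameter from kinematic viscosity \nu [m^2 s^{-1}],
% gravity g [m s^{-2}], and characteristic length L [m]:
\varepsilon = \frac{\nu^2}{g L^3},
\qquad
[\varepsilon] = \frac{\mathrm{m}^4\,\mathrm{s}^{-2}}{\mathrm{m}\,\mathrm{s}^{-2}\cdot \mathrm{m}^3} = 1 .
% The dependent variables are expanded in powers of \varepsilon,
\mathbf{u} = \mathbf{u}_0 + \varepsilon\,\mathbf{u}_1 + \varepsilon^2\,\mathbf{u}_2 + \dots,
\qquad
p = p_0 + \varepsilon\,p_1 + \varepsilon^2\,p_2 + \dots,
% and the approximation systems of successive order follow by equating
% coefficients of like powers of \varepsilon.
```

Whatever the precise definition, the essential property is that the parameter is dimensionless, so that equating coefficients of its like powers yields well-posed approximation systems order by order.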

Abstract:
In this paper, I suggest a possible explanation for the accelerating expansion of the universe. This model does not require any dark energy or quintessence. Rather, the idea is to suggest a different view on the origin of general relativity. Since it is very difficult to say something in general, I will mainly restrict myself to the case of very low curvature. The question about the underlying reasons for the acceleration is also closely related to the question of whether the universe is finite or infinite. It is part of the purpose of this paper to argue that a phase of accelerating expansion may very well be compatible with the idea of a closed universe.

Abstract:
In this paper, a simple model for a closed multiverse as a finite probability space is analyzed. For each moment of time on a discrete time-scale, only a finite number of states is possible, and hence each possible universe can be viewed as a path in a huge but finite graph. By considering very general statistical assumptions, essentially originating from Boltzmann, we make the set of all such paths (the multiverse) into a probability space, and argue that under certain assumptions the probability of a monotonic behavior of the entropy is vastly larger than that of a behavior with low entropy at both ends. The methods used are just very simple combinatorial ones, but the conclusion suggests that we may live in a multiverse which, from a global point of view, is completely time-symmetric in the sense that universes with Time’s Arrow directed forwards and backwards are equally probable. However, for an observer confined to just one universe, time will still be asymmetric.
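The flavor of this Boltzmann-style counting argument can be illustrated with a toy model; the construction below (an Ehrenfest-type coin-flip chain) is my own illustration, not the paper's multiverse graph.

```python
# Toy illustration of the counting argument: paths starting in a
# low-entropy macrostate overwhelmingly end near the high-entropy
# macrostate rather than returning to low entropy at the other end.
import numpy as np

rng = np.random.default_rng(3)
n, steps, trials = 20, 200, 5000      # coins, path length, sampled paths

end_low, end_high = 0, 0
for _ in range(trials):
    k = 0                             # tails count; k = 0 is the low-entropy start
    for _ in range(steps):
        # flip one uniformly chosen coin: k moves up or down accordingly
        k += 1 if rng.random() < (n - k) / n else -1
    if k <= 2 or k >= n - 2:
        end_low += 1                  # low entropy at both ends of the path
    elif abs(k - n // 2) <= 2:
        end_high += 1                 # ends near the maximum-entropy macrostate

print(end_high > end_low)
```

Because the number of microstates with k tails is the binomial coefficient C(n, k), paths ending near k = n/2 vastly outnumber those returning to the boundary, which is the combinatorial asymmetry the abstract appeals to.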

Abstract:
In this paper, we investigate a certain property of curvature which differs in a remarkable way between Lorentz geometry and Euclidean geometry. In a certain sense, it turns out that rotating topological objects may have less curvature (as measured by integrating the square of the scalar curvature) than non-rotating ones. This is a consequence of the indefinite metric used in relativity theory. The results in this paper are mainly based on computer computations, and so far there is no satisfactory underlying mathematical theory. Some open problems are presented.