Abstract:
Terms related to precancers were extracted from the UMLS (Unified Medical Language System). The extracted terms were reviewed and additional terms were added. Each precancer was assigned to one of six general classes. The entire classification was assembled as an XML (eXtensible Mark-up Language) file, and a Perl script converted the XML file into a browser-viewable HTML (HyperText Mark-up Language) file. The classification contained 4700 precancer terms, 568 distinct precancer concepts and six precancer classes: 1) acquired microscopic precancers; 2) acquired large lesions with microscopic atypia; 3) precursor lesions occurring with inherited hyperplastic syndromes that progress to cancer; 4) acquired diffuse hyperplasias and diffuse metaplasias; 5) currently unclassified entities; and 6) superclass and modifiers. This work represents the first attempt to create a comprehensive listing of the precancers, the first attempt to classify precancers by their biological properties, and the first attempt to create a pathologic classification of precancers using standard metadata (XML). The classification is placed in the public domain; the authors invite comment and are prepared to curate and modify the classification.

Premalignant lesions are arguably the most important disease entities of modern man. In theory, the identification and elimination of cancer precursors would lead to the near-eradication of cancer [1]. The importance of the precancers was recently emphasized by the American Association for Cancer Research Task Force on the Treatment and Prevention of Intraepithelial Neoplasia [2]. In this report, the Task Force recognized IEN (intraepithelial neoplasia) as a near-obligate precursor to invasive cancer and identified IEN as a treatable disease: "Reducing IEN burden, therefore, is an important and suitable goal for medical (noninvasive) intervention to reduce invasive cancer risk and to reduce surgical morbidity. Achieving the prevention and regression of IEN confers an
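The abstract states that the authors used a Perl script to render the XML classification as browsable HTML; the script itself is not given. As an illustration only, the transformation can be sketched in Python, assuming a hypothetical schema — the `<class>`, `<concept>`, and `<term>` element names below are invented for the sketch and are not the published file's actual element names:

```python
import xml.etree.ElementTree as ET
from xml.sax.saxutils import escape

# Hypothetical schema: element names here are illustrative only,
# not those of the authors' published classification file.
SAMPLE_XML = """
<classification>
  <class name="Acquired microscopic precancers">
    <concept name="Actinic keratosis">
      <term>actinic keratosis</term>
      <term>solar keratosis</term>
    </concept>
  </class>
</classification>
"""

def classification_to_html(xml_text: str) -> str:
    """Render a classification XML document as a simple nested HTML page."""
    root = ET.fromstring(xml_text)
    parts = ["<html><body><h1>Precancer classification</h1>"]
    for cls in root.findall("class"):
        # One <h2> per precancer class, with its concepts as a bulleted list.
        parts.append(f"<h2>{escape(cls.get('name'))}</h2><ul>")
        for concept in cls.findall("concept"):
            terms = ", ".join(escape(t.text) for t in concept.findall("term"))
            parts.append(f"<li><b>{escape(concept.get('name'))}</b>: {terms}</li>")
        parts.append("</ul>")
    parts.append("</body></html>")
    return "\n".join(parts)

html = classification_to_html(SAMPLE_XML)
```

The point of the design — also visible in the authors' XML/Perl pipeline — is that the curated classification lives in one machine-readable master file, and the human-readable view is regenerated from it rather than edited by hand.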

Abstract:
Accurate prediction of survival rates of cancer patients is often key to stratifying patients for prognosis and treatment. Survival prediction is often accomplished by the TNM system, which involves only three factors: tumor extent, lymph node involvement, and metastasis. Prediction from the TNM system is limited, however, because other potential prognostic factors are not used in the system. Given the availability of large cancer datasets, it is possible to establish powerful prediction systems using machine learning procedures and statistical methods. In this paper, we present an ensemble clustering-based approach to developing prognostic systems for cancer patients. Our method starts by grouping the combinations formed from the levels of factors recorded in the data. The dissimilarity measure between combinations is obtained from a sequence of data partitions produced by repeated application of the PAM (Partitioning Around Medoids) algorithm. This dissimilarity measure is then used with a hierarchical clustering method to find clusters of combinations. Prediction of survival is made simply by using the survival function derived from each cluster. Our approach admits multiple factors and provides a practical and useful tool for outcome prediction in cancer patients. The proposed method is demonstrated on lung cancer patients.
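The pipeline described above — repeated PAM partitions, a co-clustering dissimilarity, then hierarchical clustering — can be sketched as follows. This is not the authors' implementation: `simple_kmedoids` is a crude k-medoids (PAM proper adds a swap phase), the factor-combination data is synthetic, and the final per-cluster survival functions (e.g. Kaplan-Meier estimates) are omitted.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def simple_kmedoids(D, k, rng, n_iter=50):
    """Crude k-medoids partitioner over a distance matrix D (PAM adds a swap phase)."""
    n = D.shape[0]
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)  # assign each row to nearest medoid
        new = np.array([np.flatnonzero(labels == j)[
                np.argmin(D[np.ix_(labels == j, labels == j)].sum(axis=1))]
                for j in range(k)])                # recompute each cluster's medoid
        if set(new) == set(medoids):
            break
        medoids = new
    return labels

def ensemble_dissimilarity(X, ks=(2, 3), runs_per_k=10, seed=0):
    """Fraction of ensemble partitions placing two combinations in different clusters."""
    rng = np.random.default_rng(seed)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    diff, total = np.zeros((len(X), len(X))), 0
    for k in ks:
        for _ in range(runs_per_k):
            labels = simple_kmedoids(D, k, rng)
            diff += (labels[:, None] != labels[None, :])
            total += 1
    return diff / total

# Synthetic "combinations": two well-separated groups of factor-level profiles.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (6, 3)), rng.normal(10.0, 0.1, (6, 3))])
D_ens = ensemble_dissimilarity(X)

# Hierarchical clustering on the ensemble dissimilarity, cut into two clusters.
clusters = fcluster(linkage(squareform(D_ens, checks=False), method="average"),
                    t=2, criterion="maxclust")
```

In the full method, one would then pool the patients belonging to each cluster of combinations and estimate a survival function per cluster; a new patient's prognosis is read off the survival curve of the cluster containing their factor combination.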

Abstract:
A new procedure for coarse-graining dynamical triangulations is presented. The procedure provides a meaning for the relevant value of observables when "probing at large scales", e.g. the average scalar curvature. The scheme may also be useful as a starting point for a new type of renormalisation procedure, suitable for dynamically triangulated quantum gravity. Random Delaunay triangulations have previously been used to produce discretisations of continuous Euclidean manifolds, and the coarse-graining scheme is an extension of this idea, using random simplicial complexes produced from a dynamical triangulation. In order for a coarse-graining process to be useful, it should preserve the properties of the original dynamical triangulation that are relevant when probing at large scales. Some general discussion of this point is given, along with some arguments in favour of the proposed scheme.

Abstract:
This paper presents a brief review of some recent work on the causal set approach to quantum gravity. Causal sets are a discretisation of spacetime that allows the symmetries of general relativity (GR) to be preserved in the continuum approximation. One proposed application of causal sets is to use them as the histories in a quantum sum-over-histories, i.e. to construct a quantum theory of spacetime. It is expected by many that quantum gravity will introduce some kind of "fuzziness", uncertainty and perhaps discreteness into spacetime, and generic effects of this fuzziness are currently being sought. Applied as a model of discrete spacetime, causal sets can be used to construct simple phenomenological models which allow us to understand some of the consequences of this general expectation.

Abstract:
The principle of common cause is discussed as a possible fundamental principle of physics. Some revisions of Reichenbach's formulation of the principle are given, which lead to a version given by Bell. Various similar forms are compared, and some equivalence results are proved. The further problems of causality in a quantal system, and indeterministic causal structure, are addressed, with a view to defining a causality principle applicable to quantum gravity.

Abstract:
This paper reviews the histories approach to quantum mechanics. This discussion is then applied to theories of quantum gravity. It is argued that some of the quantum histories must approximate (in a suitable sense) to classical histories, if the correct classical regime is to be recovered. This observation has significance for the formulation of new theories (such as quantum gravity theories) as it puts a constraint on the kinematics, if the quantum/classical correspondence principle is to be preserved. Consequences for quantum gravity, particularly for Lorentz symmetry and the idea of "emergent geometry", are discussed.

Abstract:
This essay discusses the idea that a Theory of Everything would not be complete without a theory of consciousness as one of its parts, and the suggestion that new physics may be needed to describe consciousness. I argue that the motivations behind searching for such a theory arise as the result of misunderstandings over the use of language when talking about consciousness.

Abstract:
This paper addresses arguments that "separability" is an assumption of Bell's theorem, and that abandoning this assumption in our interpretation of quantum mechanics (a position sometimes referred to as "holism") will allow us to restore a satisfying locality principle. Separability here means that all events associated to the union of some set of disjoint regions are combinations of events associated to each region taken separately. In this article, it is shown that: (a) localised events can be consistently defined without implying separability; (b) the definition of Bell's locality condition does not rely on separability in any way; (c) the proof of Bell's theorem does not use separability as an assumption. If, inspired by considerations of non-separability, the assumptions of Bell's theorem are weakened, what remains no longer embodies the locality principle. Teller's argument for "relational holism" and Howard's arguments concerning separability are criticised in the light of these results. Howard's claim that Einstein grounded his arguments on the incompleteness of QM with a separability assumption is also challenged. Instead, Einstein is better interpreted as referring merely to the existence of localised events. Finally, it is argued that Bell rejected the idea that separability is an assumption of his theorem.

Abstract:
This article concerns the fate of local Lorentz invariance in quantum gravity, particularly for approaches in which a discrete structure replaces continuum spacetime. Some features of standard quantum mechanics, presented in a sum-over-histories formulation, are reviewed, and their consequences for such theories are discussed. It is argued that, if the individual histories of a theory give bad approximations to macroscopic continuum properties in some frames, then it is inevitable that the theory violates Lorentz symmetry.