Search Results: 1 - 10 of 170690 matches for "Donald E Henson"
All listed articles are free for downloading (OA Articles)
Page 1 /170690
Classifying the precancers: A metadata approach
Jules J Berman, Donald E Henson
BMC Medical Informatics and Decision Making , 2003, DOI: 10.1186/1472-6947-3-8
Abstract: Terms in the UMLS (Unified Medical Language System) related to precancers were extracted. Extracted terms were reviewed and additional terms added. Each precancer was assigned one of six general classes. The entire classification was assembled as an XML (Extensible Markup Language) file. A Perl script converted the XML file into a browser-viewable HTML (HyperText Markup Language) file. The classification contained 4700 precancer terms, 568 distinct precancer concepts and six precancer classes: 1) Acquired microscopic precancers; 2) Acquired large lesions with microscopic atypia; 3) Precursor lesions occurring with inherited hyperplastic syndromes that progress to cancer; 4) Acquired diffuse hyperplasias and diffuse metaplasias; 5) Currently unclassified entities; and 6) Superclass and modifiers. This work represents the first attempt to create a comprehensive listing of the precancers, the first attempt to classify precancers by their biological properties, and the first attempt to create a pathologic classification of precancers using standard metadata (XML). The classification is placed in the public domain, and comment is invited by the authors, who are prepared to curate and modify the classification. Premalignant lesions are arguably the most important disease entities of modern man. In theory, the identification and elimination of cancer precursors would lead to the near-eradication of cancer [1]. The importance of the precancers was recently emphasized by the American Association for Cancer Research Task Force on the Treatment and Prevention of Intraepithelial Neoplasia [2]. In this report, the Task Force recognized IEN [intraepithelial neoplasia] as a near-obligate precursor to invasive cancer and identified IEN as a treatable disease: "Reducing IEN burden, therefore, is an important and suitable goal for medical (noninvasive) intervention to reduce invasive cancer risk and to reduce surgical morbidity.
Achieving the prevention and regression of IEN confers an
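The XML-to-HTML conversion step described in the abstract was done with a Perl script; the sketch below is a minimal Python rendering of the same idea. The element names (`<class>`, `<term>`) and the sample terms are invented for illustration and are not taken from the actual classification file.

```python
# Minimal sketch of the XML -> HTML conversion step. The authors used
# a Perl script; element names and sample terms here are assumptions.
import xml.etree.ElementTree as ET

XML = """
<precancers>
  <class name="Acquired microscopic precancers">
    <term>actinic keratosis</term>
    <term>cervical intraepithelial neoplasia</term>
  </class>
  <class name="Currently unclassified entities">
    <term>atypical adenomatous hyperplasia</term>
  </class>
</precancers>
"""

def xml_to_html(xml_text: str) -> str:
    """Render each precancer class as a heading with a list of terms."""
    root = ET.fromstring(xml_text)
    parts = ["<html><body>"]
    for cls in root.findall("class"):
        parts.append(f"<h2>{cls.get('name')}</h2><ul>")
        for term in cls.findall("term"):
            parts.append(f"<li>{term.text}</li>")
        parts.append("</ul>")
    parts.append("</body></html>")
    return "\n".join(parts)

html = xml_to_html(XML)
```

The same nesting (classes containing terms) maps directly onto headings and lists, which is all the browser-viewable file described in the abstract needs.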
Developing Prognostic Systems of Cancer Patients by Ensemble Clustering
Dechang Chen,Kai Xing,Donald Henson,Li Sheng,Arnold M. Schwartz,Xiuzhen Cheng
Journal of Biomedicine and Biotechnology , 2009, DOI: 10.1155/2009/632786
Abstract: Accurate prediction of survival rates of cancer patients is often key to stratifying patients for prognosis and treatment. Survival prediction is often accomplished by the TNM system, which involves only three factors: tumor extent, lymph node involvement, and metastasis. Prediction from the TNM system is limited because other potential prognostic factors are not used. Given the availability of large cancer datasets, it is possible to establish powerful prediction systems using machine learning procedures and statistical methods. In this paper, we present an ensemble clustering-based approach to developing prognostic systems for cancer patients. Our method starts by grouping combinations that are formed from the levels of factors recorded in the data. The dissimilarity measure between combinations is obtained through a sequence of data partitions produced by multiple runs of the PAM (Partitioning Around Medoids) algorithm. This dissimilarity measure is then used with a hierarchical clustering method to find clusters of combinations. Survival is predicted simply by using the survival function derived from each cluster. Our approach admits multiple factors and provides a practical and useful tool for outcome prediction in cancer patients. Use of the proposed method is demonstrated for lung cancer patients.
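The ensemble step can be illustrated in a few lines. This sketch assumes the PAM partitions have already been computed (as cluster labels per factor-level combination) and shows only one plausible way to derive a dissimilarity between combinations from them; it is not the authors' implementation.

```python
# Hedged sketch: build a co-association dissimilarity between
# "combinations" from an ensemble of partitions. PAM itself is not
# reproduced here; the toy partitions below stand in for its output.
from itertools import combinations

def coassociation_dissimilarity(partitions, n_items):
    """d(i, j) = fraction of partitions that separate items i and j."""
    d = {}
    for i, j in combinations(range(n_items), 2):
        apart = sum(p[i] != p[j] for p in partitions)
        d[(i, j)] = apart / len(partitions)
    return d

# Three toy partitions of 4 combinations (one cluster label per item).
parts = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
]
d = coassociation_dissimilarity(parts, 4)
# Items 0 and 1 are usually together (small d); 0 and 2 never are.
```

A dissimilarity matrix of this shape is exactly what a hierarchical clustering routine then consumes to produce the clusters of combinations, each of which yields its own survival function.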
On finite approximations of topological algebraic systems
L. Yu. Glebsky,E. I. Gordon,C. W. Henson
Mathematics , 2003,
Abstract: We introduce and discuss a definition of approximation of a topological algebraic system $A$ by finite algebraic systems from some class $\K$. For a discrete algebraic system this definition is equivalent to the well-known definition of a local embedding of an algebraic system $A$ in a class $\K$ of algebraic systems. According to this definition, $A$ is locally embedded in $\K$ iff it is a subsystem of an ultraproduct of some systems in $\K$. We obtain a similar characterization of approximation of a locally compact system $A$ by systems in $\K$. We introduce the bounded formulas of the signature of $A$ and their approximations, similar to those introduced by C. W. Henson \cite{he} for Banach spaces. We prove that a positive bounded formula $\f$ holds in $A$ if all sufficiently precise approximations of $\f$ hold in all sufficiently precise approximations of $A$. We prove that a locally compact field cannot be approximated by finite associative rings (not necessarily commutative). Finite approximations of the field $\R$ can be considered as computer arithmetic systems for the reals. Thus, it is impossible to construct a computer arithmetic for the reals that is an associative ring.
Ergodic theorem for a Loeb space and hyperfinite approximations of dynamical systems
L. Yu. Glebsky,E. I. Gordon,C. W. Henson
Mathematics , 2011,
Abstract: Although the G. Birkhoff Ergodic Theorem (BET) is trivial for finite spaces, this does not help in proving it for hyperfinite Loeb spaces. The proof of the BET for this case, suggested by T. Kamae, in fact works for arbitrary probability spaces, as shown by Y. Katznelson and B. Weiss. In this paper we discuss why the usual approach, based on transferring simple facts about arbitrarily large finite spaces to infinite spaces via nonstandard analysis, does not work for the BET. We show that the BET for hyperfinite spaces may be interpreted as a qualitative result about very big finite spaces. We introduce the notion of a hyperfinite approximation of a dynamical system and prove the existence of such an approximation. Standard versions of the results, formulated in terms of sequences of finite dynamical systems, are also given.
Nonstandard analysis of the behavior of ergodic means of dynamical systems on very big finite probability spaces
E. I. Gordon,L. Yu. Glebsky,C. W. Henson
Mathematics , 2012,
Abstract: The trivial proof of the ergodic theorem for a finite set $Y$ and a permutation $T:Y\to Y$ shows that for an arbitrary function $f:Y\to{\mathbb R}$ the sequence of ergodic means $A_n(f,T)$ stabilizes for $n \gg |T|$. We show that if $|Y|$ is very large and $|f(y)| \ll |Y|$ for almost all $y$, then $A_n(f,T)$ stabilizes over significantly long segments of very large numbers $n$ that are nevertheless $\ll |T|$. This statement has a natural rigorous formulation in the setting of nonstandard analysis, which is in fact equivalent to the ergodic theorem for infinite probability spaces; its standard formulation in terms of sequences of finite probability spaces is complicated. We also discuss some other properties of the sequence $A_n(f,T)$ for very large finite $|Y|$ and $n$. Special consideration is given to the case in which a very big finite space $Y$ and its permutation $T$ approximate a dynamical system $(X,\nu,\tau)$, where $X$ is a compact metric space, $\nu$ is a Borel measure on $X$, and $\tau:X\to X$ is a measure-preserving transformation. The definition of approximation introduced here is, to our knowledge, new.
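The finite-space statement is easy to check numerically. The toy sketch below reads |T| as the order of the permutation (our reading of the abstract's notation) and uses a 6-cycle on Y = {0,...,5} with f the identity; for n much larger than the order, the ergodic mean settles at the orbit average of f.

```python
# Toy check of the finite-space ergodic theorem in the abstract:
# A_n(f,T)(y) = (1/n) * sum_{k<n} f(T^k y) converges to the orbit
# average of f once n is much larger than the order of T.
def ergodic_mean(f, T, y, n):
    """Average f along the first n points of the T-orbit of y."""
    total, x = 0.0, y
    for _ in range(n):
        total += f(x)
        x = T[x]
    return total / n

T = [1, 2, 3, 4, 5, 0]                 # a single 6-cycle: T(i) = i+1 mod 6
f = lambda x: float(x)                 # identity as the test function
orbit_avg = sum(map(f, range(6))) / 6  # = 2.5

a = ergodic_mean(f, T, 0, 600)         # n = 600 >> order of T = 6
```

With n a large multiple of the cycle length, the mean equals the orbit average exactly, which is the "trivial proof" the abstract alludes to.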
Centre for Audio-Visual Study and Practice in Archaeology (CASPAR)
Don Henson
Archaeology International , 2011, DOI: 10.5334/ai.1311
Coarse graining dynamical triangulations: a new scheme
Joe Henson
Physics , 2009, DOI: 10.1088/0264-9381/26/17/175019
Abstract: A new procedure for coarse-graining dynamical triangulations is presented. The procedure provides a meaning for the relevant value of observables when "probing at large scales", e.g. the average scalar curvature. The scheme may also be useful as a starting point for a new type of renormalisation procedure, suitable for dynamically triangulated quantum gravity. Random Delaunay triangulations have previously been used to produce discretisations of continuous Euclidean manifolds, and the coarse-graining scheme is an extension of this idea, using random simplicial complexes produced from a dynamical triangulation. In order for a coarse-graining process to be useful, it should preserve the properties of the original dynamical triangulation that are relevant when probing at large scales. Some general discussion of this point is given, along with some arguments in favour of the proposed scheme.
Discovering the Discrete Universe
Joe Henson
Physics , 2010,
Abstract: This paper presents a brief review of some recent work on the causal set approach to quantum gravity. Causal sets are a discretisation of spacetime that allows the symmetries of GR to be preserved in the continuum approximation. One proposed application of causal sets is to use them as the histories in a quantum sum-over-histories, i.e. to construct a quantum theory of spacetime. Many expect that quantum gravity will introduce some kind of "fuzziness", uncertainty and perhaps discreteness into spacetime, and generic effects of this fuzziness are currently being sought. Applied as a model of discrete spacetime, causal sets can be used to construct simple phenomenological models which allow us to understand some of the consequences of this general expectation.
Comparing causality principles
Joe Henson
Physics , 2004,
Abstract: The principle of common cause is discussed as a possible fundamental principle of physics. Some revisions of Reichenbach's formulation of the principle are given, which lead to a version given by Bell. Various similar forms are compared and some equivalence results proved. The further problems of causality in a quantal system, and indeterministic causal structure, are addressed, with a view to defining a causality principle applicable to quantum gravity.
Quantum Histories and Quantum Gravity
Joe Henson
Physics , 2009, DOI: 10.1088/1742-6596/174/1/012020
Abstract: This paper reviews the histories approach to quantum mechanics. This discussion is then applied to theories of quantum gravity. It is argued that some of the quantum histories must approximate (in a suitable sense) to classical histories, if the correct classical regime is to be recovered. This observation has significance for the formulation of new theories (such as quantum gravity theories) as it puts a constraint on the kinematics, if the quantum/classical correspondence principle is to be preserved. Consequences for quantum gravity, particularly for Lorentz symmetry and the idea of "emergent geometry", are discussed.

Copyright © 2008-2017 Open Access Library. All rights reserved.