oalib
All listed articles are free for downloading (OA Articles)
Cancer: looking for simplicity and finding complexity
Fabio Grizzi, Maurizio Chiriva-Internati
Cancer Cell International, 2006, DOI: 10.1186/1475-2867-6-4
Abstract: Carcinogenesis has long been thought to be a multi-step process [1]; however, it has only recently become possible to identify a large number of the molecular events underlying the initiation and progression of different human tumors [2]. After a quarter century of rapid advances, cancer research has generated an intricate body of knowledge showing that cancer is a disease that involves dynamic changes in the genome [3]. The foundations of this knowledge were mainly laid by the discovery of genomic alterations or mutations that produce oncogenes with a dominant gain of function and tumour-suppressor genes with a recessive loss of function. Both of these cancer gene classes were identified on the basis of their alterations in human and animal neoplastic cells, and their elicitation of cancer phenotypes in experimental models [4-7]. Although considerable advances have been made in terms of our molecular and cellular knowledge, very little is understood about the physics underlying human carcinogenesis. The conception of anatomical entities as an infinite hierarchy of infinitely graduated forms, together with the growing number of functional variables being discovered, has generated an increasing awareness of complexity, highlighting new and exciting properties of organized biological matter [8]. More than 100 distinct types of human cancer have been described, and subtypes of tumors can be found within specific organs. Cancer is increasingly recognized as a highly heterogeneous disease, within individual tumors as well as within and between tumour types [9]. This heterogeneity is manifested at both the genetic and phenotypic levels, and primarily determines the self-progression of neoplastic disease and its response to therapy. The discovery of this increasing complexity has led many researchers to ask a number of stimulating questions. How many distinct regulatory circuits within each type of target cell must be disrupted in order to make it cancerous? Does the s…
ARPES on HTSC: simplicity vs. complexity  [PDF]
A. A. Kordyuk, S. V. Borisenko
Physics, 2005, DOI: 10.1063/1.2199429
Abstract: A notable role in understanding the microscopic electronic properties of high-temperature superconductors (HTSC) belongs to angle-resolved photoemission spectroscopy (ARPES). This technique supplies a direct window into the reciprocal space of solids: the momentum-energy space where quasiparticles (electrons dressed in clouds of interactions) dwell. Any interaction in the electronic system, e.g. superconducting pairing, leads to a modification of the quasiparticle spectrum, that is, to a redistribution of the spectral weight over the momentum-energy space probed by ARPES. Continued development of the technique has made the picture seen through the ARPES window ever clearer and sharper, until the complexity of the electronic band structure of the cuprates was finally resolved. Now, in the doping range optimal for superconductivity, the cuprates closely resemble a normal metal with a well-predicted electronic structure, though with rather strong electron-electron interaction. This disentanglement of complex physics from complex structure has reduced the mystery of HTSC to a tangible problem: the interaction responsible for quasiparticle formation. Here we present a short overview of recent ARPES results which, we believe, point the way to resolving the HTSC puzzle.
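The quantity behind the phrase "redistribution of the spectral weight over the momentum-energy space" is, in standard treatments, the single-particle spectral function. The form below is textbook background added for orientation, not an equation quoted from this paper:

```latex
% Single-particle spectral function probed by ARPES (standard form):
% the self-energy \Sigma collects all interactions that dress the bare
% band \epsilon_k into a quasiparticle.
A(\mathbf{k},\omega) = -\frac{1}{\pi}\,
  \frac{\operatorname{Im}\Sigma(\mathbf{k},\omega)}
       {\left[\omega - \epsilon_{\mathbf{k}} - \operatorname{Re}\Sigma(\mathbf{k},\omega)\right]^{2}
        + \left[\operatorname{Im}\Sigma(\mathbf{k},\omega)\right]^{2}}
```

All interactions, including superconducting pairing, enter through the self-energy Sigma, which is why any change in the interactions reshuffles the weight of A(k, omega) seen through the ARPES window.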
Instrumentational complexity of music genres and why simplicity sells  [PDF]
Gamaliel Percino, Peter Klimek, Stefan Thurner
Computer Science, 2014, DOI: 10.1371/journal.pone.0115255
Abstract: Listening habits are strongly influenced by two opposing aspects: the desire for variety and the demand for uniformity in music. In this work we quantify these two notions in terms of the musical instrumentation and production technologies typically involved in crafting popular music. We assign a "complexity value" to each music style: a style is complex if it combines high variety with low uniformity in instrumentation. We find a strong inverse relation between the variety and uniformity of music styles that has remained remarkably stable over the last half century. Individual styles, however, show dramatic changes in their "complexity" during that period. Styles like "new wave" or "disco" quickly climbed towards higher complexity in the 70s and fell back to low complexity levels shortly afterwards, whereas styles like "folk rock" remained at constantly high complexity levels. We show that changes in the complexity of a style are related to its number of sales and to the number of artists contributing to that style. As a style attracts a growing number of artists, its instrumentational variety usually increases. At the same time the instrumentational uniformity of the style decreases; i.e., a unique stylistic and increasingly complex expression pattern emerges. In contrast, album sales of a given style typically increase with decreasing complexity. This can be interpreted as music becoming increasingly formulaic once commercial or mainstream success sets in.
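The abstract does not define the "complexity value" precisely, so the sketch below uses toy stand-ins: variety as the count of distinct instruments across a style, uniformity as the mean pairwise Jaccard similarity of album instrumentations, and a hypothetical score variety * (1 - uniformity). None of these names or formulas come from the paper:

```python
from itertools import combinations

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity of two instrument sets."""
    return len(a & b) / len(a | b) if a | b else 1.0

def style_complexity(albums: list) -> float:
    """Toy 'complexity value' for a music style, given each album's set
    of instruments. Variety = number of distinct instruments across the
    style; uniformity = mean pairwise Jaccard similarity of the albums'
    instrumentations. High variety plus low uniformity reads as complex,
    so we score variety * (1 - uniformity). Illustrative definitions
    only; the paper's exact measures are not given in the abstract."""
    variety = len(set().union(*albums))
    pairs = list(combinations(albums, 2))
    uniformity = sum(jaccard(a, b) for a, b in pairs) / len(pairs)
    return variety * (1.0 - uniformity)

# Hypothetical instrumentation data for a handful of "disco" albums.
disco = [{"drums", "bass", "strings", "synthesizer"},
         {"drums", "bass", "horns", "synthesizer"},
         {"drums", "guitar", "strings", "vocals"}]
print(style_complexity(disco))
```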
On the intelligibility of the universe and the notions of simplicity, complexity and irreducibility  [PDF]
G. J. Chaitin
Mathematics, 2002,
Abstract: We discuss views about whether the universe can be rationally comprehended, starting with Plato, then Leibniz, and then the views of some distinguished scientists of the previous century. Based on this, we defend the thesis that comprehension is compression, i.e., explaining many facts using few theoretical assumptions, and that a theory may be viewed as a computer program for calculating observations. This provides motivation for defining the complexity of something to be the size of the simplest theory for it, in other words, the size of the smallest program for calculating it. This is the central idea of algorithmic information theory (AIT), a field of theoretical computer science. Using the mathematical concept of program-size complexity, we exhibit irreducible mathematical facts, mathematical facts that cannot be demonstrated using any mathematical theory simpler than they are. It follows that the world of mathematical ideas has infinite complexity and is therefore not fully comprehensible, at least not in a static fashion. Whether the physical world has finite or infinite complexity remains to be seen. Current science believes that the world contains randomness, and is therefore also infinitely complex, but a deterministic universe that simulates randomness via pseudo-randomness is also a possibility, at least according to recent highly speculative work of S. Wolfram.
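Program-size complexity is uncomputable, but it can be bounded from above with any off-the-shelf compressor, which is enough to make the comprehension-is-compression thesis concrete. Below is a minimal Python sketch using zlib as the stand-in "theory finder"; this is an illustration of the idea, not part of AIT's formal machinery:

```python
import random
import string
import zlib

def compressed_size(s: str) -> int:
    """Upper bound on the algorithmic complexity of s, in bytes: the
    length of a zlib-compressed encoding. The true program-size
    complexity is uncomputable; a real compressor only bounds it."""
    return len(zlib.compress(s.encode(), level=9))

structured = "0123456789" * 100   # a short "theory" (program) generates it
random.seed(0)
noise = "".join(random.choice(string.digits) for _ in range(1000))

print(compressed_size(structured))  # small: comprehensible, i.e. compressible
print(compressed_size(noise))       # much larger: hardly any theory to find
```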
The Nash Equilibrium Revisited: Chaos and Complexity Hidden in Simplicity  [PDF]
Philip V. Fellman
Computer Science, 2007,
Abstract: The Nash Equilibrium is a much discussed, deceptively complex method for the analysis of non-cooperative games. In many of the commonly available definitions, the Nash Equilibrium appears deceptively simple. Modern research, however, has discovered a number of new and important complex properties of the Nash Equilibrium, some of which remain contemporary conundrums of extraordinary difficulty and complexity. Among the recently discovered features which the Nash Equilibrium exhibits under various conditions are heteroclinic Hamiltonian dynamics, a very complex asymptotic structure in the context of two-player bi-matrix games, and a number of computationally complex or computationally intractable features in other settings. This paper reviews those findings and then suggests how they may inform various market prediction strategies.
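For orientation, the simplest computational setting mentioned above is the finite two-player bi-matrix game. The brute-force check below finds pure-strategy equilibria only; it is standard textbook material, not the paper's results, which concern the far harder mixed and asymptotic cases:

```python
import numpy as np

def pure_nash_equilibria(A: np.ndarray, B: np.ndarray):
    """Enumerate pure-strategy Nash equilibria of a bi-matrix game.
    A[i, j] is the row player's payoff and B[i, j] the column player's
    for the strategy pair (i, j). A pair is an equilibrium iff neither
    player can gain by deviating unilaterally."""
    eqs = []
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            if A[i, j] >= A[:, j].max() and B[i, j] >= B[i, :].max():
                eqs.append((i, j))
    return eqs

# Prisoner's dilemma (0 = cooperate, 1 = defect): mutual defection
# (1, 1) is the unique pure equilibrium despite being mutually worse.
A = np.array([[3, 0], [5, 1]])
B = np.array([[3, 5], [0, 1]])
print(pure_nash_equilibria(A, B))  # [(1, 1)]
```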
Simplicity versus complexity in modelling groundwater recharge in Chalk catchments
R. B. Bradford, R. Ragab, S. M. Crooks, F. Bouraoui
Hydrology and Earth System Sciences (HESS) & Discussions (HESSD), 2002,
Abstract: Models of varying complexity are available to provide estimates of recharge in headwater Chalk catchments. Some measure of how estimates vary between different models can help guide the choice of model for a particular application. This paper compares recharge estimates derived from four models, employing input data at varying spatial resolutions, for a Chalk headwater catchment (River Pang, UK) over a four-year period (1992-1995) that includes a range of climatic conditions. One model was validated against river flow data to provide a measure of the models' relative performance. Each model gave similar total recharge for the crucial winter recharge period, when evaporation is low. However, the simple models produced relatively lower estimates of summer and early autumn recharge because of the way in which the processes governing recharge, especially evaporation and infiltration, are represented. The relative uniformity of land use, soil types and rainfall across headwater, drift-free Chalk catchments suggests that complex, distributed models offer limited benefits for recharge estimation at the catchment scale compared to simple models. Nonetheless, distributed models would be justified for studies where the pattern and amount of recharge need to be known in greater detail, and to provide more reliable estimates of recharge during years with low rainfall.
Keywords: Chalk, modelling, groundwater recharge
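The "simple models" in comparisons of this kind are typically single-store soil-moisture balances. The following sketch illustrates that class under invented parameter names and values; it is not one of the four models the paper compares:

```python
def simple_recharge(rain, pet, max_deficit=120.0):
    """Minimal single-bucket soil-moisture balance of the kind simple
    recharge models use (a generic illustration, not a model from the
    paper).
    rain, pet   : daily rainfall and potential evaporation (mm)
    max_deficit : largest soil-moisture deficit the store sustains (mm)
    Evaporation deepens the deficit, rain fills it, and water becomes
    recharge only once the deficit is closed, which is why models of
    this kind put almost all recharge in the low-evaporation winter."""
    deficit, recharge = 0.0, []
    for p, e in zip(rain, pet):
        deficit += e - p                   # daily water balance
        if deficit < 0.0:                  # store full: surplus percolates
            recharge.append(-deficit)
            deficit = 0.0
        else:
            recharge.append(0.0)
            deficit = min(deficit, max_deficit)
    return recharge

# Illustrative daily data (mm), not from the River Pang study.
print(simple_recharge([0, 12, 0, 25, 3], [4, 1, 5, 1, 2], max_deficit=50.0))
```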
Reality as Simplicity  [PDF]
Giulio Ruffini
Physics, 2009,
Abstract: The aim of this paper is to study the relevance of simplicity, and of its formal representation as Kolmogorov or algorithmic complexity, in the cognitive sciences. The discussion is based on two premises: 1) all human experience is generated in the brain; 2) the brain only has access to information. Taken together, these two premises lead us to conclude that all the elements of what we call 'reality' are derived mental constructs based on information and compression, i.e., algorithmic models derived from the search for simplicity in data. Naturally, these premises apply to humans in real or virtual environments as well as to robots or other cognitive systems. Based on this, it is further hypothesized that there is a hierarchy of processing levels at which simplicity and compression play a major role. As applications, I first illustrate the relevance of compression and simplicity in fundamental neuroscience with an analysis of the Mismatch Negativity paradigm. Then I discuss the applicability to Presence research, which studies how to produce real-feeling experiences in mediated interaction, and use Bayesian modeling to define, in a formal way, different aspects of the illusion of Presence. The idea is put forth that, given alternative models (interpretations) for a given mediated interaction, a brain will select the simplest one it can construct, weighted by prior models. In the final section the universality of these ideas and their applications in robotics, machine learning, biology and education is discussed. I emphasize that there is a common conceptual thread based on the idea of simplicity, which suggests a common study approach.
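The selection rule in the penultimate sentence, choosing the simplest model weighted by priors, can be phrased as Bayesian model comparison with a description-length prior. The sketch below assumes a prior P(M) proportional to 2^-L(M), an MDL-style choice consistent with the abstract's informal description but not necessarily the paper's exact formulation:

```python
def posterior(desc_bits, log2_lik):
    """Toy Bayesian model comparison with a simplicity prior.
    desc_bits: {model: description length L(M) in bits}; the prior is
    taken as P(M) ~ 2**-L(M), an MDL-style assumption.
    log2_lik: {model: log2 P(data | M)}, the fit to the sensory data."""
    unnorm = {m: 2.0 ** (log2_lik[m] - L) for m, L in desc_bits.items()}
    z = sum(unnorm.values())
    return {m: v / z for m, v in unnorm.items()}

# Two interpretations of the same mediated scene: a simple model that
# fits the sensory stream slightly worse, and a baroque one that fits
# it perfectly. (Numbers invented for illustration.)
desc_bits = {"simple": 10, "baroque": 40}
log2_lik = {"simple": -4, "baroque": 0}
print(posterior(desc_bits, log2_lik))  # the simple interpretation dominates
```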
Algorithmic Simplicity and Relevance  [PDF]
Jean-Louis Dessalles
Computer Science, 2012,
Abstract: The human mind is known to be sensitive to complexity. For instance, the visual system reconstructs hidden parts of objects following a principle of maximum simplicity. We suggest here that higher cognitive processes, such as the selection of relevant situations, are sensitive to variations of complexity. Situations are relevant to human beings when they appear simpler to describe than to generate. This definition offers a predictive (i.e. falsifiable) model for the selection of situations worth reporting (interestingness) and for what individuals consider an appropriate move in conversation.
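"Simpler to describe than to generate" is quantified in Dessalles' simplicity theory as unexpectedness: generation complexity minus description complexity. The sketch below approximates description complexity with a compressor, which is only a loose upper bound, so the numbers are illustrative; the definitions are paraphrased from the abstract, not taken from the paper's formal model:

```python
import math
import random
import zlib

def description_bits(s: str) -> float:
    """Loose upper bound on the description complexity of s, via zlib.
    True description complexity is uncomputable; a compressor only
    bounds it from above (badly so for short strings)."""
    return 8.0 * len(zlib.compress(s.encode(), level=9))

def unexpectedness(outcome: str, generation_bits: float) -> float:
    """Generation complexity minus description complexity. Positive
    values flag outcomes simpler to describe than to generate, hence
    worth reporting (a sketch under the stated approximations)."""
    return generation_bits - description_bits(outcome)

random.seed(1)
n = 400  # any specific sequence of 400 fair coin flips costs 400 bits to generate
all_heads = "H" * n
typical = "".join(random.choice("HT") for _ in range(n))
print(unexpectedness(all_heads, n))  # clearly positive: worth telling
print(unexpectedness(typical, n))    # negative here: nothing to report
```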
论世界体系的简单性与复杂性――在希格斯玻色子发现之后
On the simplicity and complexity of the world system: after the discovery of the Higgs boson  [PDF]
WANG Sitao (王思涛), ZHAO Xu (赵煦)
- , 2015, DOI: 1672-3104(2015)03-0009-06
Abstract: In the history of science, many have believed that there exists a theory that can provide the world with an ultimate explanation. The discovery of the Higgs boson has strengthened people's confidence that the Standard Model provides such an ultimate, simple account of the world. This faith in the simplicity of the world system is the very momentum that inspires scientists to press forward. Modern science, however, shows that disorder, uncertainty, chance, randomness and probability are everywhere in the real world. Beyond providing a simple account of the world, the Standard Model has not been able to drive complexity out of science. The co-existence of this contradictory pair, simplicity and complexity, will be an essential characteristic of the world.
On the simplicity of multigerms  [PDF]
Raúl Oset Sinha, Maria Aparecida Soares Ruas, Roberta Wik Atique
Mathematics, 2015,
Abstract: We prove several results regarding the simplicity of germs and multigerms obtained via the operations of augmentation, simultaneous augmentation and concatenation, and generalised concatenation. We also give some results in the case where one of the branches is a non-stable primitive germ. Using our results, we obtain a list which includes all simple multigerms from $\mathbb C^3$ to $\mathbb C^3$.