Search Results: 1 - 10 of 225089 matches for "Jochen Rütschlin"
All listed articles are free for downloading (OA Articles)
Pattern-Based Development and Management of Cloud Applications
Christoph Fehling, Frank Leymann, Jochen Rütschlin, David Schumm
Future Internet , 2012, DOI: 10.3390/fi4010110
Abstract: Cloud-based applications require a high degree of automation in their IT resource management, for example, to handle scalability or resource failures. This automation is enabled by cloud providers offering management interfaces that applications can access without human interaction. The properties of clouds, especially pay-per-use billing and the low availability of individual resources, demand such timely system management. We call the automated steps that perform one of these management tasks a “management flow”. Because the emergent behavior of the overall system comprises many such management flows and is often hard to predict, we propose defining abstract management flows that describe the common steps for handling the management tasks. These abstract management flows may then be refined for each individual use case. We cover abstract management flows describing how to make an application elastic, how to make it resilient to IT resource failures, and how to move application components between different runtime environments. The requirements these management flows impose on the managed applications are expressed as architectural patterns that the applications have to implement. These dependencies result in abstract management flows being interrelated with architectural patterns in a uniform pattern catalog. We propose a method that uses this catalog to guide application managers in refining abstract management flows during the design stage of an application. Following this method, runtime-specific management functionality and management interfaces are used to obtain automated management flows for the developed application.
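The idea of refining an abstract management flow into runtime-specific operations can be sketched as follows. This is an illustrative sketch only: the step names, the "scale-out" flow, and the provider bindings are hypothetical, not taken from the paper.

```python
# Hypothetical sketch: an abstract "scale-out" management flow whose steps
# are refined (bound to concrete operations) per runtime environment.
from typing import Callable, Dict, List

AbstractFlow = List[str]

SCALE_OUT: AbstractFlow = [
    "check_load",          # is the scaling threshold exceeded?
    "provision_resource",  # acquire a new IT resource from the provider
    "deploy_component",    # install the application component on it
    "register_endpoint",   # add the new resource to the load balancer
]

def refine_and_run(flow: AbstractFlow,
                   bindings: Dict[str, Callable[[], str]]) -> List[str]:
    """Refine an abstract management flow by binding each abstract step
    to a runtime-specific management operation, then execute the steps."""
    return [bindings[step]() for step in flow]

# Runtime-specific refinement for one (hypothetical) cloud provider:
bindings = {
    "check_load": lambda: "load=87%",
    "provision_resource": lambda: "vm-42 started",
    "deploy_component": lambda: "app deployed on vm-42",
    "register_endpoint": lambda: "vm-42 behind LB",
}

log = refine_and_run(SCALE_OUT, bindings)
```

The same abstract flow could be refined with a different `bindings` dictionary for another runtime environment, which is the separation the pattern catalog is meant to support.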
Hadronic B decays to open charm at the BaBar Experiment
Jochen R. Schieck
Physics , 2001,
Abstract: Using about 23M $B \bar B$ events collected in 1999-2000 with the BABAR detector, we report on the decays $B \to D^{(*)}\bar D^{(*)}K$ and $B^0 \to D^{*+}D^{*-}$. The branching fractions of the low background decay modes $B \to D^{(*)}\bar D^{(*)}K$ are determined to be ${\cal B}(B^0 \to D^{*-}D^{0}K^+) = (2.8 \pm 0.7 \pm 0.5)\times 10^{-3}$ and ${\cal B}(B^0 \to D^{*-}D^{*0}K^+) = (6.8 \pm 1.7 \pm 1.7)\times 10^{-3}$, where the first error quoted is statistical and the second systematic. Observation of a significant number of candidates in the color-suppressed decay mode $B^+\to D^{*+}D^{*-}K^+$ is reported with a branching fraction ${\cal B}(B^+\to D^{*+}D^{*-}K^+)= (3.4\pm 1.6\pm 0.9)\times 10^{-3}$. Decays of the type $B \to D^{(*)} \bar D^{(*)}$ can be used to provide a measurement of the parameter $\sin 2 \beta$ of the Unitarity Triangle. For this decay mode we measure a branching fraction of ${\cal B}(B^0 \to D^{*+}D^{*-}) = (8.0 \pm 1.6\pm 1.2)\times 10^{-4}$. All results presented here are preliminary.
Physiological Sensing of Carbon Dioxide/Bicarbonate/pH via Cyclic Nucleotide Signaling
Jochen Buck, Lonny R. Levin
Sensors , 2011, DOI: 10.3390/s110202112
Abstract: Carbon dioxide (CO2) is produced by living organisms as a byproduct of metabolism. In physiological systems, CO2 is unequivocally linked with bicarbonate (HCO3−) and pH via a ubiquitous family of carbonic anhydrases, and numerous biological processes depend on a mechanism for sensing the levels of CO2, HCO3−, and/or pH. The discovery that soluble adenylyl cyclase (sAC) is directly regulated by bicarbonate provided a link between CO2/HCO3−/pH chemosensing and signaling via the widely used second messenger cyclic AMP. This review summarizes the evidence that bicarbonate-regulated sAC, and additional, subsequently identified bicarbonate-regulated nucleotidyl cyclases, function as evolutionarily conserved CO2/HCO3−/pH chemosensors in a wide variety of physiological systems.
Optimal Scheduling of Peer-to-Peer File Dissemination
Jochen Mundinger, Richard R. Weber, Gideon Weiss
Mathematics , 2006,
Abstract: Peer-to-peer (P2P) overlay networks such as BitTorrent and Avalanche are increasingly used for disseminating potentially large files from a server to many end users via the Internet. The key idea is to divide the file into many equally sized parts and then let users download each part (or, for network-coding-based systems such as Avalanche, linear combinations of the parts) either from the server or from another user who has already downloaded it. However, performance evaluation of such systems has typically been limited to comparing one system with another, usually by means of simulation and measurement. In contrast, we provide an analytic performance analysis based on a new uplink-sharing version of the well-known broadcasting problem. Assuming equal upload capacities, we show that the minimal time to disseminate the file is the same as for the simultaneous send/receive version of the broadcasting problem. For general upload capacities, we provide a mixed-integer linear programming (MILP) solution and a complementary fluid-limit solution. We thus obtain a lower bound that can be used as a performance benchmark for any P2P file dissemination system. We also investigate the performance of a decentralized strategy, providing evidence that necessarily decentralized P2P file dissemination systems can perform close to this bound, and hence that the bound is useful in practice.
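The uplink-sharing idea can be illustrated with a small round-based simulation: a seed and n peers each upload at most one part per round (simultaneous send/receive), and a greedy "rarest part to the neediest peer" schedule disseminates the file. This is an illustrative sketch of the model, not the paper's MILP or its optimal schedule; the greedy rule is my own choice.

```python
def disseminate(n_peers: int, n_parts: int) -> int:
    """Round-based simulation of uplink-sharing file dissemination.
    Node 0 is the seed holding all parts; every node may send one part
    and receive one part per round. Returns the number of rounds until
    all peers hold the complete file (greedy heuristic, not optimal)."""
    have = [set(range(n_parts))] + [set() for _ in range(n_peers)]
    rounds = 0
    while any(len(have[i]) < n_parts for i in range(1, n_peers + 1)):
        rounds += 1
        receiving = set()   # each node may receive at most one part per round
        sent = []
        for s in range(n_peers + 1):
            if not have[s]:
                continue
            # send to the peer with the fewest parts that still misses one of ours
            candidates = [r for r in range(1, n_peers + 1)
                          if r != s and r not in receiving and have[s] - have[r]]
            if not candidates:
                continue
            r = min(candidates, key=lambda x: len(have[x]))
            # among the parts r is missing, forward the globally rarest one
            part = min(have[s] - have[r], key=lambda p: sum(p in h for h in have))
            sent.append((r, part))
            receiving.add(r)
        for r, part in sent:    # apply all transfers simultaneously
            have[r].add(part)
    return rounds
```

With a single peer the seed simply streams the parts one per round; with one part and several peers, dissemination doubles the number of holders each round, which matches the intuition behind the broadcasting-problem lower bound.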
Polynomial Time Data Reduction for Dominating Set
Jochen Alber, Michael R. Fellows, Rolf Niedermeier
Computer Science , 2002,
Abstract: Dealing with the NP-complete Dominating Set problem on undirected graphs, we demonstrate the power of data reduction by preprocessing from both a theoretical and a practical side. In particular, we prove that Dominating Set restricted to planar graphs has a so-called problem kernel of linear size, achieved by two simple, easy-to-implement reduction rules. Moreover, having implemented our reduction rules, first experiments indicate the impressive practical potential of these rules. Thus, this work opens up a promising new way to cope with one of the most important problems in graph theory and combinatorial optimization.
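To illustrate the flavor of data reduction for Dominating Set, here is a much simpler rule than the two in the paper (which reduce the neighborhoods of single vertices and of vertex pairs): if an undominated vertex v has exactly one neighbor u, some optimal solution contains u, since u dominates everything v would. Vertices that become dominated but are not removed remain available as dominators for whatever solver processes the residual instance. A minimal sketch:

```python
def degree_one_rule(adj):
    """Pendant-vertex preprocessing for Dominating Set.
    adj maps each vertex to its neighbor list. Returns (solution, dominated):
    vertices forced into some optimal dominating set, and the vertices
    they dominate. Illustrative only; not the paper's reduction rules."""
    solution, dominated, removed = set(), set(), set()
    changed = True
    while changed:
        changed = False
        for v in list(adj):
            if v in removed or v in dominated:
                continue
            nbrs = [u for u in adj[v] if u not in removed]
            if len(nbrs) == 0:      # isolated and undominated: must pick v itself
                solution.add(v)
                dominated.add(v)
                changed = True
            elif len(nbrs) == 1:    # pendant: its neighbor is at least as good
                u = nbrs[0]
                solution.add(u)
                dominated |= {u} | {w for w in adj[u] if w not in removed}
                removed.add(u)      # u is in the set; drop it from the graph
                changed = True
    return solution, dominated

# On the path 0-1-2-3, both pendant endpoints force their neighbors in:
sol, dom = degree_one_rule({0: [1], 1: [0, 2], 2: [1, 3], 3: [2]})
```

On the path, the rule alone already produces the optimal dominating set {1, 2}; in general it only shrinks the instance that a kernelization or exact solver then handles.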
On the intermittency exponent of the turbulent energy cascade
Jochen Cleve, Martin Greiner, Bruce R. Pearson, Katepalli R. Sreenivasan
Physics , 2004, DOI: 10.1103/PhysRevE.69.066316
Abstract: We consider the turbulent energy dissipation from one-dimensional records in experiments using air and gaseous helium at cryogenic temperatures, and obtain the intermittency exponent via the two-point correlation function of the energy dissipation. The air data are obtained in a number of flows in a wind tunnel and in the atmospheric boundary layer at a height of about 35 m above the ground. The helium data correspond to the centerline of a jet exhausting into a container. The air data on the intermittency exponent are consistent with each other and with a trend that increases with the Taylor-microscale Reynolds number, R_\lambda, up to about 1000 and saturates thereafter. The helium data, on the other hand, cluster around a constant value at nearly all R_\lambda, at about half the asymptotic value for the air data. A possible explanation is offered for this anomaly.
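For context, the intermittency exponent $\mu$ extracted here is conventionally defined through the inertial-range scaling of the two-point correlator of the energy dissipation:

```latex
\langle \varepsilon(x)\,\varepsilon(x+r) \rangle \;\sim\; r^{-\mu},
\qquad \eta \ll r \ll L ,
```

where $\eta$ is the Kolmogorov dissipation scale and $L$ the integral scale of the flow.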
Competition of Intermediaries in a Differentiated Duopoly
Sonja Brangewitz, Jochen Manegold
Theoretical Economics Letters (TEL) , 2016, DOI: 10.4236/tel.2016.66124
Abstract: For an intermediate-goods market with asymmetric production technologies as well as vertical and horizontal product differentiation, we analyze the influence of simultaneous competition for resources and customers. The intermediaries face either price or quantity competition on the output market and a monopolistic, strategically acting supplier on the input market. We find that there exist quality and productivity differences such that under quantity competition only one intermediary is willing to procure inputs from the input supplier, while under price competition both intermediaries are willing to purchase inputs. Moreover, the well-known welfare advantage of price competition no longer holds in general in our model with an endogenous input market and asymmetric intermediaries.
Case-based medical informatics
Stefan V Pantazi, José F Arocha, Jochen R Moehr
BMC Medical Informatics and Decision Making , 2004, DOI: 10.1186/1472-6947-4-19
Abstract: We outline the fundamental differences between Medical Informatics and theoretical sciences and propose that Medical Informatics research should advance individual knowledge processing (case-based reasoning), and that natural language processing research is an important step towards this goal that may have ethical implications for patient-centered medicine. We focus on fundamental aspects of decision-making, which connect human expertise with individual knowledge processing. We continue with a knowledge-spectrum perspective on biomedical knowledge and conclude that case-based reasoning is the paradigm that can advance towards personalized healthcare and enable the education of patients and providers. We center the discussion on formal methods of knowledge representation around the frame problem. We propose a context-dependent view of the notion of "meaning" and advocate the need for case-based reasoning and natural language processing research. In the context of memory-based knowledge processing, pattern recognition, comparison and analogy-making, we conclude that while humans seem to naturally support the case-based reasoning paradigm (memory of past experiences of problem-solving and powerful case-matching mechanisms), technical solutions are challenging. Finally, we discuss the major challenges for a technical solution: case-record comprehensiveness, organization of information on similarity principles, development of pattern recognition, and resolution of ethical issues. Medical Informatics is an applied science that should be committed to advancing patient-centered medicine through individual knowledge processing. Case-based reasoning is the technical solution that enables continuous individual knowledge processing, and it can be applied provided that the challenges and ethical issues that arise are addressed appropriately. Our aim is to place Medical Informatics in the context of other sciences and to bring coherence to its formal education [1].
Ring-opening metathesis polymerization-derived block copolymers bearing chelating ligands: synthesis, metal immobilization and use in hydroformylation under micellar conditions
Gajanan M. Pawar, Jochen Weckesser, Siegfried Blechert, Michael R. Buchmeiser
Beilstein Journal of Organic Chemistry , 2010, DOI: 10.3762/bjoc.6.28
Abstract: Norborn-5-ene-(N,N-dipyrid-2-yl)carbamide (M1) was copolymerized with exo,exo-[2-(3-ethoxycarbonyl-7-oxabicyclo[2.2.1]hept-5-en-2-carbonyloxy)ethyl]trimethylammonium iodide (M2) using the Schrock catalyst Mo(N-2,6-Me2-C6H3)(CHCMe2Ph)(OCMe(CF3)2)2 [Mo] to yield poly(M1-b-M2). In water, poly(M1-b-M2) forms micelles with a critical micelle-forming concentration (cmc) of 2.8 × 10−6 mol L−1. Reaction of poly(M1-b-M2) with [Rh(COD)Cl]2 (COD = cycloocta-1,5-diene) yields the Rh(I)-loaded block copolymer poly(M1-b-M2)-Rh, containing 18 mg of Rh(I) per g of block copolymer, with a cmc of 2.2 × 10−6 mol L−1. The Rh-loaded polymer was used for the hydroformylation of 1-octene under micellar conditions. The data obtained were compared to those obtained with a monomeric analogue, i.e. CH3CON(Py)2RhCl(COD) (C1, Py = 2-pyridyl). Using the polymer-supported catalyst under micellar conditions, a significant increase in selectivity, i.e. in the n:iso ratio, was accomplished, which could be further enhanced by the addition of excess ligand, e.g., triphenylphosphite. Special features of the micellar catalytic setup are discussed.
Can we evaluate a fine-grained emission model using high-resolution atmospheric transport modelling and regional fossil fuel CO2 observations?
Felix R. Vogel, Balendra Thiruchittampalam, Jochen Theloke, Roberto Kretschmer
Tellus B , 2013, DOI: 10.3402/tellusb.v65i0.18681
Abstract: Quantifying carbon dioxide emissions from fossil fuel burning (FFCO2) is a crucial task for assessing continental carbon fluxes and for tracking future changes in anthropogenic emissions. In the present study, we investigate the potentials and challenges of combining observational data with simulations using high-resolution atmospheric transport and emission modelling. These challenges concern, for example, erroneous vertical mixing and uncertainties in the disaggregation of national total emissions to higher spatial and temporal resolution. In our study, the hourly regional fossil fuel CO2 offset (ΔFFCO2) is simulated by transporting emissions from a 5 min×5 min emission model (IER2005) that provides FFCO2 emissions from different emission categories. Our Lagrangian particle dispersion model (STILT) is driven by 25 km×25 km meteorological data from the European Centre for Medium-Range Weather Forecasts (ECMWF). We evaluate this modelling framework (STILT/ECMWF+IER2005) for the year 2005 using hourly ΔFFCO2 estimates derived from 14C, CO and 222Radon (222Rn) observations at an urban site in south-western Germany (Heidelberg). Analysing the mean diurnal cycles of ΔFFCO2 for different seasons, we find that the large seasonal and diurnal variations of the emission factors used in the bottom-up emission model (spanning one order of magnitude) are adequate. Furthermore, we show that the use of 222Rn as an independent tracer helps to overcome problems in both the timing and the strength of vertical mixing in the transport model. By applying this variability correction, the model-observation agreement for simulated ΔFFCO2 is significantly improved. We find a significant overestimation of ΔFFCO2 concentrations in situations where the air masses predominantly originate from densely populated areas. This is most likely caused by the spatial disaggregation methodology for the residential emissions, which to some extent relies on a constant per-capita distribution. In the case of domestic heating emissions, this does not appear to be sufficient.
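The 14C-based ΔFFCO2 estimates mentioned above rest on a standard first-order mass balance: fossil fuel CO2 is 14C-free (Δ14C = −1000‰), so the depletion of the measured Δ14C relative to the background scales with the fossil fuel CO2 fraction. A minimal sketch of that relation (the function name and example values are illustrative, not from the paper):

```python
def ffco2_offset(co2_meas: float, d14c_meas: float, d14c_bg: float) -> float:
    """First-order mass-balance estimate of the fossil fuel CO2 offset.
    co2_meas: measured CO2 mixing ratio (ppm)
    d14c_meas, d14c_bg: measured and background Delta14C (permil)
    Fossil fuel CO2 carries Delta14C = -1000 permil (no 14C), giving
    FFCO2 = CO2_meas * (Dbg - Dmeas) / (Dbg + 1000)."""
    return co2_meas * (d14c_bg - d14c_meas) / (d14c_bg + 1000.0)

# Example with illustrative numbers: 400 ppm measured CO2, a 25 permil
# depletion against a +45 permil background implies ~9.6 ppm of FFCO2.
offset = ffco2_offset(400.0, 20.0, 45.0)
```

Note that this neglects second-order corrections (e.g. nuclear-industry 14C and heterotrophic-respiration contributions), which dedicated studies treat explicitly.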


Copyright © 2008-2017 Open Access Library. All rights reserved.