Search Results: 1 - 10 of 219622 matches for "C. Hagemann"
All listed articles are free for downloading (OA Articles)
Climate model bias correction and the role of timescales
J. O. Haerter, S. Hagemann, C. Moseley, C. Piani
Hydrology and Earth System Sciences (HESS) & Discussions (HESSD), 2011
Abstract: It is well known that output from climate models cannot be used to force hydrological simulations without some form of preprocessing to remove the existing biases. In principle, statistical bias correction methodologies act on model output so that the statistical properties of the corrected data match those of the observations. However, the improvements to the statistical properties of the data are limited to the specific timescale of the fluctuations that are considered. For example, a statistical bias correction methodology for mean daily temperature values might be detrimental to monthly statistics. Also, in applying bias corrections derived from present-day to scenario simulations, an assumption is made about the stationarity of the bias over the largest timescales. First, we point out several conditions that have to be fulfilled by model data for the application of a statistical bias correction to be meaningful. We then examine the effects of mixing fluctuations on different timescales and suggest an alternative statistical methodology, referred to here as a cascade bias correction method, that eliminates, or greatly reduces, the negative effects.
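
The abstract refers to statistical bias correction only in the abstract; as a hedged illustration, here is a minimal NumPy sketch of empirical quantile mapping, a standard baseline for this kind of correction (the function name and synthetic data are illustrative, not taken from the paper):

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_scen):
    """Empirical quantile mapping: rank each scenario value within the
    historical model distribution, then read off the observed value at
    the same rank."""
    sorted_hist = np.sort(model_hist)
    ranks = np.searchsorted(sorted_hist, model_scen) / len(sorted_hist)
    return np.quantile(obs_hist, np.clip(ranks, 0.0, 1.0))

# Synthetic demonstration with a deliberately biased "model" series
rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 2.0, size=5000)                  # "observed" daily values
model = 1.3 * rng.gamma(2.0, 2.0, size=5000) + 0.5   # biased "model" values
corrected = quantile_map(model, obs, model)           # distribution now matches obs
```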
Climate model bias correction and the role of timescales
J. O. Haerter, S. Hagemann, C. Moseley, C. Piani
Hydrology and Earth System Sciences Discussions, 2010, DOI: 10.5194/hessd-7-7863-2010
Abstract: It is well known that output from climate models cannot be used to force hydrological simulations without some form of preprocessing to remove the existing biases. In principle, statistical bias correction methodologies act on model output so that the statistical properties of the corrected data match those of the observations. However, the improvements to the statistical properties of the data are limited to the specific timescale of the fluctuations that are considered. For example, a statistical bias correction methodology for mean daily values might be detrimental to monthly statistics. Also, in applying bias corrections derived from present-day to scenario simulations, an assumption is made about the persistence of the bias over the largest timescales. We examine the effects of mixing fluctuations on different timescales and suggest an improved statistical methodology, referred to here as a cascade bias correction method, that eliminates, or greatly reduces, the negative effects.
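
The cascade idea is to correct fluctuations on each timescale separately rather than mixing them. A toy sketch of that separation, using a simple mean/variance rescaling per timescale (this is a simplified stand-in, not the paper's exact correction procedure):

```python
import numpy as np

def cascade_correct(model_daily, obs_daily, days_per_block=30):
    """Toy cascade: split a daily series into block (monthly) means and
    daily anomalies, rescale each component toward observed statistics,
    and recombine. Series length must be a multiple of days_per_block."""
    def split(x):
        means = x.reshape(-1, days_per_block).mean(axis=1)
        anomalies = x - np.repeat(means, days_per_block)
        return means, anomalies

    m_mean, m_anom = split(model_daily)
    o_mean, o_anom = split(obs_daily)

    # Rescale monthly means and daily anomalies separately, so that both
    # timescales end up with the observed mean and variance.
    c_mean = (m_mean - m_mean.mean()) * (o_mean.std() / m_mean.std()) + o_mean.mean()
    c_anom = m_anom * (o_anom.std() / m_anom.std())
    return np.repeat(c_mean, days_per_block) + c_anom
```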
Quantifying different sources of uncertainty in hydrological projections in an Alpine watershed
C. Dobler, S. Hagemann, R. L. Wilby, J. Stötter
Hydrology and Earth System Sciences (HESS) & Discussions (HESSD), 2012
Abstract: Many studies have investigated potential climate change impacts on regional hydrology; less attention has been given to the components of uncertainty that affect these scenarios. This study quantifies uncertainties resulting from (i) General Circulation Models (GCMs), (ii) Regional Climate Models (RCMs), (iii) bias correction of RCMs, and (iv) hydrological model parameterization, using a multi-model framework consisting of three GCMs, three RCMs, three bias-correction techniques, and sets of hydrological model parameters. The study is performed for the Lech watershed (~1000 km²), located in the Northern Limestone Alps, Austria. Bias-corrected climate data are used to drive the hydrological model HQsim to simulate runoff under present (1971–2000) and future (2070–2099) climate conditions. Hydrological model parameter uncertainty is assessed by Monte Carlo sampling. The model chain is found to perform well under present climate conditions. However, hydrological projections are associated with high uncertainty, mainly due to the choice of GCM and RCM. Uncertainty due to bias correction is found to have the greatest influence on projections of extreme river flows, and the choice of method(s) is an important consideration in snowmelt systems. Overall, hydrological model parameterization is least important. The study also demonstrates how an improved understanding of the physical processes governing future river flows can help focus attention on the scientifically tractable elements of the uncertainty.
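
The Monte Carlo step for parameter uncertainty can be sketched as follows, with a hypothetical one-bucket runoff model standing in for HQsim (function names, parameters, and ranges are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def bucket_runoff(precip, k, s_max):
    """Hypothetical one-bucket model: storage capped at s_max drains at
    rate k; precipitation above capacity is simply ignored in this toy."""
    storage, runoff = 0.0, []
    for p in precip:
        storage = min(storage + p, s_max)
        q = k * storage                    # linear-reservoir outflow
        storage -= q
        runoff.append(q)
    return np.array(runoff)

precip = rng.gamma(2.0, 3.0, size=365)     # one year of synthetic daily precipitation
# Monte Carlo over plausible (illustrative) parameter ranges
sims = np.stack([bucket_runoff(precip, rng.uniform(0.1, 0.5), rng.uniform(50, 150))
                 for _ in range(200)])
lo, med, hi = np.percentile(sims, [5, 50, 95], axis=0)  # parameter-uncertainty band
```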
Quantifying different sources of uncertainty in hydrological projections at the catchment scale
C. Dobler, S. Hagemann, R. L. Wilby, J. Stötter
Hydrology and Earth System Sciences Discussions, 2012, DOI: 10.5194/hessd-9-8173-2012
Abstract: Many studies have investigated potential climate change impacts on regional hydrology; less attention has been given to the components of uncertainty that affect these scenarios. This study quantifies uncertainties resulting from (i) General Circulation Models (GCMs), (ii) Regional Climate Models (RCMs), (iii) bias correction of RCMs, and (iv) hydrological model parameterization, using a multi-model framework consisting of three GCMs, three RCMs, three bias-correction techniques, and sets of hydrological model parameters. The study is performed for the Lech watershed (~1000 km²), located in the Northern Limestone Alps, Austria. Bias-corrected climate data are used to drive the hydrological model HQsim to simulate runoff under present (1971–2000) and future (2070–2099) climate conditions. Hydrological model parameter uncertainty is assessed by Monte Carlo sampling. The model chain is found to perform well under present climate conditions. However, hydrological projections are associated with large uncertainty, mainly due to the choice of GCM and RCM. Uncertainty due to bias correction is found to have the greatest influence on projections of extreme river flows, and the choice of method(s) is an important consideration in snowmelt systems. Overall, hydrological model parameterization is least important. The study also demonstrates how an improved understanding of the physical processes governing future river flows can help focus attention on the scientifically tractable elements of the uncertainty.
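
Ranking the chain components (GCM vs. RCM vs. bias correction) by their contribution to the spread can be sketched as a generic main-effects variance decomposition over the 3 × 3 × 3 combinations; this is one common approach, not necessarily the authors' exact method, and the numbers below are synthetic:

```python
import numpy as np

# Hypothetical projected changes (%) for a 3 x 3 x 3 model chain:
# axis 0 = GCM choice, axis 1 = RCM choice, axis 2 = bias-correction method
rng = np.random.default_rng(1)
delta = rng.normal(-5.0, 4.0, size=(3, 3, 3))

total_var = delta.var()
# The variance of each factor's means approximates its main-effect share
shares = {
    "GCM": delta.mean(axis=(1, 2)).var() / total_var,
    "RCM": delta.mean(axis=(0, 2)).var() / total_var,
    "bias correction": delta.mean(axis=(0, 1)).var() / total_var,
}
```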
Climate change impact on available water resources obtained using multiple global climate and hydrology models
S. Hagemann, C. Chen, D. B. Clark, S. Folwell
Earth System Dynamics Discussions, 2012, DOI: 10.5194/esdd-3-1321-2012
Abstract: Climate change is expected to alter the hydrological cycle, resulting in large-scale impacts on water availability. However, future climate change impact assessments are highly uncertain. For the first time, multiple global climate models (three) and hydrological models (eight) were used to systematically assess the hydrological response to climate change and project the future state of global water resources. The results show a large spread in projected changes in water resources within the climate–hydrology modelling chain for some regions, clearly demonstrating that climate models are not the only source of uncertainty for hydrological change. But there are also areas showing a robust change signal, such as at high latitudes and in some mid-latitude regions, where the models agree on the sign of projected hydrological changes, indicative of higher confidence. In many catchments an increase in available water resources is expected, but there are some severe decreases in central and southern Europe, the Middle East, the Mississippi River basin, southern Africa, southern China and south-eastern Australia.
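
The "robust change signal" criterion — ensemble members agreeing on the sign of the projected change — can be sketched as below; the 80% agreement threshold is an assumption for illustration, not a value taken from the paper:

```python
import numpy as np

def robust_change_mask(changes, agreement=0.8):
    """Flag grid cells where at least `agreement` of ensemble members agree
    on the sign of the projected change. `changes`: (n_members, n_cells)."""
    frac_increase = (changes > 0).mean(axis=0)   # fraction projecting an increase
    return (frac_increase >= agreement) | (frac_increase <= 1.0 - agreement)
```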
Parallel Statistical Multi-resolution Estimation
Jan Lebert, Lutz Künneke, Johannes Hagemann, Stephan C. Kramer
Physics, 2015
Abstract: We discuss several strategies to implement Dykstra's projection algorithm on NVIDIA's compute unified device architecture (CUDA). Dykstra's algorithm is the central step in, and the computationally most expensive part of, statistical multi-resolution methods. It projects a given vector onto the intersection of convex sets. Compared with a CPU implementation, our CUDA implementation is one order of magnitude faster. For a further speed-up, and to reduce memory consumption, we have developed a new variant, which we call the incomplete Dykstra's algorithm. Implemented in CUDA, it is one order of magnitude faster than the CUDA implementation of the standard Dykstra algorithm. As a sample application, we discuss using the incomplete Dykstra's algorithm as a preprocessor for the recently developed super-resolution optical fluctuation imaging (SOFI) method (Dertinger et al. 2009). We show that statistical multi-resolution estimation can enhance the resolution improvement of the plain SOFI algorithm just as the Fourier reweighting of SOFI does. The results are compared in terms of their power spectrum and their Fourier ring correlation (Saxton and Baumeister 1982). The Fourier ring correlation indicates that the resolution for typical second-order SOFI images can be improved by about 30 per cent. Our results show that a careful parallelization of Dykstra's algorithm enables its use in large-scale statistical multi-resolution analyses.
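
For reference, the plain (complete) Dykstra's algorithm that the paper parallelizes and then truncates looks like this in serial NumPy form; the incomplete variant and the CUDA kernels are not reproduced here:

```python
import numpy as np

def dykstra(x0, projections, n_iter=100):
    """Dykstra's algorithm: project x0 onto the intersection of convex sets,
    each given by its projection operator. The per-set correction terms are
    what distinguish it from plain alternating projections and make it
    converge to the *nearest* point of the intersection."""
    x = x0.astype(float).copy()
    corrections = [np.zeros_like(x) for _ in projections]
    for _ in range(n_iter):
        for i, proj in enumerate(projections):
            y = proj(x + corrections[i])
            corrections[i] = x + corrections[i] - y
            x = y
    return x

# Example: intersection of the unit ball with the nonnegative orthant
proj_ball = lambda v: v / max(1.0, np.linalg.norm(v))
proj_orthant = lambda v: np.maximum(v, 0.0)
print(dykstra(np.array([2.0, -1.0]), [proj_ball, proj_orthant]))  # -> ~[1, 0]
```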
Real-Time Phase Masks for Interactive Stimulation of Optogenetic Neurons
Stephan C. Kramer, Johannes Hagemann, D. Russell Luke
Mathematics, 2013
Abstract: Experiments with networks of optogenetically altered neurons require stimulation with high spatio-temporal selectivity. Computer-assisted holography is an energy-efficient method for robust and reliable addressing of single neurons on the millisecond timescale inherent in biological information processing. We show that real-time control of neurons can be achieved by CUDA-based hologram computation.
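
The abstract does not name the hologram algorithm; Gerchberg–Saxton iteration is a common choice for computing phase-only masks, so a small sketch of it is given here purely as illustration of what such a computation involves (not as the paper's method):

```python
import numpy as np

def gerchberg_saxton(target_intensity, n_iter=50):
    """Compute a phase-only mask whose far-field intensity approximates
    `target_intensity` (e.g. bright spots on the target neurons), assuming
    uniform illumination of the spatial light modulator (SLM)."""
    amp_target = np.sqrt(target_intensity)
    phase = np.random.default_rng(0).uniform(0.0, 2.0 * np.pi, target_intensity.shape)
    for _ in range(n_iter):
        far = np.fft.fft2(np.exp(1j * phase))          # propagate SLM field to image plane
        far = amp_target * np.exp(1j * np.angle(far))  # impose the target amplitude
        near = np.fft.ifft2(far)                       # propagate back to the SLM plane
        phase = np.angle(near)                         # keep phase only (unit amplitude)
    return phase
```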
German-speaking economists in British exile 1933-1945
Harald Hagemann
PSL Quarterly Review, 2007
Abstract:
Cluster-Robust Bootstrap Inference in Quantile Regression Models
Andreas Hagemann
Statistics, 2014
Abstract: In this paper I develop a wild bootstrap procedure for cluster-robust inference in linear quantile regression models. I show that the bootstrap leads to asymptotically valid inference on the entire quantile regression process in a setting with a large number of small, heterogeneous clusters and provides consistent estimates of the asymptotic covariance function of that process. The proposed bootstrap procedure is easy to implement and performs well even when the number of clusters is much smaller than the sample size. An application to Project STAR data is provided.
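
A simplified sketch of the idea — refitting the quantile regression on wild-bootstrap samples built with cluster-level sign flips — using statsmodels' QuantReg; the Rademacher weighting of absolute residuals below is a rough stand-in for, not a reproduction of, the paper's exact procedure:

```python
import numpy as np
import statsmodels.api as sm

def wild_cluster_bootstrap_qr(y, X, clusters, tau=0.5, B=200, seed=0):
    """Refit the quantile regression on B wild-bootstrap samples formed by
    flipping absolute residuals with one Rademacher weight per cluster.
    Returns the point estimate and bootstrap standard errors."""
    rng = np.random.default_rng(seed)
    fit = sm.QuantReg(y, X).fit(q=tau)
    resid = y - X @ fit.params
    ids = np.unique(clusters)                          # sorted cluster labels
    boot = np.empty((B, X.shape[1]))
    for b in range(B):
        w = rng.choice([-1.0, 1.0], size=len(ids))[np.searchsorted(ids, clusters)]
        y_star = X @ fit.params + w * np.abs(resid)    # perturbed outcomes
        boot[b] = sm.QuantReg(y_star, X).fit(q=tau).params
    return fit.params, boot.std(axis=0)
```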
Robust Spectral Analysis
Andreas Hagemann
Statistics, 2011
Abstract: In this paper I introduce quantile spectral densities that summarize the cyclical behavior of time series across their whole distribution by analyzing periodicities in quantile crossings. This approach can capture systematic changes in the impact of cycles on the distribution of a time series and allows robust spectral estimation and inference in situations where the dependence structure is not accurately captured by the auto-covariance function. I study the statistical properties of quantile spectral estimators in a large class of nonlinear time series models and discuss inference both at fixed and across all frequencies. Monte Carlo experiments illustrate the advantages of quantile spectral analysis over classical methods when standard assumptions are violated.
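
The core object — periodicities in quantile crossings — can be sketched as the periodogram of a quantile-crossing indicator series; this is a simplified illustration of the idea, not the paper's estimator or its inference procedure:

```python
import numpy as np

def quantile_periodogram(x, tau=0.5):
    """Periodogram of the quantile-crossing indicator: captures cyclical
    behaviour at quantile tau rather than in the mean of the series."""
    indicator = (x <= np.quantile(x, tau)).astype(float)
    centered = indicator - indicator.mean()
    spectrum = np.abs(np.fft.rfft(centered)) ** 2 / len(x)
    return np.fft.rfftfreq(len(x)), spectrum
```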