OALib
Search Results: 1 - 10 of 164546 matches for "James F Beck"
All listed articles are free for downloading (OA Articles)
Reduced Nasal Nitric Oxide Production in Cystic Fibrosis Patients with Elevated Systemic Inflammation Markers
Ruth K. Michl, Julia Hentschel, Christiane Fischer, James F. Beck, Jochen G. Mainz
PLOS ONE, 2013, DOI: 10.1371/journal.pone.0079141
Abstract: Background: Nitric oxide (NO) is produced within the respiratory tract and can be detected in exhaled bronchial and nasal air. The concentration varies in specific diseases: it is elevated in patients with asthma and bronchiectasis, but decreased in primary ciliary dyskinesia. In cystic fibrosis (CF), conflicting data exist on NO levels, which are reported, without explanation, as either decreased or normal. Functionally, NO production in the paranasal sinuses is considered a location-specific first-line defence mechanism. The aim of this study was to investigate the correlation between upper and lower airway NO levels and blood inflammatory parameters, CF-pathogen colonisation, and clinical data.

Methods and Findings: Nasal and bronchial NO concentrations from 57 CF patients were determined using an electrochemical analyser and correlated with pathogen colonisation of the upper and lower airways, which was assessed microbiologically from nasal lavage and sputum samples. Statistical analyses were performed with respect to clinical parameters (lung function, BMI), laboratory findings (CRP, leucocytes, total IgG, fibrinogen), and anti-inflammatory and antibiotic therapy. Nasal and bronchial NO levels correlated significantly (rho = 0.48, p < 0.001), but NO levels showed no correlation with specific pathogen colonisation. In patients receiving azithromycin, bronchial NO was significantly reduced and nasal NO tended to be reduced. Interestingly, nasal NO correlated inversely with CRP (rho = −0.28, p = 0.04) and with leucocytes (rho = −0.41, p = 0.003). In contrast, bronchial NO levels showed no correlation with clinical or inflammatory parameters.

Conclusion: Given that NO in the paranasal sinuses is part of the first-line defence mechanism against pathogens, our finding of reduced nasal NO in CF patients with elevated systemic inflammatory markers indicates impaired upper airway defence. This may facilitate further pathogen acquisition in the sinonasal area, with consequences for lung colonisation and the overall outcome in CF.
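The inverse correlations quoted above are Spearman rank correlations (rho). A minimal pure-Python sketch of the statistic follows; the NO and CRP values are hypothetical stand-ins, not the study's measurements:

```python
# Spearman rank correlation, pure-stdlib sketch (values are hypothetical,
# not the study's measurements).

def ranks(xs):
    """Rank values from 1..n (no tie handling needed for this example)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman_rho(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

nasal_no = [210, 340, 180, 420, 150, 390, 260, 120]       # ppb, hypothetical
crp      = [12.0, 3.1, 15.4, 1.2, 18.0, 2.5, 16.0, 21.0]  # mg/L, hypothetical

rho = spearman_rho(nasal_no, crp)
print(f"rho = {rho:.2f}")  # negative: higher CRP pairs with lower nasal NO
```

The p-values reported in the abstract would additionally require a significance test, e.g. as provided by `scipy.stats.spearmanr`.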
Comparative evaluation of the treatment efficacy of suberoylanilide hydroxamic acid (SAHA) and paclitaxel in ovarian cancer cell lines and primary ovarian cancer cells from patients
Jürgen Sonnemann, Jennifer Gänge, Sabine Pilz, Christine Stötzer, Ralf Ohlinger, Antje Belau, Gerd Lorenz, James F Beck
BMC Cancer, 2006, DOI: 10.1186/1471-2407-6-183
Abstract: We compared a prototypic histone deacetylase inhibitor, suberoylanilide hydroxamic acid (SAHA), and paclitaxel for their treatment efficacy in ovarian cancer cell lines and in primary patient-derived ovarian cancer cells. The primary cancer cells were isolated from malignant ascites collected from five patients with stage III ovarian carcinomas. Cytotoxic activities were evaluated by Alamar Blue assay and by caspase-3 activation. The ability of SAHA to kill drug-resistant 2780AD cells was also assessed.

By employing the cell lines OVCAR-3, SK-OV-3, and A2780, we established SAHA at concentrations of 1 to 20 μM to be as efficient in inducing cell death as paclitaxel at concentrations of 3 to 300 nM. Consequently, we treated the patient-derived cancer cells with these doses of the drugs. All five isolates were sensitive to SAHA, with cell killing ranging from 21% to 63% after a 72-h exposure to 20 μM SAHA, while four of them were resistant to paclitaxel (i.e., <10% cell death at 300 nM paclitaxel for 72 hours). Likewise, treatment with SAHA led to an increase in caspase-3 activity in all five isolates, whereas treatment with paclitaxel had no effect on caspase-3 activity in three of them. 2780AD cells were responsive to SAHA but resistant to paclitaxel.

These ex vivo findings raise the possibility that SAHA may prove effective in the treatment of paclitaxel-resistant ovarian cancer in vivo.

Ovarian cancer is the most lethal gynaecological neoplasm, accounting for over 6% of deaths from cancer in women [1]. The standard treatment is a combination of surgery and chemotherapy, the latter usually consisting of a taxane/platinum combination. With this regimen, initial response rates of more than 80% are achieved [2]. Unfortunately, in the vast majority of women, diagnosis occurs after the disease has already disseminated beyond the ovaries. These patients typically relapse and eventually die as the tumours become refractory to treatment. Actually, drug resistance is supposed to
Machine Learning Techniques Accurately Classify Microbial Communities by Bacterial Vaginosis Characteristics
Daniel Beck, James A. Foster
PLOS ONE, 2014, DOI: 10.1371/journal.pone.0087830
Abstract: Microbial communities are important to human health. Bacterial vaginosis (BV) is a disease associated with the vaginal microbiome. While the causes of BV are unknown, the microbial community in the vagina appears to play a role. We use three different machine-learning techniques to classify microbial communities into BV categories: genetic programming (GP), random forests (RF), and logistic regression (LR). We evaluate the classification accuracy of each of these techniques on two different datasets, and then deconstruct the classification models to identify important features of the microbial community. We found that the classification models produced by the machine-learning techniques achieved accuracies above 90% for Nugent-score BV and above 80% for Amsel-criteria BV. While the classification models identify largely different sets of important features, the shared features often agree with past research.
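Two of the three classifiers named above (RF and LR) can be sketched with scikit-learn. Everything here is an illustrative stand-in: the synthetic "abundance" matrix, the labelling rule, and the resulting accuracies have nothing to do with the study's microbiome datasets:

```python
# Sketch of the classification setup: random forests (RF) and logistic
# regression (LR) on synthetic relative-abundance features. The data are
# random stand-ins, not the study's vaginal microbiome datasets.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_taxa = 200, 30
X = rng.dirichlet(np.ones(n_taxa), size=n_samples)   # taxon abundances
# Hypothetical rule: the label depends only on the first taxon's abundance
y = (X[:, 0] > np.median(X[:, 0])).astype(int)       # e.g. BV vs. non-BV

for name, clf in [("RF", RandomForestClassifier(random_state=0)),
                  ("LR", LogisticRegression(max_iter=1000))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name} accuracy: {acc:.2f}")

# Deconstructing the model: feature importances identify the taxa that
# drive the classification (here, taxon 0 by construction)
rf = RandomForestClassifier(random_state=0).fit(X, y)
top = np.argsort(rf.feature_importances_)[::-1][:3]
print("top taxa:", top)
```

Genetic programming, the third technique, has no scikit-learn implementation and is omitted from this sketch.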
Hierarchical sparse Bayesian learning: theory and application for inferring structural damage from incomplete modal data
Yong Huang, James L. Beck
Statistics, 2015
Abstract: Structural damage due to excessive loading or environmental degradation typically occurs in localized areas in the absence of collapse. This prior information about the spatial sparseness of structural damage is exploited here by a hierarchical sparse Bayesian learning framework with the goal of reducing the source of ill-conditioning in the stiffness loss inversion problem for damage detection. Sparse Bayesian learning methodologies automatically prune away irrelevant or inactive features from a set of potential candidates, and so they are effective probabilistic tools for producing sparse explanatory subsets. We have previously proposed such an approach to establish the probability of localized stiffness reductions that serve as a proxy for damage by using noisy incomplete modal data from before and after possible damage. The core idea centers on a specific hierarchical Bayesian model that promotes spatial sparseness in the inferred stiffness reductions in a way that is consistent with the Bayesian Ockham razor. In this paper, we improve the theory of our previously proposed sparse Bayesian learning approach by eliminating an approximation and, more importantly, incorporating a constraint on stiffness increases. Our approach has many appealing features that are summarized at the end of the paper. We validate the approach by applying it to the Phase II simulated and experimental benchmark studies sponsored by the IASC-ASCE Task Group on Structural Health Monitoring. The results show that it can reliably detect, locate and assess damage by inferring substructure stiffness losses from the identified modal parameters. The occurrence of missed and false damage alerts is effectively suppressed.
Hierarchical sparse Bayesian learning for structural health monitoring with incomplete modal data
Yong Huang, James L. Beck
Statistics, 2014, DOI: 10.1615/Int.J.UncertaintyQuantification.2015011808
Abstract: For civil structures, structural damage due to severe loading events such as earthquakes, or due to long-term environmental degradation, usually occurs in localized areas of a structure. A new sparse Bayesian probabilistic framework for computing the probability of localized stiffness reductions induced by damage is presented that uses noisy incomplete modal data from before and after possible damage. This new approach employs system modal parameters of the structure as extra variables for Bayesian model updating with incomplete modal data. A specific hierarchical Bayesian model is constructed that promotes spatial sparseness in the inferred stiffness reductions in a way that is consistent with the Bayesian Ockham razor. To obtain the most plausible model of sparse stiffness reductions together with its uncertainty within a specified class of models, the method employs an optimization scheme that iterates among all uncertain parameters, including the hierarchical hyper-parameters. The approach has four important benefits: (1) it infers spatially-sparse stiffness changes based on the identified modal parameters; (2) the uncertainty in the inferred stiffness reductions is quantified; (3) no matching of model and experimental modes is needed, and (4) solving the nonlinear eigenvalue problem of a structural model is not required. The proposed method is applied to two previously-studied examples using simulated data: a ten-story shear-building and the three-dimensional braced-frame model from the Phase II Simulated Benchmark problem sponsored by the IASC-ASCE Task Group on Structural Health Monitoring. The results show that the occurrence of false-positive and false-negative damage detection is clearly reduced in the presence of modeling error. Furthermore, the identified most probable stiffness loss ratios are close to their actual values.
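The feature-pruning behaviour both abstracts rely on ("sparse Bayesian learning methodologies automatically prune away irrelevant or inactive features") can be illustrated with a generic automatic-relevance-determination (ARD) regression on a toy problem. This is a sketch of the general mechanism only, not the authors' stiffness-inversion framework:

```python
# Sparse Bayesian learning via ARD: only 3 of 20 candidate features truly
# matter, and the learner prunes the rest. A toy stand-in for the sparse
# "stiffness loss" pattern described in the abstracts.
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(1)
n, p = 100, 20
X = rng.normal(size=(n, p))
true_w = np.zeros(p)
true_w[[2, 7, 13]] = [1.5, -2.0, 1.0]          # sparse ground truth
y = X @ true_w + 0.05 * rng.normal(size=n)     # noisy observations

model = ARDRegression().fit(X, y)
active = np.flatnonzero(np.abs(model.coef_) > 0.1)
print("inferred active features:", active)     # the pruned-down subset
```

The hierarchical model behind ARD places an individual precision hyper-parameter on each coefficient; coefficients with no explanatory value are driven to zero, which is the Bayesian Ockham razor effect the abstracts invoke.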
Asymptotically Independent Markov Sampling: a new MCMC scheme for Bayesian Inference
James L. Beck, Konstantin M. Zuev
Statistics, 2011
Abstract: In Bayesian statistics, many problems can be expressed as the evaluation of the expectation of a quantity of interest with respect to the posterior distribution. The standard Monte Carlo method is often not applicable because the encountered posterior distributions cannot be sampled directly. In this case, the most popular strategies are importance sampling, Markov chain Monte Carlo, and annealing. In this paper, we introduce a new scheme for Bayesian inference, called Asymptotically Independent Markov Sampling (AIMS), which is based on the above methods. We derive important ergodic properties of AIMS. In particular, it is shown that, under certain conditions, the AIMS algorithm produces a uniformly ergodic Markov chain. The choice of the free parameters of the algorithm is discussed, and recommendations for this choice are provided on both theoretical and heuristic grounds. The efficiency of AIMS is demonstrated with three numerical examples, which include both multi-modal and higher-dimensional target posterior distributions.
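The problem setting AIMS addresses can be illustrated with the simplest method it improves upon: a random-walk Metropolis sampler estimating a posterior expectation from an unnormalized density. This is emphatically not the AIMS algorithm, only the underlying problem class:

```python
# Estimating E[theta] under a posterior known only up to a constant.
# Plain random-walk Metropolis (the baseline AIMS improves on), with a
# standard normal standing in for an intractable posterior.
import math
import random

random.seed(0)

def log_post(theta):
    # Unnormalized log-posterior; here N(0, 1) up to a constant
    return -0.5 * theta * theta

theta, samples = 0.0, []
for _ in range(20000):
    prop = theta + random.gauss(0.0, 1.0)          # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop                               # accept
    samples.append(theta)                          # (or keep old theta)

burn = samples[5000:]                              # discard burn-in
mean = sum(burn) / len(burn)
print(f"estimated E[theta] = {mean:.2f}")          # true value is 0
```

The chain's samples are correlated, which is precisely the inefficiency that motivates schemes like AIMS that aim for asymptotically independent samples.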
Rare Event Simulation
James L. Beck, Konstantin M. Zuev
Statistics, 2015
Abstract: Rare events are events that are expected to occur infrequently, or more technically, those that have low probabilities (say, order of $10^{-3}$ or less) of occurring according to a probability model. In the context of uncertainty quantification, the rare events often correspond to failure of systems designed for high reliability, meaning that the system performance fails to meet some design or operation specifications. As reviewed in this section, computation of such rare-event probabilities is challenging. Analytical solutions are usually not available for non-trivial problems and standard Monte Carlo simulation is computationally inefficient. Therefore, much research effort has focused on developing advanced stochastic simulation methods that are more efficient. In this section, we address the problem of estimating rare-event probabilities by Monte Carlo simulation, Importance Sampling and Subset Simulation for highly reliable dynamic systems.
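The inefficiency of standard Monte Carlo for rare events, noted in the abstract, is easy to demonstrate: estimating a probability of order $10^{-5}$ by direct sampling yields only a handful of "hits" even with hundreds of thousands of samples. A minimal sketch with a toy failure event:

```python
# Standard Monte Carlo on a rare event: p = P(X > 4) for X ~ N(0, 1),
# whose true value is about 3.2e-5. Direct sampling is very inefficient.
import random

random.seed(0)
N = 200_000
hits = sum(1 for _ in range(N) if random.gauss(0, 1) > 4.0)
p_hat = hits / N
print(f"hits = {hits} of {N}; p_hat = {p_hat:.1e}")
# With p ~ 3e-5, 200k samples produce only a few hits, so the estimate's
# coefficient of variation (roughly 1/sqrt(N*p)) is large. This is the
# motivation for Importance Sampling and Subset Simulation.
```

Roughly $N \approx 100/p$ samples are needed for about 10% relative accuracy, which becomes infeasible when each sample requires a full dynamic-system simulation.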
Approximate Bayesian Computation by Subset Simulation
Manuel Chiachio, James L. Beck, Juan Chiachio, Guillermo Rus
Statistics, 2014
Abstract: A new Approximate Bayesian Computation (ABC) algorithm for Bayesian updating of model parameters is proposed in this paper, which combines the ABC principles with the technique of Subset Simulation for efficient rare-event simulation, first developed in S.K. Au and J.L. Beck [1]. It has been named ABC-SubSim. The idea is to choose the nested decreasing sequence of regions in Subset Simulation as the regions that correspond to increasingly closer approximations of the actual data vector in observation space. The efficiency of the algorithm is demonstrated in two examples that illustrate some of the challenges faced in real-world applications of ABC. We show that the proposed algorithm outperforms other recent sequential ABC algorithms in terms of computational efficiency while achieving the same, or better, measure of accuracy in the posterior distribution. We also show that ABC-SubSim readily provides an estimate of the evidence (marginal likelihood) for posterior model class assessment, as a by-product.
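The ABC principle that ABC-SubSim accelerates can be shown in its simplest form, plain rejection ABC: draw parameters from the prior, simulate data, and accept draws whose simulated summary statistic lands within a tolerance of the observed one. The one-dimensional model and numbers below are hypothetical, not the paper's examples:

```python
# Rejection ABC (the brute-force baseline, not ABC-SubSim itself):
# accept prior draws whose simulated data fall within eps of the
# observed summary statistic.
import random

random.seed(0)
obs_mean = 3.0   # "observed" summary statistic (hypothetical)
eps = 0.1        # tolerance

accepted = []
while len(accepted) < 200:
    theta = random.uniform(0, 6)                               # prior draw
    sim = sum(random.gauss(theta, 1) for _ in range(30)) / 30  # simulate
    if abs(sim - obs_mean) < eps:                              # close enough?
        accepted.append(theta)

post_mean = sum(accepted) / len(accepted)
print(f"ABC posterior mean ~ {post_mean:.1f}")   # concentrates near obs_mean
```

ABC-SubSim replaces this single fixed tolerance with the nested, shrinking sequence of tolerance regions of Subset Simulation, so that most simulation effort is not wasted on rejected draws.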
Robust Bayesian compressive sensing with data loss recovery for structural health monitoring signals
Yong Huang, James L. Beck, Stephen Wu, Hui Li
Statistics, 2015
Abstract: The application of compressive sensing (CS) to structural health monitoring is an emerging research topic. The basic idea in CS is to use a specially-designed wireless sensor to sample signals that are sparse in some basis (e.g. wavelet basis) directly in a compressed form, and then to reconstruct (decompress) these signals accurately using some inversion algorithm after transmission to a central processing unit. However, most signals in structural health monitoring are only approximately sparse, i.e. only a relatively small number of the signal coefficients in some basis are significant, but the other coefficients are usually not exactly zero. In this case, perfect reconstruction from compressed measurements is not expected. A new Bayesian CS algorithm is proposed in which robust treatment of the uncertain parameters is explored, including integration over the prediction-error precision parameter to remove it as a "nuisance" parameter. The performance of the new CS algorithm is investigated using compressed data from accelerometers installed on a space-frame structure and on a cable-stayed bridge. Compared with other state-of-the-art CS methods including our previously-published Bayesian method which uses MAP (maximum a posteriori) estimation of the prediction-error precision parameter, the new algorithm shows superior performance in reconstruction robustness and posterior uncertainty quantification. Furthermore, our method can be utilized for recovery of lost data during wireless transmission, regardless of the level of sparseness in the signal.
Robust Bayesian compressive sensing for signals in structural health monitoring
Yong Huang, James L. Beck, Stephen Wu, Hui Li
Statistics, 2014, DOI: 10.1111/mice.12051
Abstract: In structural health monitoring (SHM) systems, massive amounts of data are often generated that require data compression techniques to reduce the cost of signal transfer and storage. Compressive sensing (CS) is a novel data acquisition method whereby the compression is done in a sensor simultaneously with the sampling. If the original sensed signal is sufficiently sparse in terms of some basis, the decompression can be done essentially perfectly up to some critical compression ratio. In this article, a Bayesian compressive sensing (BCS) method is investigated that uses sparse Bayesian learning to reconstruct signals from a compressive sensor. By explicitly quantifying the uncertainty in the reconstructed signal, the BCS technique exhibits an obvious benefit over existing regularized norm-minimization CS methods, which provide only a single signal estimate. However, current BCS algorithms suffer from a robustness problem: the reconstruction errors can be very large when the number of measurements is much smaller than the number of signal degrees of freedom needed to capture the signal accurately in directly sampled form. In this paper, we present improvements to the BCS reconstruction method that enhance its robustness, so that even higher compression ratios can be used, and we examine the tradeoff between efficiently compressing data and accurately decompressing it. Synthetic data and actual acceleration data collected from a bridge SHM system are used as examples. Compared with state-of-the-art BCS reconstruction algorithms, the improved algorithm demonstrates superior performance: for the same reconstruction error, it works with relatively large compression ratios, and it can achieve perfect lossless compression at quite high compression ratios. Furthermore, the error bars for the signal reconstruction are also quantified effectively.
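The CS round trip described in both abstracts (compress by random projection in the sensor, then reconstruct via sparsity) can be sketched generically. The reconstruction below uses Orthogonal Matching Pursuit rather than the authors' Bayesian algorithm, and the exactly-sparse signal is the idealized case; real SHM signals are only approximately sparse, which is where the robustness problems discussed above arise:

```python
# Generic compressive-sensing round trip (not the papers' BCS algorithm):
# sample a k-sparse signal with m << n random projections, then
# reconstruct it with Orthogonal Matching Pursuit.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 256, 100, 8                # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)  # k-sparse signal

Phi = rng.normal(size=(m, n)) / np.sqrt(m)   # sensor-side random projection
y = Phi @ x                                  # compressed measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(Phi, y)
x_hat = omp.coef_
err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
print(f"relative reconstruction error: {err:.2e}")   # near-exact recovery
```

In the exactly-sparse, noiseless setting the recovery is essentially perfect; with only approximate sparsity or missing packets, point-estimate methods like OMP degrade, which is the gap the Bayesian treatment with quantified uncertainty is designed to address.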
Copyright © 2008-2017 Open Access Library. All rights reserved.