Search Results: 1 - 10 of 34666 matches for " Daniel Woo "
All listed articles are free for downloading (OA Articles)
A Note on Preconditioning by Low-Stretch Spanning Trees
Daniel A Spielman,Jaeoh Woo
Computer Science , 2009,
Abstract: Boman and Hendrickson observed that one can solve linear systems in Laplacian matrices in time $O(m^{3/2 + o(1)} \ln (1/\epsilon))$ by preconditioning with the Laplacian of a low-stretch spanning tree. By examining the distribution of eigenvalues of the preconditioned linear system, we prove that the preconditioned conjugate gradient will actually solve the linear system in time $\tilde{O}(m^{4/3} \ln (1/\epsilon))$.
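The preconditioned conjugate gradient method analyzed above can be sketched in a few lines. The following is an illustrative implementation on a tiny grounded graph Laplacian, using a simple Jacobi preconditioner as a stand-in for the low-stretch spanning-tree preconditioner studied in the paper; the matrix and right-hand side are hypothetical.

```python
import numpy as np

def pcg(A, b, M_solve, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradient: solve A x = b for SPD A.
    M_solve(r) applies the inverse of the preconditioner M to a residual r."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_solve(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_solve(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Laplacian of a 4-cycle with a chord; dropping the last row/column
# ("grounding") makes it symmetric positive definite.
L = np.array([[ 3., -1., -1., -1.],
              [-1.,  2., -1.,  0.],
              [-1., -1.,  3., -1.],
              [-1.,  0., -1.,  2.]])
A = L[:3, :3]
b = np.array([1.0, 0.0, -1.0])
M_solve = lambda r: r / np.diag(A)   # Jacobi preconditioner as a stand-in
x = pcg(A, b, M_solve)
```

In the paper's setting, `M_solve` would instead solve against the Laplacian of a low-stretch spanning tree, which is what yields the improved eigenvalue distribution and running time.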
Reliability Design of Ice-Maker System Subjected to Repetitive Loading  [PDF]
Seong-Woo Woo
Engineering (ENG) , 2016, DOI: 10.4236/eng.2016.89056
Abstract: Parametric accelerated life testing (ALT) was used to improve the reliability of an ice-maker system with a fractured helix upper dispenser in the field. Using bond graphs and state equations, a variety of mechanical loads in the assembly were analyzed. The acceleration factor was derived from a generalized life-stress failure model with a new load concept. To reproduce the failure modes and mechanisms causing the fracture, a new sample size equation was derived. The sample size equation with the acceleration factor also enabled the parametric accelerated life testing to quickly reproduce the early field failures. Consequently, the failure modes and mechanisms found were identical to those of the failed samples. This test design should help an engineer uncover the design parameters affecting the reliability of the fractured helix upper dispenser in the field. By eliminating the design flaws, gaps and weld lines, the B1 life of the redesigned helix upper dispenser is now guaranteed to be over 10 years with a yearly failure rate of 0.1%, which meets the reliability quantitative (RQ) test specifications.
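As a rough illustration of the acceleration-factor idea behind parametric ALT (not the paper's actual life-stress model or parameters), an inverse power law, one common form of generalized life-stress relation, compresses test time like this; all values below are hypothetical:

```python
# Illustrative acceleration factor for an inverse-power-law life-stress model.
# The exponent and stress levels are made up, not taken from the paper.
def acceleration_factor(stress_accel, stress_use, exponent):
    """AF = (S_accel / S_use) ** n: life shrinks as stress raised to the power n."""
    return (stress_accel / stress_use) ** exponent

AF = acceleration_factor(stress_accel=2.0, stress_use=1.0, exponent=2.0)
required_test_hours = 10 * 365 * 24 / AF  # test hours to mimic a 10-year field life
```

Testing at twice the field stress with exponent 2 gives AF = 4, so a 10-year field life can in principle be demonstrated in a quarter of the time.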
Quality assessment of buccal versus blood genomic DNA using the Affymetrix 500 K GeneChip
Jessica G Woo, Guangyun Sun, Mary Haverbusch, Subbarao Indugula, Lisa J Martin, Joseph P Broderick, Ranjan Deka, Daniel Woo
BMC Genetics , 2007, DOI: 10.1186/1471-2156-8-79
Abstract: Buccal cytobrushes stored for ~7 years at -80°C prior to extraction yielded sufficient double-stranded DNA (dsDNA) to be successfully genotyped on the Affymetrix ~262 K NspI chip, with yields between 536 and 1047 ng dsDNA. Using the BRLMM algorithm, genotyping call rates averaged 98.4% for blood samples and 97.8% for buccal samples. Matched blood samples exhibited 99.2% concordance, while matched blood and buccal samples exhibited 98.8% concordance. Buccal cytobrushes stored long-term thus yield sufficient dsDNA to achieve high genotyping call rates and concordance with stored blood samples in the context of Affymetrix 500 K SNP genotyping. Given high-quality collection and storage protocols, it is therefore possible to use stored buccal cytobrush samples for genome-wide association studies.

While blood is considered the optimal source for DNA, inclusion of a blood draw may deter study participation [1]. Buccal cytobrush collection is a simple, painless procedure that allows for effective DNA sampling from a large population, and has been used in several large epidemiologic studies [2,3]. However, concerns regarding the use of buccal brushes have included the lower quantity of genomic DNA isolated [4], the lower quality of DNA [4,5], and the fidelity of results from buccal brushes compared with blood samples [5-7]. In addition, there is concern that older buccal brush samples may not yield results of as high quality as fresh samples [8].

The advent of large-scale genotyping platforms has also reduced the amount of DNA required. The Affymetrix 500 K GeneChip requires only 250 ng of total genomic DNA per chip (500 ng total), and this DNA quantity has not changed with the recent release of the Affymetrix 5.0 and 6.0 chips, which enable genotyping of up to 1.8 million genetic markers [9-11]. Thus, the DNA requirements of the Affymetrix chips are well below the expected yield of total DNA for buccal samples.
As the Affymetrix system uses restri
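The call-rate and concordance figures reported above are straightforward to compute; a toy sketch with entirely hypothetical genotype calls (where "NC" marks a no-call) looks like this:

```python
# Toy genotype concordance check between matched blood and buccal calls.
# The genotype strings are hypothetical illustration data.
blood  = ["AA", "AG", "GG", "AA", "NC", "AG", "GG", "AA"]
buccal = ["AA", "AG", "GG", "AG", "NC", "AG", "NC", "AA"]

# Call rate: fraction of markers with a successful call.
call_rate_buccal = sum(g != "NC" for g in buccal) / len(buccal)

# Concordance: among markers called in both samples, fraction that agree.
called = [(b, c) for b, c in zip(blood, buccal) if b != "NC" and c != "NC"]
concordance = sum(b == c for b, c in called) / len(called)
```

Real pipelines apply the same two ratios across the ~262 K markers per chip, typically after per-sample quality filtering.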
Effect of On-Line Hemodiafiltration on Dry Weight Adjustment in Intradialytic Hypotension-Prone Patients: Comparative Study of Conventional Hemodialysis and On-Line Hemodiafiltration  [PDF]
Sun Woo Kang
Open Journal of Nephrology (OJNeph) , 2014, DOI: 10.4236/ojneph.2014.41001

Abstract: Introduction: Correct adjustment of dry weight after hemodialysis (HD), with no signs of hypervolemia, is important. Intradialytic hypotension (IDH) is the most common complication during HD, occurring in 15% to 30% and possibly in up to 50% of dialysis sessions. IDH increases mortality, essentially due to chronic overhydration and the inability to reach the proper dry weight. On-line hemodiafiltration (ol-HDF) has been reported to reduce the frequency of IDH. The aim of this study was to assess the effect of ol-HDF on hemodynamic stability and dry weight adjustment compared with low-flux HD. Methods: IDH-prone HD patients at our center were enrolled. This study was designed as a crossover trial with two treatment arms (ol-HDF vs. low-flux HD) and two phases, each lasting 8 weeks (A arm: low-flux HD for 8 weeks followed by ol-HDF for 8 weeks; B arm: ol-HDF for 8 weeks followed by low-flux HD for 8 weeks). We measured the proportion of body water using a body composition monitor (BCM). Results: In a comparison of the systolic blood pressure (SBP) and diastolic blood pressure (DBP) reductions from the baseline blood pressure between the HD and ol-HDF groups, statistically significant differences were observed only in the SBP of the B arm (SBP: HD vs. HDF, -9.83 ± 6.64 vs. -4.62 ± 1.61 mmHg, p = 0.036; DBP: HD vs. HDF, -3.29 ± 4.05 vs. -1.86 ± 1.49 mmHg, p = 0.261). Neither the mean interdialytic body weight gain nor the frequency of IDH differed between the A and B arms (p = 0.817 and p = 0.562, respectively). In terms of dialysis modality, there were no significant differences in the amount of overhydration between the conventional

A Computational Investigation of the Catalytic Properties of Graphene Oxide: Exploring Mechanisms Using DFT Methods
Danil W. Boukhvalov,Daniel R. Dreyer,Christopher W. Bielawski,Young-Woo Son
Physics , 2012, DOI: 10.1002/cctc.201200210
Abstract: Here we describe a computational study undertaken in an effort to elucidate the reaction mechanisms behind the experimentally observed oxidations and hydrations catalyzed by graphene oxide (GO). Using the oxidation of benzyl alcohol to benzaldehyde as a model reaction, density functional theory (DFT) calculations revealed that this reactivity stemmed from the transfer of hydrogen atoms from the organic molecule to the GO surface. In particular, neighbouring epoxide groups decorating GO's basal plane were ring-opened, resulting in the formation of diols, followed by dehydration. Consistent with the experimentally-observed dependence of this chemistry on molecular oxygen, our calculations revealed that the partially reduced catalyst was able to be recharged by molecular oxygen, allowing for catalyst turnover. Functional group-free carbon materials, such as graphite, were calculated to have substantially higher reaction barriers, indicating that the high chemical potential and rich functionality of GO are necessary for the observed reactivity.
Investigating bounds on decoherence in quantum mechanics via B and D-mixing
Alexander Lenz,David Hodges,Daniel Hulme,Sandra Kvedaraite,Jack Richings,Jian Shen Woo,Philip Waite
Physics , 2014, DOI: 10.1016/j.nuclphysb.2014.09.007
Abstract: We investigate bounds on decoherence in quantum mechanics by studying $B$ and $D$-mixing observables, making use of many precise new measurements, particularly from the LHC and B factories. In that respect we show that the stringent bounds obtained by a different group in 2013 rely on unjustified assumptions. Finally, we point out which experimental measurements could improve the decoherence bounds considerably.
Predicting Mortality and Functional Outcomes after Ischemic Stroke: External Validation of a Prognostic Model  [PDF]
Achala Vagal, Heidi Sucharew, Christopher Lindsell, Dawn Kleindorfer, Kathleen Alwell, Charles J. Moomaw, Daniel Woo, Matthew Flaherty, Pooja Khatri, Opeolu Adeoye, Simona Ferioli, Jason Mackey, Sharyl Martini, Felipe De Los Rios La Rosa, Brett Kissela
Journal of Behavioral and Brain Science (JBBS) , 2018, DOI: 10.4236/jbbs.2018.810036
Abstract: Background: We previously developed predictive models for 3-month mortality and modified Rankin Score (mRS) after ischemic stroke. Aim: The aim was to test model validity for 3-month mortality and mRS after ischemic stroke in two independent data sets. Methods: Our derivation models used data from 451 subjects with ischemic stroke enrolled in the Greater Cincinnati/Northern Kentucky Stroke Study (GCNKSS) in 1999. We utilized two separate GCNKSS cohorts of ischemic strokes (460 in 2005 and 504 in 2010) to assess external validity, using measures of agreement between predicted and observed values, calibration, and discrimination, following the Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD) guidelines. Results: The 3-month mortality model performed well in the validation datasets, with an average prediction error (Brier score) of 0.045 for 2005 and 0.053 for 2010, and excellent discrimination, with an area under the curve of 0.86 (95% CI: 0.79, 0.93) for 2005 and 0.84 (0.76, 0.92) for 2010. Predicted 3-month mRS also performed well in the validation datasets, with R2 of 0.57 for 2005 and 0.50 for 2010 and a root mean square error of 0.85 for 2005 and 1.05 for 2010. Predicted mRS tended to be higher than actual in both validation datasets. Re-estimation of the model parameters for age and severe white matter hyperintensity in both 2005 and 2010, and for diabetes in 2005, improved predictive accuracy. Conclusions: Our previously developed stroke models performed well in two study periods, suggesting validity of the model predictions.
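The Brier score used as the average prediction error above is simply the mean squared difference between each predicted probability and the observed binary outcome; a quick illustration with made-up values:

```python
# Brier score for binary mortality predictions (illustrative data, not the study's).
def brier_score(predicted_probs, outcomes):
    """Mean squared difference between predicted probability and observed 0/1 outcome."""
    return sum((p - y) ** 2 for p, y in zip(predicted_probs, outcomes)) / len(outcomes)

probs    = [0.05, 0.10, 0.80, 0.20, 0.02]
observed = [0,    0,    1,    0,    0]
score = brier_score(probs, observed)   # lower is better; 0 is perfect
```

A score near 0.05, as reported for the validation cohorts, indicates well-calibrated probabilities for a relatively rare outcome.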
Improving the Reliability of a Domestic Refrigerator Compressor Subjected to Repetitive Loading  [PDF]
Seong-Woo Woo, Dennis L. O’Neal
Engineering (ENG) , 2016, DOI: 10.4236/eng.2016.83012
Abstract: As a reliability quantitative specification, parametric accelerated life testing was used to assess the reliability of a newly designed compressor of a commercial refrigerator subjected to repetitive stresses. A generalized life-stress failure model and a new sample size equation with a new load concept were derived, starting from the basic refrigeration cycle. The sample size equation with the acceleration factor also enabled the parametric accelerated life testing to quickly evaluate the expected lifetime. This test design should help an engineer uncover the design parameters affecting reliability during the design process of the compressor system. Consequently, it should help companies improve product reliability and avoid recalls due to product failures in the field. A newly designed compressor in a commercial refrigerator was used as a test case.
A telecom-wavelength atomic quantum memory in optical fiber for heralded polarization qubits
Jeongwan Jin,Erhan Saglamyurek,Marcel·li Grimau Puigibert,Varun B. Verma,Francesco Marsili,Sae Woo Nam,Daniel Oblak,Wolfgang Tittel
Physics , 2015, DOI: 10.1103/PhysRevLett.115.140501
Abstract: Photon-based quantum information processing promises new technologies including optical quantum computing, quantum cryptography, and distributed quantum networks. Polarization-encoded photons at telecommunication wavelengths provide a compelling platform for the practical realization of these technologies. However, despite important progress towards building elementary components compatible with this platform, including sources of entangled photons, efficient single-photon detectors, and on-chip quantum circuits, a missing element has been an atomic quantum memory that directly allows reversible mapping of quantum states encoded in the polarization degree of freedom of a telecom-wavelength photon. Here we demonstrate the quantum storage and retrieval of polarization states of heralded single photons at telecom wavelength by implementing the atomic frequency comb protocol in an ensemble of erbium atoms doped into an optical fiber. Despite the remaining limitations of our proof-of-principle demonstration, such as small storage efficiency and storage time, our broadband light-matter interface reveals the potential for use in future quantum information processing.
Quantum storage of entangled telecom-wavelength photons in an erbium-doped optical fibre
Erhan Saglamyurek,Jeongwan Jin,Varun B. Verma,Matthew D. Shaw,Francesco Marsili,Sae Woo Nam,Daniel Oblak,Wolfgang Tittel
Physics , 2014, DOI: 10.1038/nphoton.2014.311
Abstract: The realization of a future quantum Internet requires processing and storing quantum information at local nodes, and interconnecting distant nodes using free-space and fibre-optic links. Quantum memories for light are key elements of such quantum networks. However, to date, neither an atomic quantum memory for non-classical states of light operating at a wavelength compatible with standard telecom fibre infrastructure, nor a fibre-based implementation of a quantum memory has been reported. Here we demonstrate the storage and faithful recall of the state of a 1532 nm wavelength photon, entangled with a 795 nm photon, in an ensemble of cryogenically cooled erbium ions doped into a 20 meter-long silicate fibre using a photon-echo quantum memory protocol. Despite its currently limited efficiency and storage time, our broadband light-matter interface brings fibre-based quantum networks one step closer to reality. Furthermore, it facilitates novel tests of light-matter interaction and collective atomic effects in unconventional materials.

Copyright © 2008-2017 Open Access Library. All rights reserved.