
Search Results: 1 - 10 of 6703 matches for " Tom Delbanco "
All listed articles are free for downloading (OA Articles)
Page 1 /6703
Evaluating the impact of patients' online access to doctors' visit notes: designing and executing the OpenNotes project
Suzanne G Leveille, Janice Walker, James D Ralston, Stephen E Ross, Joann G Elmore, Tom Delbanco
BMC Medical Informatics and Decision Making , 2012, DOI: 10.1186/1472-6947-12-32
Abstract: Using a mixed methods approach, we designed a quasi-experimental study in 3 diverse healthcare systems in Boston, Pennsylvania, and Seattle. Two sites had existing patient internet portals; the third used an experimental portal. We targeted 3 key areas where we hypothesized the greatest impacts: beliefs and attitudes about OpenNotes, use of the patient internet portals, and patient-doctor communication. PCPs in the 3 sites were invited to participate in the intervention. Patients who were registered portal users of participating PCPs were given access to their PCPs' visit notes for one year. PCPs who declined participation in the intervention, and their patients, served as the comparison groups for the study. We applied the RE-AIM framework to our design in order to capture as comprehensive a picture as possible of the impact of OpenNotes. We developed pre- and post-intervention surveys for online administration addressing attitudes and experiences, based on interviews and focus groups with patients and doctors. In addition, we tracked use of the internet portals before and during the intervention. PCP participation varied from 19% to 87% across the 3 sites; a total of 114 PCPs enrolled in the intervention with their 22,000 patients who were registered portal users. Approximately 40% of intervention and non-intervention patients at the 3 sites responded to the online survey, yielding a total of approximately 38,000 patient surveys. Many primary care physicians were willing to participate in this "real world" experiment testing the impact of OpenNotes on their patients and their practices. Results from this trial will inform providers, policy makers, and patients who contemplate such changes at a time of exploding interest in transparency, patient safety, and improving the quality of care. Providers and policymakers are pursuing many strategies to increase the engagement of patients in promoting health and managing illness. As the general trend toward transparency accelerates …
What’s Wrong with Requirements Specification? An Analysis of the Fundamental Failings of Conventional Thinking about Software Requirements, and Some Suggestions for Getting it Right  [PDF]
Tom Gilb
Journal of Software Engineering and Applications (JSEA) , 2010, DOI: 10.4236/jsea.2010.39096
Abstract: We know many of our IT projects fail and disappoint. The poor state of requirements methods and practice is frequently stated as a factor for IT project failure. In this paper, I discuss what I believe is the fundamental cause: we think like programmers, not engineers and managers. We do not concentrate on value delivery, but instead on functions, on use-cases and on code delivery. Further, management is not taking its responsibility to make things better. In this paper, ten practical key principles are proposed, which aim to improve the quality of requirements specification.
Internal Resource Audit for Strategists—A Proposal  [PDF]
Tom Connor
iBusiness (IB) , 2011, DOI: 10.4236/ib.2011.33038
Abstract: It is the purpose of this article to suggest a structured approach to internal resource audit which, whilst of necessity general-purpose in design, would be capable of adaptation to particular company cases. Consequently this paper does not aim at theory development, but at making a conceptual contribution to the art and practice of management. It will, however, offer some criticism of current theory from a management perspective.
Tensioned Metastable Fluid Detectors in Nuclear Security for Passively Monitoring of Special Nuclear Materials―Part A  [PDF]
Tom Grimes, Rusi Taleyarkhan
World Journal of Nuclear Science and Technology (WJNST) , 2011, DOI: 10.4236/wjnst.2011.13010
Abstract: This paper (constituting Part A) describes the transformational Tensioned Metastable Fluid Detector (TMFD) based method for “passive” detection of Special Nuclear Materials (SNMs) as related to nuclear security. Purdue University is developing novel, multi-purpose tension metastable fluid nuclear particle detectors by which multiple types of nuclear particles can be detected with high (90%+) intrinsic efficiency, spectroscopic capability, directional information, rapid response, large standoff and significant cost-savings compared with state-of-the-art systems. This paper focuses specifically on recent advances in the use of these novel detector systems for neutron spectroscopy. These techniques will then be discussed and evaluated in the context of area monitoring in waste processing applications with a focus on passive monitoring of radioactive source particles from SNMs. The companion paper (Part B) addresses TMFD technology as it pertains to active interrogation.
Pass/Fail Criterion for a Simple Radiation Portal Monitor Test  [PDF]
Tom Burr, Avigdor Gavron
Modern Instrumentation (MI) , 2012, DOI: 10.4236/mi.2012.13004
Abstract: One of the simplest tests of a radiation portal monitor (RPM) is a series of n repeats (a vehicle drive-through) in which the i-th repeat records a total number of counts Xi and alarms if Xi ≥ T, where T is an alarm threshold. The RPM performance tests we consider use n repeats to estimate the probability p = P(Xi ≥ T). This paper addresses criterion A for testing RPMs, where criterion A is: for a specified source strength, we must be at least 95% confident that p ≥ 0.5. To assess criterion A, we consider a distribution-free test and a test that relies on assuming the counts Xi have approximately a Poisson distribution. Both test options require tolerance interval construction.
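A minimal sketch of the distribution-free option described in the abstract, treating the number of alarming repeats as a binomial outcome and checking criterion A with a one-sided exact binomial tail. The function name, default parameters, and example numbers are illustrative, not from the paper.

```python
from math import comb

def passes_criterion_a(n: int, k: int, p0: float = 0.5, alpha: float = 0.05) -> bool:
    """Distribution-free (binomial) check of criterion A.

    n: number of drive-through repeats; k: repeats that alarmed.
    We are (1 - alpha)-confident that the true alarm probability
    p >= p0 when the chance of observing k or more alarms under
    p = p0 is at most alpha (a one-sided exact binomial test).
    """
    # One-sided binomial tail: P(X >= k | n, p0)
    tail = sum(comb(n, i) * p0**i * (1 - p0)**(n - i) for i in range(k, n + 1))
    return tail <= alpha

# Example: 10 repeats, all alarming -> tail = 0.5**10, about 0.001, so the
# test passes; with only 6 alarms the tail is about 0.377 and it fails.
print(passes_criterion_a(10, 10))
print(passes_criterion_a(10, 6))
```

Under this sketch, with n = 10 repeats at least 9 alarms are needed to be 95% confident that p ≥ 0.5; the Poisson-based alternative in the paper would use the counts Xi directly rather than only the alarm indicator.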
An Evaluation of the English Language Curriculum of the Nigeria Certificate in Education: A Case Study of a College of Education  [PDF]
Oris Tom-Lawyer
Open Journal of Social Sciences (JSS) , 2014, DOI: 10.4236/jss.2014.27011
Abstract: This treatise is a pilot study that evaluated the implementation of the English language curriculum of the Nigeria Certificate in Education (NCE) at a College of Education in Ogun State, Nigeria. The certificate is the basic qualification for teaching. The poor performance of Nigerian students in external English examinations has continued to be a source of worry to parents, educational stakeholders and the government. This problem has impeded the transition to higher education of many Nigerian students. In order to proffer a solution to this problem, the effectiveness of the training of English language teachers needs to be examined. The study sought to fill the gap by evaluating the implementation of the English language curriculum of the NCE in order to determine the effectiveness of the schooling of teachers. In investigating these issues, a mixed methods approach was used within a case study. The sample comprised ten lecturers and twenty students drawn through convenience sampling techniques. The instruments were questionnaires, observation checklists, interviews and field notes. The methods of analysis were descriptive/inferential statistics and thematic content analysis. The findings revealed that lecturers employed mostly a combination of teaching modes in classrooms. The resources (physical and human) were found to be inadequate and the school technologically deficient. Furthermore, the negative attitudes of the students impacted the implementation of the curriculum. The study identified the ineffective implementation of the NCE English language curriculum. The paper recommends that parents and other stakeholders should thoroughly investigate teacher training.
Calibration of Nondestructive Assay Instruments: An Application of Linear Regression and Propagation of Variance  [PDF]
Stephen Croft, Tom Burr
Applied Mathematics (AM) , 2014, DOI: 10.4236/am.2014.55075

Abstract: Several nondestructive assay (NDA) methods to quantify special nuclear materials use calibration curves that are linear in the predictor, either directly or as an intermediate step. The linear response model is also often used to illustrate the fundamentals of calibration, and is the usual detector behavior assumed when evaluating detection limits. It is therefore important for the NDA community to have a common understanding of how to implement a linear calibration according to the common method of least squares and how to assess uncertainty in inferred nuclear quantities during the prediction stage following calibration. Therefore, this paper illustrates regression, residual diagnostics, the effect of estimation errors in estimated variances used for weighted least squares, and variance propagation in a form suitable for implementation. Before the calibration can be used, a transformation of axes is required; this step, along with variance propagation, is not currently explained in available NDA standard guidelines. The role of systematic and random uncertainty is illustrated and expands on that given previously for the chosen practical NDA example. A listing of open-source software is provided in the Appendix.
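A minimal sketch of the workflow the abstract describes: a weighted least squares fit of a linear calibration curve, followed by inverting the curve at the prediction stage and propagating variance with the delta method. The data, weights, and variable names are illustrative assumptions, not taken from the paper or its Appendix software.

```python
import numpy as np

# Illustrative calibration data: known standards x and measured responses y.
x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y = np.array([2.1, 4.0, 8.3, 15.9, 32.2])
w = np.ones_like(y)                    # weights ~ 1 / Var(y_i); equal here

# Weighted least squares fit of y = a + b*x
X = np.column_stack([np.ones_like(x), x])
XtWX_inv = np.linalg.inv(X.T @ (w[:, None] * X))
a, b = XtWX_inv @ X.T @ (w * y)

# Scale the parameter covariance by the residual variance estimate
resid = y - (a + b * x)
sigma2 = (w * resid**2).sum() / (len(x) - 2)
cov_ab = XtWX_inv * sigma2             # covariance matrix of (a, b)

# Prediction stage: invert the calibration for a new measured response y0
y0, var_y0 = 10.0, sigma2              # assume y0 shares the fitted variance
x0 = (y0 - a) / b

# Delta-method (propagation of variance) for x0 = (y0 - a) / b:
# Var(x0) ~ [Var(y0) + Var(a) + x0^2 Var(b) + 2 x0 Cov(a, b)] / b^2
var_x0 = (var_y0 + cov_ab[0, 0] + x0**2 * cov_ab[1, 1]
          + 2 * x0 * cov_ab[0, 1]) / b**2
print(x0, np.sqrt(var_x0))
```

The delta-method line is the "variance propagation" step the abstract says is missing from current NDA guidelines; a full treatment would also separate systematic from random components, which this sketch does not attempt.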

Analysis on the Growth Rhythm and Cold Tolerance of Five-Year Old Eucalyptus benthamii Plantation for Bioenergy  [PDF]
Aihua Yu, Tom Gallagher
Open Journal of Forestry (OJF) , 2015, DOI: 10.4236/ojf.2015.56052
Abstract: A research plot of Eucalyptus benthamii was planted to evaluate this species’ ability to supply the emerging bioenergy markets that are developing in the southern U.S. The plot was planted at two different densities to investigate the growth parameters and the cold tolerance. The stand was measured annually through five growing seasons. The results indicated that the growth difference among the young E. benthamii was noticeable. For example, the maximum and minimum diameter at breast height (DBH) of five-year-old trees were 27.9 centimeters and 1.27 centimeters, and the maximum and minimum tree heights were 22.86 meters and 2.44 meters, respectively. The yearly changes in DBH and height of E. benthamii showed significant differences. The average annual survival rates of E. benthamii differed under the two planting densities (1650 trees ha-1 and 1237 trees ha-1). The densities also affected the height and DBH growth of E. benthamii. The average DBH and height of the 1650 trees ha-1 plantation were 11.18 centimeters and 15.03 meters, and the average DBH and height of the 1237 trees ha-1 plantation were 13.46 centimeters and 16.28 meters. The volumes per hectare of the 1650 trees ha-1 and 1237 trees ha-1 plantations were 111.45 cubic meters and 101.15 cubic meters, respectively. Average diameter growth was almost 2.54 centimeters per year and average height growth was over 3 meters. E. benthamii plantations were considered tolerant to -7.4 degrees Celsius, but a cold spell during early 2014 (-11.3 degrees Celsius for two consecutive nights) killed the plantation. The growth of E. benthamii also …
A Logical Characterization of Constraint-Based Causal Discovery
Tom Claassen,Tom Heskes
Computer Science , 2012,
Abstract: We present a novel approach to constraint-based causal discovery that takes the form of straightforward logical inference, applied to a list of simple, logical statements about causal relations that are derived directly from observed (in)dependencies. It is both sound and complete, in the sense that all invariant features of the corresponding partial ancestral graph (PAG) are identified, even in the presence of latent variables and selection bias. The approach shows that every identifiable causal relation corresponds to one of just two fundamental forms. More importantly, as the basic building blocks of the method do not rely on the detailed (graphical) structure of the corresponding PAG, it opens up a range of new opportunities, including more robust inference, detailed accountability, and application to large models.
A Bayesian Approach to Constraint-Based Causal Inference
Tom Claassen,Tom Heskes
Computer Science , 2012,
Abstract: We target the problem of accuracy and robustness in causal inference from finite data sets. Some state-of-the-art algorithms produce clear output complete with solid theoretical guarantees but are susceptible to propagating erroneous decisions, while others are very adept at handling and representing uncertainty, but need to rely on undesirable assumptions. Our aim is to combine the inherent robustness of the Bayesian approach with the theoretical strength and clarity of constraint-based methods. We use a Bayesian score to obtain probability estimates on the input statements used in a constraint-based procedure. These are subsequently processed in decreasing order of reliability, letting more reliable decisions take precedence in case of conflicts, until a single output model is obtained. Tests show that a basic implementation of the resulting Bayesian Constraint-based Causal Discovery (BCCD) algorithm already outperforms established procedures such as FCI and Conservative PC. It can also indicate which causal decisions in the output have high reliability and which do not.
