Search Results: 1 - 10 of 339860 matches for " David J. Gavaghan "
All listed articles are free for downloading (OA Articles)
Pooling data for Number Needed to Treat: no problems for apples
R Andrew Moore, David J Gavaghan, Jayne E Edwards, Phillip Wiffen, Henry J McQuay
BMC Medical Research Methodology, 2002, DOI: 10.1186/1471-2288-2-2
Abstract: A review of nursing interventions for smoking cessation from the Cochrane Library provided different values for NNT depending on how NNTs were calculated. The Cochrane review was evaluated for clinical heterogeneity using a L'Abbé plot and subsequent analysis by secondary and primary care settings.

Three studies in primary care had low (4%) baseline quit rates, and nursing interventions were without effect. Seven trials in hospital settings with patients after cardiac surgery, or heart attack, or even with cancer, had high baseline quit rates (25%). Nursing intervention to stop smoking in the hospital setting was effective, with an NNT of 14 (95% confidence interval 9 to 26). The assumptions involved in using risk difference and odds ratio scales for calculating NNTs are discussed.

Clinical common sense and concentration on raw data help to detect clinical heterogeneity. Once robust statistical tests have told us that an intervention works, we then need to know how well it works. The number needed to treat or harm is just one way of showing that, and when used sensibly can be a useful tool.

Cates [1] concentrates on Simpson's paradox, which relates to problems that can arise when there is an imbalance between treatment and placebo arms in controlled trials. This "paradox" is hardly new, having first been discussed by E.H. Simpson 50 years ago [2], and is now a staple of any undergraduate statistics course. Cates further contends that NNTs should be calculated from weighted risk differences (or odds ratios) rather than pooled raw events, although this is relevant to Simpson's paradox only if inappropriate statistical methods are being used in inappropriate circumstances.

It all comes down to the old problem of meta-analysis: whether you are comparing apples with something else, and how you count the apples when you've got them.

All of this is based on a numerical analysis of a Cochrane review of nursing interventions for smoking cessation [3].
The pooled raw data show t
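The distinction the abstract draws, between an NNT taken from pooled raw events and one taken from a weighted average of per-trial risk differences, can be made concrete with a small sketch. The trial numbers below are invented for illustration (they are not from the Cochrane review); the arms are deliberately imbalanced so the two methods disagree, which is exactly the Simpson's-paradox situation Cates describes.

```python
# Sketch: pooled-raw-event vs weighted-risk-difference NNTs diverging
# when trial arms are imbalanced (Simpson's paradox). All numbers are
# invented for illustration, not taken from the review discussed above.

def risk_difference(events_t, n_t, events_c, n_c):
    """Absolute risk difference between treatment and control arms."""
    return events_t / n_t - events_c / n_c

# (events_treat, n_treat, events_ctrl, n_ctrl); arms deliberately imbalanced
trials = [
    (30, 100, 80, 400),   # within-trial risk difference = 0.10
    (240, 400, 50, 100),  # within-trial risk difference = 0.10
]

# Method 1: pool raw events across trials, then take one risk difference.
et = sum(t[0] for t in trials); nt = sum(t[1] for t in trials)
ec = sum(t[2] for t in trials); nc = sum(t[3] for t in trials)
rd_pooled = risk_difference(et, nt, ec, nc)

# Method 2: average per-trial risk differences, weighted by trial size.
weights = [t[1] + t[3] for t in trials]
rds = [risk_difference(*t) for t in trials]
rd_weighted = sum(w * rd for w, rd in zip(weights, rds)) / sum(weights)

nnt_pooled = 1 / rd_pooled      # distorted by the arm imbalance
nnt_weighted = 1 / rd_weighted  # matches every individual trial

print(f"NNT from pooled raw events:        {nnt_pooled:.1f}")
print(f"NNT from weighted risk difference: {nnt_weighted:.1f}")
```

Here every individual trial shows a risk difference of 0.10 (NNT 10), yet pooling the raw events first suggests an NNT near 3.6; with balanced arms the two methods would agree, which is why the "paradox" only bites when inappropriate pooling meets imbalanced trials.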
Using evidence from different sources: an example using paracetamol 1000 mg plus codeine 60 mg
Lesley A Smith, R Andrew Moore, Henry J McQuay, David Gavaghan
BMC Medical Research Methodology, 2001, DOI: 10.1186/1471-2288-1-1
Abstract: Randomised, double-blind, placebo-controlled trials of paracetamol 1000 mg and codeine 60 mg had an NNT of 2.2 (95% confidence interval 1.7 to 2.9) for at least 50% pain relief over four to six hours in three trials with 197 patients. Computer simulation of randomised trials demonstrated 92% confidence that the simulated NNT was within ± 0.5 of the underlying value of 2.2 with this number of patients. The result was supported by a rational dose-response relationship for different doses of paracetamol and codeine in 17 additional trials with 1,195 patients. Three controlled trials lacking a placebo and with 117 patients treated with paracetamol 1000 mg and codeine 60 mg had 73% (95% CI 56% to 81%) of patients with at least 50% pain relief, compared with 57% (48% to 66%) in placebo-controlled trials. Six trials in acute pain were omitted because of design issues, such as the use of different pain measures or multiple dosing regimens. In each, paracetamol 1000 mg and codeine 60 mg was shown to be better than placebo or comparators for at least one measure.

Different designs of high-quality trials can be used to support limited information used in meta-analysis, without recourse to low-quality trials that might be biased.

The use of evidence-based approaches to therapeutic decision making can frequently raise the problem of how to make decisions when evidence is in limited supply. Often systematic reviews limit trial inclusion in an attempt to generate clinical homogeneity and allow sensible meta-analysis. The problem, though, is that other, useful, information is omitted. An example of this is the popular combination of paracetamol with codeine for treatment of acute and, more frequently, chronic pain.
For the combination of 1000 mg paracetamol plus 60 mg codeine, for instance, there was information on only 127 patients in two placebo-controlled acute pain studies [1, 2]. Systematic reviews should seek unbiased evidence, which may limit the number of studies available for analy
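The abstract's "92% confidence that the simulated NNT was within ± 0.5" claim can be illustrated, in outline only, with a Monte Carlo sketch. This is not the authors' simulation: the response rates and per-arm size below are assumptions, chosen only so that the underlying NNT is roughly 2.2 as in the text.

```python
import random

# Sketch (not the authors' code): Monte Carlo check of how precisely an
# NNT is estimated at a given trial size. The true response rates and
# per-arm size are assumed values chosen so the true NNT is ~2.2.

random.seed(1)

p_treat, p_placebo = 0.55, 0.10       # assumed true response rates
true_nnt = 1 / (p_treat - p_placebo)  # ~2.2, matching the text

def simulate_nnt(n_per_arm):
    """Simulate one two-arm trial and return its observed NNT."""
    rt = sum(random.random() < p_treat for _ in range(n_per_arm)) / n_per_arm
    rp = sum(random.random() < p_placebo for _ in range(n_per_arm)) / n_per_arm
    diff = rt - rp
    return 1 / diff if diff > 0 else float("inf")

runs = 2000
hits = sum(abs(simulate_nnt(100) - true_nnt) <= 0.5 for _ in range(runs))
coverage = hits / runs
print(f"true NNT = {true_nnt:.2f}; "
      f"fraction of simulated NNTs within +/-0.5: {coverage:.2f}")
```

With around 100 patients per arm the observed NNT lands within ±0.5 of the underlying value in roughly nine runs out of ten, which is the kind of precision statement the abstract reports for 197 patients.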
The Free Energy Landscape of Dimerization of a Membrane Protein, NanC
Thomas A. Dunton, Joseph E. Goose, David J. Gavaghan, Mark S. P. Sansom, James M. Osborne
PLOS Computational Biology, 2014, DOI: 10.1371/journal.pcbi.1003417
Abstract: Membrane proteins are frequently present in crowded environments, which favour lateral association and, on occasions, two-dimensional crystallization. To better understand the non-specific lateral association of a membrane protein we have characterized the free energy landscape for the dimerization of a bacterial outer membrane protein, NanC, in a phospholipid bilayer membrane. NanC is a member of the KdgM-family of bacterial outer membrane proteins and is responsible for sialic acid transport in E. coli. Umbrella sampling and coarse-grained molecular dynamics were employed to calculate the potentials of mean force (PMF) for a variety of restrained relative orientations of two NanC proteins as the separation of their centres of mass was varied. We found the free energy of dimerization for NanC to be in the range of to . Differences in the depths of the PMFs for the various orientations are related to the shape of the proteins. This was quantified by calculating the lipid-inaccessible buried surface area of the proteins in the region around the minimum of each PMF. The depth of the potential well of the PMF was shown to depend approximately linearly on the buried surface area. We were able to resolve local minima in the restrained PMFs that would not be revealed using conventional umbrella sampling. In particular, these features reflected the local organization of the intervening lipids between the two interacting proteins. Through a comparison with the distribution of lipids around a single freely-diffusing NanC, we were able to predict the location of these restrained local minima for the orientational configuration in which they were most pronounced. Our ability to make this prediction highlights the important role that lipid organization plays in the association of two NanCs in a bilayer.
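For readers unfamiliar with potentials of mean force, the last step of such an analysis, reading a binding free energy off a PMF, can be sketched as follows. The Gaussian-well PMF here is a stand-in, not data from the NanC simulations, and only the simplest estimate (well depth relative to the unbound plateau) is shown.

```python
import math

# Sketch: estimating a dimerization free energy from a 1-D potential of
# mean force w(r), as one would after umbrella sampling. The Gaussian-well
# PMF below is an invented stand-in, not data from the NanC simulations.

kT = 2.48  # kJ/mol at roughly 298 K

def pmf(r):
    """Toy PMF (kJ/mol) vs. centre-of-mass separation r (nm):
    one attractive well at contact, flat at large separation."""
    return -30.0 * math.exp(-((r - 3.0) ** 2) / 0.1)

rs = [2.0 + 0.001 * i for i in range(4001)]  # 2.0 .. 6.0 nm
ws = [pmf(r) for r in rs]

# Simplest estimate: well depth relative to the flat (unbound) plateau.
depth = ws[-1] - min(ws)
print(f"PMF well depth: {depth:.1f} kJ/mol ({depth / kT:.1f} kT)")
```

In the paper's setting the PMF is computed per restrained relative orientation, so this depth-versus-plateau reading is what varies with orientation and is then compared against the buried surface area.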
A Two-Dimensional Model of the Colonic Crypt Accounting for the Role of the Basement Membrane and Pericryptal Fibroblast Sheath
Sara-Jane Dunn, Paul L. Appleton, Scott A. Nelson, Inke S. Näthke, David J. Gavaghan, James M. Osborne
PLOS Computational Biology , 2012, DOI: 10.1371/journal.pcbi.1002515
Abstract: The role of the basement membrane is vital in maintaining the integrity and structure of an epithelial layer, acting both as a mechanical support and as the physical interface between epithelial cells and the surrounding connective tissue. The function of this membrane is explored here in the context of the epithelial monolayer that lines the colonic crypt, test-tube-shaped invaginations that punctuate the lining of the intestine and coordinate a regular turnover of cells to replenish the epithelial layer every few days. To investigate the consequence of genetic mutations that perturb the system dynamics and can lead to colorectal cancer, it must be possible to track the emerging tissue-level changes that arise in the crypt. To that end, a theoretical crypt model with a realistic, deformable geometry is required. A new discrete crypt model is presented, which focuses on the interaction between cell- and tissue-level behaviour, while incorporating key subcellular components. The model contains a novel description of the role of the surrounding tissue and musculature, based upon experimental observations of the tissue structure of the crypt, which are also reported. A two-dimensional (2D) cross-sectional geometry is considered, and the shape of the crypt is allowed to evolve and deform. Simulation results reveal how the shape of the crypt may contribute mechanically to the asymmetric division events typically associated with the stem cells at the base. The model predicts that epithelial cell migration may arise due to feedback between cell loss at the crypt collar and density-dependent cell division, a hypothesis which can be investigated in a wet lab. This work forms the basis for investigation of the deformation of the crypt structure that can occur due to proliferation of cells exhibiting mutant phenotypes, experiments that would not be possible in vivo or in vitro.
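The predicted feedback between cell loss at the collar and density-dependent division can be illustrated with a deliberately crude, one-variable sketch. This is not the paper's deformable 2D model; the capacity, loss rate, and division rate are all invented for illustration.

```python
# Toy sketch (not the paper's model): a single column of crypt cells in
# which cells are shed at the collar and divide at a density-dependent
# rate, illustrating how that feedback alone can sustain steady turnover.

def step(n_cells, capacity=100, loss=5, max_div=10):
    """One time step: shed `loss` cells at the collar, then divide at a
    rate that falls linearly as the column approaches `capacity`."""
    n_cells = max(n_cells - loss, 0)
    births = round(max_div * (1 - n_cells / capacity))
    return n_cells + max(births, 0)

n = 50
history = [n]
for _ in range(200):
    n = step(n)
    history.append(n)

# The population settles where division balances loss: a steady state
# with constant turnover, i.e. continual migration toward the collar.
print("steady-state cell count:", history[-1])
```

The point of the sketch is that no migration rule is imposed: once loss at the top is balanced by density-limited division lower down, cells necessarily move up the column each step, which is the hypothesis the abstract proposes for wet-lab testing.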
Ten Simple Rules for Effective Computational Research
James M. Osborne, Miguel O. Bernabeu, Maria Bruna, Ben Calderhead, Jonathan Cooper, Neil Dalchau, Sara-Jane Dunn, Alexander G. Fletcher, Robin Freeman, Derek Groen, Bernhard Knapp, Greg J. McInerny, Gary R. Mirams, Joe Pitt-Francis, Biswa Sengupta, David W. Wright, Christian A. Yates, David J. Gavaghan, Stephen Emmott, Charlotte Deane
PLOS Computational Biology, 2014, DOI: 10.1371/journal.pcbi.1003506
Chaste: An Open Source C++ Library for Computational Physiology and Biology
Gary R. Mirams, Christopher J. Arthurs, Miguel O. Bernabeu, Rafel Bordas, Jonathan Cooper, Alberto Corrias, Yohan Davit, Sara-Jane Dunn, Alexander G. Fletcher, Daniel G. Harvey, Megan E. Marsh, James M. Osborne, Pras Pathmanathan, Joe Pitt-Francis, James Southern, Nejib Zemzemi, David J. Gavaghan
PLOS Computational Biology, 2013, DOI: 10.1371/journal.pcbi.1002970
Abstract: Chaste — Cancer, Heart And Soft Tissue Environment — is an open source C++ library for the computational simulation of mathematical models developed for physiology and biology. Code development has been driven by two initial applications: cardiac electrophysiology and cancer development. A large number of cardiac electrophysiology studies have been enabled and performed, including high-performance computational investigations of defibrillation on realistic human cardiac geometries. New models for the initiation and growth of tumours have been developed. In particular, cell-based simulations have provided novel insight into the role of stem cells in the colorectal crypt. Chaste is constantly evolving and is now being applied to a far wider range of problems. The code provides modules for handling common scientific computing components, such as meshes and solvers for ordinary and partial differential equations (ODEs/PDEs). Re-use of these components avoids the need for researchers to ‘re-invent the wheel’ with each new project, accelerating the rate of progress in new applications. Chaste is developed using industrially-derived techniques, in particular test-driven development, to ensure code quality, re-use and reliability. In this article we provide examples that illustrate the types of problems Chaste can be used to solve, which can be run on a desktop computer. We highlight some scientific studies that have used or are using Chaste, and the insights they have provided. The source code, both for specific releases and the development version, is available to download under an open source Berkeley Software Distribution (BSD) licence at http://www.cs.ox.ac.uk/chaste, together with details of a mailing list and links to documentation and tutorials.
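Chaste itself is C++ and nothing below uses its actual API, but the role of the ODE modules the abstract mentions can be sketched in a language-neutral way with a minimal fixed-step fourth-order Runge-Kutta solver, applied to simple exponential decay and checked against the exact solution.

```python
import math

# Sketch of the kind of reusable ODE component a library like Chaste
# provides (generic illustration only; this is not Chaste's API):
# a minimal fixed-step 4th-order Runge-Kutta integrator.

def rk4_solve(f, y0, t0, t1, n_steps):
    """Integrate dy/dt = f(t, y) from t0 to t1 with n_steps RK4 steps."""
    h = (t1 - t0) / n_steps
    t, y = t0, y0
    for _ in range(n_steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

# Exponential decay dy/dt = -y, y(0) = 1, has exact solution e^(-t).
y_end = rk4_solve(lambda t, y: -y, 1.0, 0.0, 1.0, 100)
print(f"y(1) = {y_end:.6f}  (exact: {math.exp(-1):.6f})")
```

Comparing the numerical result against a known exact solution, as here, is also the flavour of check that the test-driven development style described in the abstract builds the library around.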
Science communication under scrutiny
Helen Gavaghan
Genome Biology, 2003, DOI: 10.1186/gb-spotlight-20030721-01
Abstract: Under the chairmanship of Patrick Bateson, the society's biological secretary, a working group will produce guidance on best practice, to be published sometime in the fall. It will be sent to anyone receiving funding from the Royal Society and to the fellows, and it will be disseminated to the wider scientific community both within and outside industry. A separate brief is to be produced for the public.

The reports will identify ways in which peer review can be improved to increase public confidence in research. They will also consider alternatives to peer review for assessing the quality of research results released to the public.

In an interview with us, Bateson said there is mistrust of science as evidenced, for example, by some responses to the Royal Society's work on genetically modified organisms (GMOs). Given the current centrality of peer review to scientific claims and the importance of claims from science in such controversial areas as GMOs and the measles, mumps, and rubella vaccine, the working party will examine this process closely.

Some of the better known and more widely discussed concerns about peer review are: how journals and grant-giving bodies select reviewers; whether reviewers should remain anonymous; whether reviewers ever hold up the publication of their rivals' work or purloin data; whether papers submitted by big names in their field are as carefully scrutinized as is the work of lesser known researchers. "Much of this is paranoid," said Bateson, "but not all. The issues need to be examined openly."

"Some have even said the system of peer review is so flawed, why not simply do away with it," he added. Yet alternative methods of ensuring the quality of research findings also have drawbacks. An example is preprint publication, in which unpublished findings are openly subjected to the wider criticism of peers.
This currently happens in some fields of physics, in artificial intelligence, and in larger, specialized institutions. In branches of the bi
Open-access publishing finds official favor
Helen Gavaghan
Genome Biology, 2003, DOI: 10.1186/gb-spotlight-20030701-01
Abstract: And in an effort to stimulate discussion about the steps needed to promote open access more broadly, an international group of scholars, funders, librarians, editors, and lawyers - both scientists and nonscientists - released a draft definition of open-access publication on the Web on June 20. The definition is part of the "Bethesda statement on open-access publishing," drawn up as a result of a meeting held in April at the headquarters of the Howard Hughes Medical Institute (HHMI) in Chevy Chase, Md.

In a separate development last week, US congressman Rep. Martin Sabo (D-Minn.) introduced a bill in the House of Representatives (June 26) that would prohibit copyright protection for any works stemming from substantially federally funded research.

Under the terms of the deal with BioMed Central, the Joint Information Systems Committee (JISC), a committee of the UK's further and higher education funding bodies, is making a blanket payment covering university membership of BioMed Central. As a result, from July 1 university researchers in the United Kingdom whose work is accepted for publication in one of the company's peer-reviewed, online journals will not have to pay an author fee. In these journals, all research articles can, of course, be accessed free of charge by anyone with an Internet connection, and copyright is retained by the author.

In both its commercial guise, as pioneered by BioMed Central, and the not-for-profit version being developed by, among others, the Public Library of Science (PLoS), open-access publishing is gaining increasing attention in the current international debate about scholarly communication. It is one option for making research more visible and reducing the cost to academia of journal subscriptions.

"What's not to like about the idea?" asked Gerry Rubin, vice president of the HHMI. "Why would you want to publish a paper if you don't want people to read it?"

For JISC the deal is part of a wider agenda.
"The current scholarly communication s
New head for European Science Foundation
Helen Gavaghan
Genome Biology, 2003, DOI: 10.1186/gb-spotlight-20030610-01
Abstract: Currently, the ESF is a modestly funded body—its budget this year is €12 million—but it is intellectually influential in basic science circles because of its network of connections to national academies and scientific organizations around Europe. In the future, if Andersson has his way, the ESF will turn its influence into substantial economic muscle and will play a significant part in directing the proposed European Research Council.

The idea for such a council has been around for several years, and in April this year an ESF working group chaired by Richard Sykes, rector of Imperial College London, released a report suggesting how such a council could be introduced. Andersson was a member of the working group.

In the group's vision, existing pan-European research funds for basic research would be transferred to the council. After 5 years, the council would have achieved a funding stream equivalent to that of a research council in one of the major member states. None of the details on how this is to be achieved are clear.

Such a pot of money is needed, Andersson said, because although scientists in different countries know their peers, it is not easy to find funding that also crosses national boundaries.

Andersson's ideas for the ESF and his aspiration that it play a significant part in a European Research Council are bold, but they are a sign of the times. The European Council of Ministers said at a meeting in Barcelona in March 2002 that by 2010, it wanted to see 3% of gross domestic product spent on research, development, and innovation, compared with 1.9% now. And the European Commission is pushing for the establishment of a European Research Area that would cooperate on projects too large for one country.

Andersson, a professor of biochemistry, is the president of Linköping University in Sweden. ESF's Governing Council has recommended that he take over the organization when Enric Banda's term of office comes to an end later this year.
The future's bright, the future's online
Helen Gavaghan
Genome Biology, 2001, DOI: 10.1186/gb-spotlight-20011023-02
Abstract: Founded in the spring of 2000 by Leslie Pack Kaelbling, Professor of Computer Science and Engineering at the Massachusetts Institute of Technology (MIT), JMLR makes its research papers freely available on the web. For archiving purposes, MIT Press is to publish a hard-copy version, which will be made available on a not-for-profit basis.

The move sends a powerful message to tenure-track committees that, though the journal is new and does not have the academic record of JML, publication in JMLR should be assessed on the basis that it has the backing of a substantial portion of the editorial board that contributed to the reputation of JML.

By releasing their announcement to the machine learning community by email, the 40 machine learning scientists also turned up the temperature on what is already one of the hottest topics concerning the conduct of science, namely the question of how scientists can most effectively review, register, disseminate, and archive the knowledge gleaned during their work in this age of email and the Internet.

Before making their move, JML's editorial board experienced several years of increasing frustration in their efforts to persuade Kluwer to listen to their concerns. At stake were a number of increasingly common areas of disagreement. Some, like how many pages the publisher budgets per year for the publication, can seem insignificant to the general public, but they go to the heart of the practice of science and can prevent important breakthroughs from being scrutinised in a timely manner.

"Machine learning is a fast-moving field and papers that had already spent several months going through peer review were sitting for additional months in a pile waiting to be published because we did not have enough pages and could not guarantee to the publishers that we could sustain a higher page count," says Robert Holte, JML's executive editor (a scientist, not a representative of Kluwer) and a Professor at the University of Alberta.
"By the time they hit peoples'

Copyright © 2008-2017 Open Access Library. All rights reserved.