Search Results: 1 - 10 of 23793 matches for "Mark Jones"
All listed articles are free for downloading (OA Articles)
TaxMan: a taxonomic database manager
Martin Jones, Mark Blaxter
BMC Bioinformatics, 2006, DOI: 10.1186/1471-2105-7-536
Abstract: TaxMan uses freely available tools to allow rapid assembly, storage and analysis of large, aligned DNA and protein sequence datasets for user-defined sets of species and genes. The user provides GenBank format files and a list of gene names and synonyms for the loci to analyse. Sequences are extracted from the GenBank files on the basis of annotation and sequence similarity. Consensus sequences are built automatically. Alignment is carried out (where possible, at the protein level) and aligned sequences are stored in a database. TaxMan can automatically determine the best subset of taxa to examine phylogeny at a given taxonomic level. By using the stored aligned sequences, large concatenated multiple sequence alignments can be generated rapidly for a subset and output in analysis-ready file formats. Trees resulting from phylogenetic analysis can be stored and compared with a reference taxonomy. TaxMan allows rapid automated assembly of multigene datasets of aligned sequences for large taxonomic groups. By extracting sequences on the basis of both annotation and BLAST similarity, it ensures that all available sequence data can be brought to bear on a phylogenetic problem, but remains fast enough to cope with many thousands of records. By automatically assisting in the selection of the best subset of taxa to address a particular phylogenetic problem, TaxMan greatly speeds up the process of generating multiple sequence alignments for phylogenetic analysis. Our results indicate that an automated phylogenetic workbench can be a useful tool when correctly guided by user knowledge. Recently, there has been much interest in the use of large, concatenated multiple sequence alignments ('supermatrices') for phylogenetic analysis [1,2]. Such datasets have been shown to be useful in resolving difficult phylogenetic questions with a high degree of confidence. By combining the phylogenetic signal from multiple genes, clades can be recovered that are not recovered under analysis of
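For readers who want a concrete picture of the annotation-based extraction step described above, the following is a minimal sketch in Python using Biopython; the file name, gene-synonym set and helper function are illustrative assumptions and are not taken from TaxMan's code.

```python
# Illustrative sketch (not TaxMan's actual code) of the annotation-based
# extraction step: pull sequences for a named locus (and its synonyms)
# out of GenBank-format files using Biopython.
from Bio import SeqIO
from Bio.SeqRecord import SeqRecord

# Hypothetical inputs: a GenBank file and a set of synonyms for one locus.
GENBANK_FILE = "records.gb"
GENE_SYNONYMS = {"COI", "COX1", "cytochrome c oxidase subunit I"}

def extract_locus(genbank_path, synonyms):
    """Yield one SeqRecord per annotated gene/CDS feature whose gene or
    product qualifier matches any of the supplied synonyms."""
    wanted = {s.lower() for s in synonyms}
    for record in SeqIO.parse(genbank_path, "genbank"):
        for feature in record.features:
            if feature.type not in ("gene", "CDS"):
                continue
            names = feature.qualifiers.get("gene", []) + \
                    feature.qualifiers.get("product", [])
            if any(n.lower() in wanted for n in names):
                seq = feature.extract(record.seq)
                yield SeqRecord(seq, id=record.id,
                                description=record.annotations.get("organism", ""))
                break  # one hit per GenBank record is enough for this sketch

if __name__ == "__main__":
    hits = list(extract_locus(GENBANK_FILE, GENE_SYNONYMS))
    print(f"extracted {len(hits)} sequences")
```

In TaxMan itself this annotation-based pass is complemented by BLAST similarity searching, as the abstract notes, so that poorly annotated records are still recovered.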
Note on Existence and Non-Existence of Large Subsets of Binary Vectors with Similar Distances
Gregory Gutin, Mark Jones
Computer Science, 2012
Abstract: We consider vectors from $\{0,1\}^n$. The weight of such a vector $v$ is the sum of the coordinates of $v$. The distance ratio of a set $L$ of vectors is ${\rm dr}(L):=\max \{\rho(x,y):\ x,y \in L\}/ \min \{\rho(x,y):\ x,y \in L,\ x\neq y\},$ where $\rho(x,y)$ is the Hamming distance between $x$ and $y$. We prove that (a) for every constant $\lambda>1$ there are no positive constants $\alpha$ and $C$ such that every set $K$ of at least $\lambda^p$ vectors with weight $p$ contains a subset $K'$ with $|K'|\ge |K|^{\alpha}$ and ${\rm dr}(K')\le C$, and (b) for a set $K$ of vectors with weight $p$ and a constant $C>2$, there exists $K'\subseteq K$ such that ${\rm dr}(K')\le C$ and $|K'| \ge |K|^\alpha$, where $\alpha = 1/ \lceil \log(p/2)/\log(C/2) \rceil$.
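To make the quantity ${\rm dr}(L)$ concrete, here is a small Python sketch that computes it directly from the definition above; the example vectors are arbitrary and only serve to illustrate the definition.

```python
# Direct computation of the distance ratio dr(L) defined above, for a small
# set of 0/1 vectors; purely illustrative of the definition.
from itertools import combinations

def hamming(x, y):
    return sum(a != b for a, b in zip(x, y))

def distance_ratio(vectors):
    """dr(L) = max Hamming distance / min nonzero Hamming distance."""
    dists = [hamming(x, y) for x, y in combinations(vectors, 2)]
    dists = [d for d in dists if d > 0]  # ignore duplicate vectors
    return max(dists) / min(dists)

L = [(0, 0, 1, 1, 0), (1, 0, 1, 0, 0), (1, 1, 0, 0, 1)]
print(distance_ratio(L))  # max = 5, min = 2, so dr(L) = 2.5
```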
Parameterized Algorithms for Load Coloring Problem
Gregory Gutin, Mark Jones
Computer Science, 2013
Abstract: One way to state the Load Coloring Problem (LCP) is as follows. Let $G=(V,E)$ be a graph and let $f:V\rightarrow \{{\rm red}, {\rm blue}\}$ be a 2-coloring. An edge $e\in E$ is called red (blue) if both end-vertices of $e$ are red (blue). For a 2-coloring $f$, let $r'_f$ and $b'_f$ be the number of red and blue edges and let $\mu_f(G)=\min\{r'_f,b'_f\}$. Let $\mu(G)$ be the maximum of $\mu_f(G)$ over all 2-colorings. We introduce the parameterized problem $k$-LCP of deciding whether $\mu(G)\ge k$, where $k$ is the parameter. We prove that this problem admits a kernel with at most $7k$ vertices. Ahuja et al. (2007) proved that one can find an optimal 2-coloring on trees in polynomial time. We generalize this by showing that an optimal 2-coloring on graphs with a tree decomposition of width $t$ can be found in time $O^*(2^t)$. We also show that either $G$ is a Yes-instance of $k$-LCP or the treewidth of $G$ is at most $2k$. Thus, $k$-LCP can be solved in time $O^*(4^k).$
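As a concrete illustration of the definition of $\mu(G)$ only (the paper's kernelization and treewidth algorithm are not reproduced here), a brute-force Python sketch:

```python
# Brute-force computation of mu(G) for a small graph, directly from the
# definition in the abstract; exponential in |V|, so for illustration only.
from itertools import product

def mu(vertices, edges):
    best = 0
    for colours in product(("red", "blue"), repeat=len(vertices)):
        colour = dict(zip(vertices, colours))
        red = sum(colour[u] == colour[v] == "red" for u, v in edges)
        blue = sum(colour[u] == colour[v] == "blue" for u, v in edges)
        best = max(best, min(red, blue))
    return best

# Example: a 6-cycle; mu is 2 (colour three consecutive vertices red and the
# other three blue, giving two red and two blue monochromatic edges).
V = list(range(6))
E = [(i, (i + 1) % 6) for i in range(6)]
print(mu(V, E))  # 2
```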
Dimethylsulfide and Coral Bleaching: Links to Solar Radiation, Low Level Cloud and the Regulation of Seawater Temperatures and Climate in the Great Barrier Reef
Graham Jones, Mark Curran, Hilton Swan, Elisabeth Deschaseaux
American Journal of Climate Change (AJCC), 2017, DOI: 10.4236/ajcc.2017.62017
Abstract: Coral reefs produce atmospheric dimethylsulfide (DMSa) which oxidises to non-sea-salt (nss) sulfate aerosols, precursors of cloud condensation nuclei (CCN) and low level cloud (LLC), reducing solar radiation and regulating sea surface temperatures (SSTs). Here we report measurements of solar radiation, SST, LLC, DMS flux and rainfall before, during and after a major coral bleaching event at Magnetic Island in the central Great Barrier Reef (GBR). Measurements are compared with those made at the nearby fringing reef of Orpheus Island where coral bleaching did not occur. Extreme solar radiation levels occurred from November to late January and could have reflected cloud radiative effects that increased downwelling of solar radiation. High levels of LLC often coincided with high periodic fluxes of DMS from the unbleached coral reef at Orpheus Island (e.g. 14 - 20 μmol·m-2·d-1), in direct contrast to the very low fluxes of DMS emitted from the bleached, human-impacted Magnetic Island fringing reef (nd-0.8 μmol·m-2·d-1) when SSTs were >30°C. Continuous SST measurements at the Magnetic Island reef revealed various heating and cooling periods, interspersed with stable SSTs. Cooling periods (negative climate feedback) ranged from -1°C to -3°C (7 day mean -1.6°C), and often seemed to occur during low tides, periodic pulses of DMS flux and LLC, keeping SSTs < 30°C. In contrast, warming periods of +1°C to +3°C (positive climate feedback, 7 day mean +1.52°C) seemed to occur during increasing tides, decreasing DMS flux and low to medium levels of LLC, which increased solar radiation, pushed SSTs above 30°C and caused corals to bleach. Alternation between these two states or types of feedback is indicated in this research and may be a function of enhanced scattering of solar radiation from nss-sulfate aerosols that originate from oxidation of DMSa produced from the coral reefs in
Track-Monitoring and Analyzing Machine Clearances during Wood Forwarding
Marie-France Jones, Mark Castonguay, Dirk Jaeger, Paul Arp
Open Journal of Forestry (OJF), 2018, DOI: 10.4236/ojf.2018.83020
Abstract: This article reports on track-monitoring and analyzing machine clearances during wood forwarding across seasons and weather, using ultrasonic distance sensors in combination with time-stamped GPS xy locations, at 10-second intervals. The resulting data, obtained from 54 harvesting blocks, were analyzed by machine type (two wood forwarders and one grapple skidder), stand type (softwood plantation versus natural hardwood stands), month, slope, cartographic depth-to-water (DTW) classes, number of passes along track, and machine speed. For the most part, clearances were highly variable, due to passing over stumps, rocks, harvest slash, brushmats, ruts, and snow cover when present. This variability was on average greater for the lighter-weight wood forwarders than for the heavier-weight skidder, with the former mostly moving along equally spaced lines on brushmats, while the paths of the latter spread away from central wood-landing sites. In terms of trends, machines moved 1) more slowly on wet ground, 2) faster when returning than when forwarding, and 3) fastest along wood-landing roads, as expected. Low clearances were most notable during winter on snow-covered ground, and on non-frozen shallow-DTW and wet multiple-pass ground. During dry weather conditions, clearances also increased from low-pass tracks to multi-pass tracks due to repeated soil compaction of broadened tracks. These results are presented block-by-block and by machine type. Each block-based clearance frequency pattern was quantified through regression analysis and using a gamma probability distribution function.
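The final step mentioned above, quantifying each block's clearance frequency pattern with a gamma probability distribution, can be illustrated with a short SciPy sketch; the data below are synthetic and the authors' exact fitting procedure may differ.

```python
# Illustrative sketch of a gamma-distribution fit to one block's clearance
# measurements; synthetic data, not the study's actual values or workflow.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
clearances_cm = rng.gamma(shape=4.0, scale=12.0, size=500)  # fake block data

# Fix the location parameter at zero so the fit returns shape and scale only.
shape, loc, scale = stats.gamma.fit(clearances_cm, floc=0)
print(f"shape={shape:.2f}, scale={scale:.1f} cm, mean={shape * scale:.1f} cm")
```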
Late presentation of congenital diaphragmatic hernia after a diagnostic laparoscopic surgery (a case report)
Yap Kok Hooi, Mark Jones
Journal of Cardiothoracic Surgery, 2013, DOI: 10.1186/1749-8090-8-8
Abstract: The authors report a rare case of a 17-year-old lady with late presentation of congenital diaphragmatic hernia. She presented with vague abdominal pain and was thought to have urinary tract infection, ruptured ovarian cyst, and appendicitis by different medical teams in the first few days. She eventually underwent a diagnostic laparoscopy with no significant findings. In the early postoperative recovery period, she suffered severe cardiorespiratory distress, and a large left diaphragmatic hernia containing large intestine was subsequently diagnosed. At a further operation, a strangulated loop of large bowel herniating through a left antero-lateral congenital diaphragmatic hernia was discovered; it was reduced and the defect repaired with a prolene mesh through a thoracotomy. She made an excellent recovery and was discharged a few days after the operation. The authors postulate a mechanism whereby positive pressure from laparoscopic surgery caused herniation of large bowel through a pre-existing diaphragmatic defect. This case highlights the diagnostic challenge of this disease due to its diverse clinical presentation, and the importance of prompt diagnosis and intervention.
jMOTU and Taxonerator: Turning DNA Barcode Sequences into Annotated Operational Taxonomic Units
Martin Jones, Anisah Ghoorah, Mark Blaxter
PLOS ONE, 2012, DOI: 10.1371/journal.pone.0019259
Abstract: DNA barcoding and other DNA sequence-based techniques for investigating and estimating biodiversity require explicit methods for associating individual sequences with taxa, as it is at the taxon level that biodiversity is assessed. For many projects, the bioinformatic analyses required pose problems for laboratories whose prime expertise is not in bioinformatics. User-friendly tools are required for both clustering sequences into molecular operational taxonomic units (MOTU) and for associating these MOTU with known organismal taxonomies.
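The clustering idea behind assigning sequences to MOTU can be sketched as follows. This toy example groups equal-length, pre-aligned sequences by a Hamming-distance cutoff with single-linkage clustering; jMOTU itself operates on pairwise alignments of raw barcode sequences, so treat this only as an illustration of the concept, not the tool's algorithm.

```python
# Minimal sketch of MOTU-style clustering: single-linkage grouping of aligned,
# equal-length sequences whose pairwise Hamming distance is within a cutoff.
from itertools import combinations

def cluster_motu(seqs, cutoff):
    parent = list(range(len(seqs)))          # union-find forest
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]    # path halving
            i = parent[i]
        return i
    def union(i, j):
        parent[find(i)] = find(j)
    for i, j in combinations(range(len(seqs)), 2):
        dist = sum(a != b for a, b in zip(seqs[i], seqs[j]))
        if dist <= cutoff:
            union(i, j)
    clusters = {}
    for i in range(len(seqs)):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())

seqs = ["ACGTACGT", "ACGTACGA", "TTGTACGA", "GGGGCCCC"]
print(cluster_motu(seqs, cutoff=1))  # [[0, 1], [2], [3]]
```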
Cases of Adverse Reaction to Psychotropic Drugs and Possible Association with Pharmacogenetics
Irina Piatkov, Trudi Jones, Mark McLean
Journal of Personalized Medicine, 2012, DOI: 10.3390/jpm2040149
Abstract: Thousands of samples for pharmacogenetic tests have been analysed in our laboratory since its establishment. In this article we describe some of the most interesting cases of CYP poor metabolisers associated with adverse reactions to psychotropic drugs. Prevention of disease and illness, including Adverse Drug Reactions (ADRs), is an aim of modern medicine. Scientific data support the view that evaluation of drug toxicity involves several factors, one of which is genetic variation in the pharmacodynamics and pharmacokinetics of drug pathways. These variations are only one part of toxicity evaluation; however, even if testing prevented adverse drug reactions, especially life-threatening ADRs, in only a small percentage of patients, pharmacogenetic testing should play a significant role in any modern psychopharmacologic practice. Medical practitioners should also consider the use of other medications or alternative dosing strategies for drugs in patients identified as altered metabolisers. This promises not only better and safer treatments for patients, but also potentially lower overall healthcare costs.
A systems approach to clinical oncology: Focus on breast cancer
Mark Abramovitz, Brian Leyland-Jones
Proteome Science, 2006, DOI: 10.1186/1477-5956-4-5
Abstract: However, we do not believe that genomics is adequate as a sole prognostic and predictive platform in breast cancer. The key proteins driving oncogenesis, for example, can undergo post-translational modifications; moreover, if we are ever to move individualization of therapy into the practical world of blood-based assays, serum proteomics becomes critical. Proteomic platforms, including tissue micro-arrays (TMA) and protein chip arrays, in conjunction with surface-enhanced laser desorption ionization time-of-flight mass spectrometry (SELDI-TOF/MS), have been the technologies most widely applied to the characterization of tumours and serum from breast cancer patients, with still limited but encouraging results. This review will focus on these genomic and proteomic platforms, with an emphasis placed on the utilization of FFPE tumour tissue samples and serum, as they have been applied to the study of breast cancer for the discovery of gene signatures and biomarkers for the early diagnosis, prognosis and prediction of treatment outcome. The ultimate goal is to be able to apply a systems biology approach to the information gleaned from the combination of these techniques in order to select the best treatment strategy, monitor its effectiveness and make changes as rapidly as possible where needed to achieve the optimal therapeutic results for the patient. In the United States it is estimated that approximately 213,000 new cases of invasive breast cancer will be diagnosed in 2006 and 41,000 women are expected to die from this disease [1]. Breast cancer will account for ~31% of new cancer cases among women in the United States in 2006 [1]. Current treatment strategies rely mainly on anatomic staging that continues to play a significant role in the decision making process. Classical pathological indexes that are used to predict survival, development of metastatic disease or guide selection of primary therapy in patients with breast cancer include the Nottingham Prognostic Index
Consistent cosmology with Higgs thermal inflation in a minimal extension of the MSSM
Mark Hindmarsh, D. R. Timothy Jones
High Energy Physics - Phenomenology, 2013
Abstract: We consider a class of models in which minimal gauged F-term hybrid inflation is coupled renormalisably to the minimal supersymmetric standard model (MSSM), with no extra ingredients; we call this class the "minimal hybrid inflationary supersymmetric standard model" (MHISSM). The singlet inflaton supplies the Higgs mu-term, and allows an exit from inflation to a vacuum characterised by large Higgs vevs. The true ground state is reached after a period of thermal inflation along the Higgs flat direction. The scalar spectral index is reduced from the standard F-term value to approximately 0.976 in the case where the inflaton potential is dominated by the 1-loop corrections. The reheat temperature following thermal inflation is about 10^9 GeV, solving the gravitino overclosure problem. A Higgs condensate reduces the cosmic string mass per unit length, rendering it compatible with the Cosmic Microwave Background constraints without tuning the inflaton coupling. With the minimal U(1)' gauge symmetry in the inflaton sector, where one of the waterfall fields generates a right-handed neutrino mass, we investigate the Higgs thermal inflation scenario in three popular supersymmetry-breaking schemes: AMSB, GMSB and the CMSSM, focusing on the constraints on the gravitino mass.
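For context on the "standard F-term value" mentioned above (a textbook relation, not taken from this paper): in F-term hybrid inflation whose potential is dominated by the one-loop correction, the spectral index satisfies approximately $n_s \simeq 1 - 1/N_e$, where $N_e$ is the number of observable e-folds, so $N_e \approx 50$-$60$ gives $n_s \approx 0.98$; the quoted value of roughly 0.976 is a modest reduction from this.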