Search Results: 1 - 10 of 158 matches for "Cornel Doniga"
All listed articles are free for downloading (OA Articles)
Current State of the Art for Existing Critical Systems in Urban Seismic Area
Ana-Diana Ancaş, Cornel Doniga, Gabriela Atanasiu
Bulletin of the Polytechnic Institute of Jassy, Constructions, Architecture Section, 2006
Abstract: The paper is structured in three main parts. The first part classifies the critical systems existing in the urban environment, which form part of an informational infrastructure of the utmost urgency in case of a seismic event. Critical systems are identified as those systems whose malfunction may lead to disasters affecting both people and the environment. The second part focuses on underground pipe networks and presents their classification into three categories of great importance in case of an earthquake. The final part presents a methodology for analyzing the seismic behavior of underground pipe networks, considered as part of the existing critical systems, during a seismic event in an urban area.
Targeting the Inflammation Culprit in Patients with Psoriasis/Psoriatic Arthritis and Associated Cardiovascular Comorbidities. Is the IL-17 Inhibitor the New Kid on the Block?  [PDF]
Cornel Pater
World Journal of Cardiovascular Diseases (WJCD), 2019, DOI: 10.4236/wjcd.2019.94024
Abstract: Despite half a century of comprehensive national and international guidance, evidence of clinical effectiveness, and widespread agreement on the management of risk factors, along with sophisticated measures for the primary and secondary prevention of major cardiovascular events, cardiovascular disease remains the dominant cause of death and disability worldwide. Lifestyle changes at the population level (e.g., lower consumption of salt and saturated fat, or reduced/banned amounts of industrially produced trans fatty acids in specific products) and changes at the individual level (e.g., targeting modifiable risk factors such as smoking/tobacco use, poor diet, high blood cholesterol, high blood pressure, insufficient physical activity, and overweight/obesity) have reduced coronary heart disease mortality to a variable extent in different countries (most notably reported in Finland, Iceland and Sweden) at the beginning of the new century. Overall, however, cardiovascular mortality is projected to keep increasing through 2030, at a cost exceeding US $1044 billion. Several decades of status quo are also noted in the therapeutic spectrum of cardiovascular disease, which consists mainly of variations on LDL-C-lowering agents, antihypertensives, anticoagulants, antiplatelets and fibrinolytics. Most therapeutic interventions are "tertiary" in nature (probably some 60%), meaning that treatment is instituted once the individual has developed a pathologic condition; "secondary prevention" may cover some 25%-30% (meant to prevent recurrence of the condition or the occurrence of complications), while "primary prevention" is left with a 10%-15% share (most commonly implying lifestyle changes at the individual level and rarely pharmacological intervention). For almost three decades, the so-called inflammatory hypothesis has been promoted as a reasonable pathogenetic theory behind the initiation and growth of atherosclerotic plaque (Alexander RW, 1994; Ross R, 1999). With the discovery of molecular and cellular pathways that promote atherosclerosis and the role of
The Rich-Gini-Simpson quadratic index of biodiversity  [PDF]
Radu Cornel Guiasu, Silviu Guiasu
Natural Science (NS), 2010, DOI: 10.4236/ns.2010.210140
Abstract: The Gini-Simpson quadratic index is a classic measure of diversity, widely used by ecologists. As shown recently, however, this index is not suitable for the measurement of beta diversity when the number of species is very large. The objective of this paper is to introduce the Rich-Gini-Simpson quadratic index, which preserves all the qualities of the classic Gini-Simpson index but behaves very well even when the number of species is very large. The additive partitioning of species diversity using the Rich-Gini-Simpson quadratic index and an application from island biogeography are analyzed.
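The saturation problem the abstract alludes to is easy to see numerically. Below is a minimal Python sketch of the classic Gini-Simpson index, GS = 1 - Σ pᵢ²; the richness-scaled variant shown is only an assumed illustration of the correction idea, since the paper's exact Rich-Gini-Simpson formula is not reproduced in this listing.

```python
# Classic Gini-Simpson quadratic index, GS = 1 - sum(p_i^2),
# computed from raw species abundance counts. The richness-scaled
# `rich_gini_simpson` below (n * GS) is an assumed illustration,
# not necessarily the authors' exact Rich-Gini-Simpson formula.

def gini_simpson(counts):
    """Classic Gini-Simpson index from abundance counts."""
    total = sum(counts)
    p = [c / total for c in counts]
    return 1.0 - sum(pi * pi for pi in p)

def rich_gini_simpson(counts):
    """Hypothetical richness-scaled variant (assumed form n * GS)."""
    return len(counts) * gini_simpson(counts)

# For a uniform community of n species, GS = 1 - 1/n, which crowds
# toward 1 as n grows -- the saturation problem the paper addresses.
for n in (2, 10, 1000):
    print(n, gini_simpson([1] * n))   # 0.5, 0.9, 0.999
```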
The weighted quadratic index of biodiversity for pairs of species: a generalization of Rao’s index  [PDF]
Radu Cornel Guiasu, Silviu Guiasu
Natural Science (NS), 2011, DOI: 10.4236/ns.2011.39104
Abstract: The distribution of biodiversity at multiple sites of a region has traditionally been investigated through the additive partitioning of the regional biodiversity, called γ-diversity, into the average within-site biodiversity, or α-diversity, and the biodiversity among sites, or β-diversity. The standard additive partitioning of diversity requires the use of a measure of diversity that is a concave function of the relative abundance of species, like the Shannon entropy or the Gini-Simpson index, for instance. When a phylogenetic distance between species is also taken into account, Rao's quadratic index has been used as a measure of dissimilarity. Rao's index, however, is not a concave function of the distribution of relative abundance of either individual species or pairs of species and, consequently, only some nonstandard additive partitionings of diversity have been given using this index. The objective of this paper is to show that the weighted quadratic index of biodiversity, a generalization of the weighted Gini-Simpson index to pairs of species, is a concave function of the joint distribution of the relative abundance of pairs of species and, therefore, may be used in the standard additive partitioning of diversity instead of Rao's index. The replication property of this new measure is also discussed.
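For concreteness, here is a short Python sketch of Rao's quadratic index, Q = Σᵢⱼ dᵢⱼ pᵢ pⱼ, together with the standard additive partitioning γ = α + β using the concave Gini-Simpson index. The paper's weighted quadratic index for pairs of species is not reproduced here, and the two-site data are invented for illustration.

```python
import numpy as np

def rao_q(p, d):
    """Rao's quadratic entropy: p is the relative-abundance vector,
    d a symmetric, zero-diagonal pairwise distance matrix."""
    p = np.asarray(p, dtype=float)
    return float(p @ np.asarray(d, dtype=float) @ p)

def gini_simpson(p):
    """Concave Gini-Simpson index of a relative-abundance vector."""
    return 1.0 - float(np.sum(np.square(p)))

# Two sites, three species; rows are per-site relative abundances.
sites = np.array([[0.6, 0.3, 0.1],
                  [0.1, 0.3, 0.6]])
alpha = np.mean([gini_simpson(row) for row in sites])  # mean within-site
gamma = gini_simpson(sites.mean(axis=0))               # pooled community
beta = gamma - alpha       # concavity guarantees beta >= 0
print(alpha, gamma, beta)  # 0.54, 0.665, 0.125
```

Concavity is exactly what makes beta non-negative in this decomposition, which is why the abstract stresses that Rao's index, lacking concavity in the pair distribution, does not support the standard partitioning.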
Weighted Gini-Simpson Quadratic Index of Biodiversity for Interdependent Species  [PDF]
Radu Cornel Guiasu, Silviu Guiasu
Natural Science (NS), 2014, DOI: 10.4236/ns.2014.67044
Abstract: The weighted Gini-Simpson quadratic index is the simplest measure of biodiversity that takes into account the relative abundance of species and weights assigned to the species. These weights could be assigned based on factors such as the phylogenetic distance between species, their relative conservation values, or even the species richness or vulnerability of the habitats where these species live. In the vast majority of cases where biodiversity is measured, the species are assumed to be independent, which means that the relative proportion of a pair of species is the product of the relative proportions of the component species making up the respective pair. In the first section of the paper, the main versions of the weighted Gini-Simpson index of biodiversity for pairs and triads of independent species are presented. In the second section, the weighted Gini-Simpson quadratic index is calculated for the general case in which the species are interdependent. In this instance, the weights reflect the conservation values of the species and the variability of the distribution pattern of the subsets of species in the respective habitat induced by the interdependence between species. The third section contains a numerical example.
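The independence assumption stated in the abstract, p_ij = p_i · p_j, is easy to make concrete. The Python sketch below builds pair proportions under independence and evaluates a weighted Gini-Simpson sum over pairs; the form Σ w_ij p_ij (1 - p_ij) is an assumed reading for illustration, not the paper's exact formula, and only pairs (not triads) are shown.

```python
from itertools import combinations

def pair_proportions_independent(p):
    """Joint proportions of unordered species pairs under the
    independence assumption p_ij = p_i * p_j."""
    return {(i, j): p[i] * p[j]
            for i, j in combinations(range(len(p)), 2)}

def weighted_gs_pairs(p_pairs, w):
    """Assumed weighted Gini-Simpson sum over pair proportions:
    sum_{i<j} w_ij * p_ij * (1 - p_ij)."""
    return sum(w[ij] * q * (1.0 - q) for ij, q in p_pairs.items())

p = [0.5, 0.3, 0.2]                          # relative abundances
w = {(0, 1): 1.0, (0, 2): 2.0, (1, 2): 0.5}  # e.g. conservation weights
pairs = pair_proportions_independent(p)
print(weighted_gs_pairs(pairs, w))
```

In the interdependent case treated by the paper, the dictionary of pair proportions would come from an observed joint distribution rather than from the product of the marginals.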

Current trends in the cardiovascular clinical trial arena (I)
Cornel Pater
Trials, 2004, DOI: 10.1186/1468-6708-5-4
Abstract: The disagreement also substantially affects the most viable alternative to placebo-controlled trials: actively controlled equivalence/noninferiority trials. To a great extent, this situation was prompted by numerous previous trials of this type that were marked by fundamental methodological flaws and consequent false claims, inconsistencies, and potential harm to patients. As the development and use of generic drugs continue to escalate, along with concurrent pressure to control medical costs by substituting less expensive therapies for established ones, any claim that a new drug, intervention, or therapy is "equivalent" to another should not be accepted without close scrutiny. Adherence to proper methods in conducting studies of equivalence will help investigators to avoid false claims and inconsistencies. These matters will be addressed in the third article of this three-part series. The cardiovascular indication has been the largest or second-largest focus of clinical trials for the past decade (the central nervous system has occupied first place since 1999), making up 15.5% of all clinical investigator contracts [1]. Correspondingly, the cardiovascular therapeutic area commands the largest market for prescription drugs – nearly one fourth of branded prescription drug sales – as the dominant indication for branded medicines sold commercially during the past few years. The total worldwide cardiovascular market is expected to show revenues of $91.2 billion in 2008, an increase of 6.9% compared with 2003 [2]. The WHO ICD-9 coding system specifies 46 cardiovascular diseases within this therapeutic area; however, 65% of all cardiovascular trials address the top six of these subindications. Essential hypertension is in first place (27.1% of trials), followed by congestive heart failure (13.1%) and cerebrovascular disease (9.9%). The above-mentioned figures are more than justified by the impressive ranking identifying cardiovascular disease as the dominant cause of death and disability
The current status of primary prevention in coronary heart disease
Cornel Pater
Trials, 2001, DOI: 10.1186/cvm-2-1-024
Abstract: Research has resulted in major improvements in health care over the past 50 years. Advances in the field of genomics/genetics are anticipated to lead to further acceleration in the progress of research, with the promise of a new era for the diagnosis, treatment and prevention of disease. Yet despite spectacular progress in medicine and a general improvement of health across the world, cardiovascular diseases remain a global problem, and coronary heart disease (CHD) in particular is anticipated to remain a problem over the next 30 years, for both the developed and the developing world. Retrospective analysis of health and social problems illustrates limited success in identifying and dealing with potentially preventable health problems. Recent conclusions from the European Action on Secondary Prevention through Intervention to Reduce Events (EUROASPIRE) II [1], drawn by Wood, who coordinated the study, are relevant here. Among the many disappointing results was the fact that 81% of the individuals surveyed in 1999/2000 were overweight, with a third of them obese. The proportion of obese people increased sharply from 25% in 1995/96, while the number of smokers was unchanged, despite anti-smoking campaigns. Further, 61% of those surveyed had hypertension and 59% had abnormally high cholesterol, despite increased use of antihypertensive and cholesterol-lowering drug treatment. Wood argued that the findings revealed an "inadequate standard of care" and "a collective failure of the medical practice." He claimed that cardiologists are too focused on acute management and are paying insufficient attention to prevention and long-term treatment. The multifaceted clinical complexity of CHD, with a bias towards acute treatment, neglect of preventive care, and inappropriate long-term treatment of patients after acute coronary events, requires fundamental reform to improve patients' outcomes and quality of life, as well as the cost-effectiveness of treatment. Future preventive measures need
Beyond the Evidence of the New Hypertension Guidelines. Blood pressure measurement – is it good enough for accurate diagnosis of hypertension? Time might be in, for a paradigm shift (I)
Cornel Pater
Trials, 2005, DOI: 10.1186/1468-6708-6-6
Abstract: The scientific community worldwide, and especially professionals interested in the topic of hypertension, is currently witnessing an unprecedented debate over the appropriateness of using different drugs/drug classes for the treatment of hypertension. An endless supply of recent and less recent "drug news", some in support of the current guidelines and others against them, justifying the use of selected types of drug treatment or criticising others, comes out in the scientific literature on an almost weekly basis. The latest such debate (at the time of writing this paper) pertains to the safety profile of ARBs vs ACE inhibitors. To a great extent, the factual situation has been fuelled by the new hypertension guidelines (different for the USA, Europe, New Zealand and the UK), whose apparently small inconsistencies and conflicting messages might have generated substantial and perpetuating confusion among both prescribing physicians and their patients, regardless of their country of origin. The overwhelming message conveyed by most guidelines and opinion leaders is the widespread use of diuretics as first-line agents in all patients with blood pressure above a certain cut-off level, and an increasingly aggressive approach towards the diagnosis and treatment of hypertension. This apparently well-justified, logical and easily comprehensible message is unfortunately disregarded by most physicians on both sides of the Atlantic. Amazingly, the message assumes a universal simplicity of both diagnosis and treatment of hypertension, while ignoring several hypertension-specific variables commonly known to be highly complex, such as:
- accuracy of recorded blood pressure and the great inter-observer variability,
- diversity in the competency and training of the diagnosing physician,
- individual patient/disease profile with highly subjective preferences,
- difficulty in reaching consensus among opinion leaders,
- the pharmaceutical industry's influence, and, nonetheless,
- the lar
The Blood Pressure "Uncertainty Range" – a pragmatic approach to overcome current diagnostic uncertainties (II)
Cornel Pater
Trials, 2005, DOI: 10.1186/1468-6708-6-5
Abstract: In spite of these impressive advances, and deeply disappointingly from a public health perspective, the real picture of hypertension management is overshadowed by widespread diagnostic inaccuracies (underdiagnosis, overdiagnosis) as well as by treatment failures generated by undertreatment, overtreatment, and misuse of medications. The scientific, medical and patient communities, as well as decision-makers worldwide, are striving for the greatest possible health gains from available resources. A seemingly well-crystallised line of reasoning is that comprehensive strategic approaches must not only target hypertension as a pathological entity but rather take into account the wider environment in which hypertension, carrying a great deal of our inheritance, is a major risk factor for cardiovascular disease, and its interplay in the constellation of other well-known, modifiable risk factors; that is, attention is to be switched from one's "blood pressure level" to one's absolute cardiovascular risk and its determinants. Likewise, a risk/benefit assessment in each individual case is required in order to achieve the best possible results. Nevertheless, it is of paramount importance to ensure the generalizability of ABPM use in clinical practice, with the aim of improving the accuracy of a first diagnosis for both individual treatment and clinical research purposes. Widespread adoption of the method requires quick adjustment of current guidelines, development of appropriate technology infrastructure, and training of staff (i.e., education, decision support, and information systems for practitioners and patients). Progress can be achieved in a few years, or in the next 25 years. During the past decades, hypertension, denoting abnormal elevation of blood pressure, has commonly been assigned a distinct disease quality. The majority of the medical community, as well as renowned medical textbooks, have considered it a pathological entity requiring diagnosis and appropriate treatment in most individuals having
Individualizing therapy – in search of approaches to maximize the benefit of drug treatment (II)
Cornel Pater
Trials, 2004, DOI: 10.1186/1468-6708-5-7
Abstract: Despite tremendous advances in the science and technology of drug development, as well as the emergence of guidance and consensus building among scientists, many clinicians, pharmacists, and consumers remain uninformed regarding the scientific basis for establishing bioequivalence, the generic-drug approval process, and the issues related to individualizing therapy in general [7,8]. The consequence may be drug dosing errors: overdosing or underdosing, resulting in the occurrence of harmful effects or the non-occurrence of the expected treatment benefit. Recent information [9] indicates that doctors are not consistently prescribing proven treatments at recommended doses, and at times are not prescribing proven treatments at all. A decrease in dose may decrease the efficacy (relative risk reduction [RRR]) of therapy and thereby decrease the treatment's net benefit. Not prescribing an agent will effectively nullify the potential benefit to individuals, and, when repeated frequently enough, failure to prescribe the agent will significantly decrease the benefit to the population as a whole. At other times, doctors tend to prescribe a drug more generally than clinical trials dictate. Treating a population with a lower outcome prevalence (OP) decreases net benefit and may lead to harm. Overdosing may increase treatment-related harm, and underdosing may erode efficacy; both will result in diminished treatment benefits. Finally, noncompliance on the part of the patient may lead to a decrease in efficacy and a requisite decrease in net treatment benefit. If a patient reduces the dose without totally eliminating the drug, the risk of non-dose-related side effects of treatment may remain. The relationship between the terms mentioned above, which govern treatment success, can be expressed mathematically as follows [10-12]: Net Benefit = RRR * OP - Harm. The graphical representation in Figure 1 allows for a series of observations that expand our understanding of the benefit
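The net-benefit relation quoted in the abstract lends itself to a quick numerical illustration. The Python sketch below implements Net Benefit = RRR * OP - Harm exactly as stated; the RRR, OP, and Harm values are invented for illustration, not taken from the paper.

```python
# Net Benefit = RRR * OP - Harm, where RRR is the relative risk
# reduction of therapy, OP the outcome prevalence in the treated
# population, and Harm the treatment-related harm. All numbers
# below are illustrative assumptions.

def net_benefit(rrr, op, harm):
    """Net treatment benefit per the relation in the abstract."""
    return rrr * op - harm

# Full dose in a trial-like population:
print(net_benefit(rrr=0.30, op=0.20, harm=0.02))  # 0.04

# Underdosing erodes efficacy (lower RRR) ...
print(net_benefit(rrr=0.15, op=0.20, harm=0.02))  # 0.01

# ... and prescribing to a lower-risk population (lower OP)
# can push the net benefit negative:
print(net_benefit(rrr=0.30, op=0.05, harm=0.02))  # -0.005
```

The three cases mirror the abstract's argument: dose reductions shrink RRR, over-broad prescribing shrinks OP, and either can drive the net benefit toward or below zero once harm is subtracted.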

