
Search Results: 1 - 10 of 11682 matches for "George Varghese"
All listed articles are free for downloading (OA Articles)
Within and Cross Volatility Contagion Effects among Stock, Crude and Forex Returns: Empirical Evidence from Five Emerging Economies  [PDF]
George Varghese
Theoretical Economics Letters (TEL) , 2018, DOI: 10.4236/tel.2018.88095
Abstract:

The paper examines the spillover effects of return volatility among the stock market index, the foreign exchange market, and the WTI crude oil market across five emerging nations. A trivariate diagonal BEKK-GARCH model is used to estimate the time-varying conditional variance and to test the own-volatility spillover effects of returns among the three underlying variables. We find that significant own-volatility spillover exists in the WTI returns, followed by the stock returns and the forex returns. Further, fluctuations in WTI returns exert considerable influence over stock market volatilities. The lagged variances of the variables, as well as their lagged squared residuals from the mean equation, have a positive and significant impact on current volatility in most cases. The findings of the study are of pertinent importance to financiers, economists, investors and policymakers.
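The variance recursion at the heart of a diagonal BEKK(1,1) model can be sketched as follows. This is a minimal illustration only: the matrix C, the loadings a and b, and the simulated returns are placeholder values, not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def diagonal_bekk_filter(returns, C, a, b):
    """Trivariate diagonal BEKK(1,1) variance recursion:
        H_t = C C' + diag(a) e_{t-1} e_{t-1}' diag(a) + diag(b) H_{t-1} diag(b)
    returns: (T, 3) demeaned return series; C: 3x3 lower-triangular;
    a, b: length-3 ARCH/GARCH loadings.  Returns the (T, 3, 3) path of
    conditional covariance matrices."""
    T, n = returns.shape
    A, B = np.diag(a), np.diag(b)
    CC = C @ C.T
    H = np.empty((T, n, n))
    H[0] = np.cov(returns.T)            # initialize at the sample covariance
    for t in range(1, T):
        e = returns[t - 1][:, None]     # lagged shock as a column vector
        H[t] = CC + A @ (e @ e.T) @ A + B @ H[t - 1] @ B
    return H

# Illustrative parameters (hypothetical, not taken from the paper)
C = np.array([[0.20, 0.00, 0.00],
              [0.05, 0.15, 0.00],
              [0.02, 0.03, 0.10]])
a = np.array([0.30, 0.25, 0.35])        # own-shock (ARCH) effects
b = np.array([0.90, 0.92, 0.88])        # own-volatility (GARCH) persistence
r = rng.standard_normal((500, 3)) * 0.5
H = diagonal_bekk_filter(r, C, a, b)
print(H.shape)  # (500, 3, 3)
```

Because each term in the recursion is positive semi-definite (and CC' is positive definite for this C), every H_t is a valid covariance matrix; the diagonal restriction on A and B is what isolates the own-volatility spillovers the abstract discusses.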

Financial Inclusion: Opportunities, Issues and Challenges  [PDF]
George Varghese, Lakshmi Viswanathan
Theoretical Economics Letters (TEL) , 2018, DOI: 10.4236/tel.2018.811126
Abstract: An all-inclusive financial system is essential for a nation as it augments efficiency and welfare by providing scope for secure and safe saving practices and by facilitating a wide range of improved financial services. The focus of the present study is on identifying the opportunities, issues, and challenges of financial inclusion in India.
A Monte Carlo test of linkage disequilibrium for single nucleotide polymorphisms
Hongyan Xu, Varghese George
BMC Research Notes , 2011, DOI: 10.1186/1756-0500-4-124
Abstract: We develop a Monte Carlo test for LD based on the null distribution of the r2 statistic. Our test is based on r2 and can be reported together with r2. Simulation studies show that it offers slightly better power than existing methods. Our approach provides an alternative test for LD and has been implemented as an R program for ease of use. It also provides a general framework to account for other haplotype inference methods in LD testing.

Genetic association studies, especially large-scale genome-wide association studies, have become very popular in recent years due to the rapid advancement of genotyping technologies and the completion of the Human Genome Project [1,2]. More than 400 susceptibility regions have been identified through the genome-wide association approach. This approach relies on linkage disequilibrium information between genetic markers, mostly single-nucleotide polymorphisms (SNPs), and has hence been termed linkage disequilibrium mapping. Linkage disequilibrium (LD) refers to the nonrandom association of alleles at different loci on the same haplotype. The underlying assumption of genetic association studies is that there are disease-causing loci in the genome, and if the SNPs under investigation (i.e. the markers) and the disease-causing loci are in close proximity, the marker alleles will be associated with the alleles at the disease-causing loci. In other words, markers in close proximity to a disease-causing locus are in LD with it. Since markers in high LD are highly correlated, testing the significance of LD between marker alleles is also useful for finding LD blocks and tag-SNPs, which can reduce the number of markers required in genome-wide studies. In addition to gene mapping, LD information is useful in evolutionary studies of gene dynamics, tracing human origins and history, studies of genome structure, and forensic science.

Consider two bi-allelic SNPs, marker A and marker B. The two alleles at marker A are
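The permutation-based Monte Carlo idea behind such a test can be sketched as follows, assuming phased 0/1 haplotype vectors (the function names and the toy data with 90% allele agreement are illustrative; the paper's framework additionally accounts for haplotype inference from unphased genotypes):

```python
import numpy as np

rng = np.random.default_rng(1)

def r2(hapA, hapB):
    """r^2 LD statistic for two biallelic loci, given 0/1 allele vectors
    on the same haplotypes: r^2 = D^2 / (pA(1-pA) pB(1-pB)),
    where D = P(AB) - pA * pB."""
    pA, pB = hapA.mean(), hapB.mean()
    D = np.mean(hapA * hapB) - pA * pB
    return D * D / (pA * (1 - pA) * pB * (1 - pB))

def mc_ld_test(hapA, hapB, n_perm=2000):
    """Monte Carlo p-value for H0: no LD.  Permuting the alleles at one
    locus across haplotypes destroys any association while preserving
    both allele frequencies, giving draws from the null distribution
    of r^2."""
    obs = r2(hapA, hapB)
    null = np.array([r2(rng.permutation(hapA), hapB) for _ in range(n_perm)])
    return obs, (1 + np.sum(null >= obs)) / (1 + n_perm)

# Toy data: two loci in strong LD (alleles agree on ~90% of haplotypes)
hapA = rng.integers(0, 2, 400)
hapB = np.where(rng.random(400) < 0.9, hapA, 1 - hapA)
obs, p = mc_ld_test(hapA, hapB)
```

Because the p-value is computed from the empirical null of r2 itself, the observed r2 can be reported alongside its significance, which is the property the abstract highlights.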
Biff (Bloom Filter) Codes: Fast Error Correction for Large Data Sets
Michael Mitzenmacher, George Varghese
Computer Science , 2012,
Abstract: Large data sets are increasingly common in cloud and virtualized environments. For example, transfers of multiple gigabytes are commonplace, as are replicated blocks of such sizes. There is a need for fast error-correction or data reconciliation in such settings even when the expected number of errors is small. Motivated by such cloud reconciliation problems, we consider error-correction schemes designed for large data, after explaining why previous approaches appear unsuitable. We introduce Biff codes, which are based on Bloom filters and are designed for large data. For Biff codes with a message of length $L$ and $E$ errors, the encoding time is $O(L)$, decoding time is $O(L + E)$ and the space overhead is $O(E)$. Biff codes are low-density parity-check codes; they are similar to Tornado codes, but are designed for errors instead of erasures. Further, Biff codes are designed to be very simple, removing any explicit graph structures and based entirely on hash tables. We derive Biff codes by a simple reduction from a set reconciliation algorithm for a recently developed data structure, invertible Bloom lookup tables. While the underlying theory is extremely simple, what makes this code especially attractive is the ease with which it can be implemented and the speed of decoding. We present results from a prototype implementation that decodes messages of 1 million words with thousands of errors in well under a second.
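The invertible Bloom lookup table (IBLT) primitive that Biff codes build on can be illustrated with a toy set-reconciliation sketch. This is not the Biff code itself; the cell layout, hash choices, and names below are this sketch's own assumptions.

```python
import hashlib

def _idx(x, m, k=3):
    """k cell indices for key x, carved out of a SHA-256 digest."""
    h = hashlib.sha256(str(x).encode()).digest()
    return [int.from_bytes(h[4 * i:4 * i + 4], "big") % m for i in range(k)]

def _chk(x):
    """Per-key checksum used to verify that a cell really holds one key."""
    return int.from_bytes(hashlib.sha256(b"chk" + str(x).encode()).digest()[:4], "big")

class IBLT:
    """Toy invertible Bloom lookup table over integer keys.  Each cell keeps
    a signed count, an XOR of the keys hashed into it, and an XOR of their
    checksums.  Subtracting two tables and repeatedly 'peeling' pure cells
    (|count| == 1 with a matching checksum) lists the symmetric difference
    of the two key sets -- the reconciliation mechanism Biff codes reduce to."""
    def __init__(self, m):
        self.m = m
        self.count = [0] * m
        self.keysum = [0] * m
        self.chksum = [0] * m

    def insert(self, x, sign=1):
        for i in _idx(x, self.m):
            self.count[i] += sign
            self.keysum[i] ^= x
            self.chksum[i] ^= _chk(x)

    def subtract(self, other):
        out = IBLT(self.m)
        out.count = [p - q for p, q in zip(self.count, other.count)]
        out.keysum = [p ^ q for p, q in zip(self.keysum, other.keysum)]
        out.chksum = [p ^ q for p, q in zip(self.chksum, other.chksum)]
        return out

    def peel(self):
        """Recover (keys only in self, keys only in other) from a difference table."""
        ours, theirs = set(), set()
        progress = True
        while progress:
            progress = False
            for i in range(self.m):
                if abs(self.count[i]) == 1 and self.chksum[i] == _chk(self.keysum[i]):
                    x, s = self.keysum[i], self.count[i]
                    (ours if s == 1 else theirs).add(x)
                    self.insert(x, -s)   # strip the key from all of its cells
                    progress = True
        return ours, theirs

# Two mostly-overlapping sets; recover the symmetric difference.
a, b = IBLT(200), IBLT(200)
for x in range(100, 120):
    a.insert(x)
for x in range(105, 125):
    b.insert(x)
only_a, only_b = a.subtract(b).peel()
```

As in the abstract, decoding is just hash-table peeling with no explicit graph structure, and the table size scales with the number of differences rather than the data size; peeling succeeds with high probability when the cell count comfortably exceeds the difference size.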
General Concepts of Capacity Based Design
Sujin S. George, Valsson Varghese
Golden Research Thoughts , 2012, DOI: 10.9780/22315063
Abstract: An earthquake-resistant building is one that has been deliberately designed so that the structure remains safe and suffers no appreciable damage during a destructive earthquake. However, during past earthquakes many buildings collapsed due to the failure of vertical members. It is therefore necessary to make vertical members strong enough to sustain the design earthquake without catastrophic failure. Capacity design aims at making vertical members stronger than the horizontal structural elements. A structure designed with the capacity design concept does not develop unsuitable failure mechanisms or modes of inelastic deformation that would cause failure of the structure.
A Model Study on Accelerated Consolidation of Coir Reinforced Lateritic Lithomarge Soil Blends with Vertical Sand Drains for Pavement Foundations  [PDF]
George Varghese, Hegde Ramakrishna, A.G. Nirmal Kumar, L. Durga Prashanth, G. Santosh
Open Journal of Soil Science (OJSS) , 2012, DOI: 10.4236/ojss.2012.23038
Abstract: Sub-grade soils of lateritic origin encountered in the construction of highway embankments in various regions of India often comprise intrusions of soft lithomargic soils, which result in large settlements during construction and differential settlements at later stages. This necessitates the use of appropriate soil improvement techniques to improve the load-carrying capacity of pavements. This work deals with the accelerated consolidation of un-reinforced and coir-reinforced lateritic lithomargic soil blends provided with three vertical sand drains. The load-settlement characteristics were studied for preloads ranging from 50 kg (0.0013 N/mm²) to 500 kg (0.013 N/mm²) on soil specimens prepared in circular ferro-cement moulds. It was observed that at lower preloads of up to 200 kg, across the blends, the relative increase in consolidation (Rct) for randomly reinforced soil with vertical drains was significantly higher than that of un-reinforced soil without vertical drains, with an average value of 124.8%. The Rct for un-reinforced soil with vertical drains was also considerably higher than that of un-reinforced soil without vertical drains, with an average value of 103.9%. At higher preloads, the Rct values for randomly reinforced soil with vertical drains were moderate, with an average value of 30.88%, while that for un-reinforced soil with vertical drains was about 20.4%. The aspect ratio of the coir fibers used was 1:275.
Colchicine in acute gouty arthritis: the optimum dose?
George I Varughese, Abraham I Varghese
Arthritis Research & Therapy , 2006, DOI: 10.1186/ar2039
Abstract: This is very similar to the dosage of colchicine suggested a decade ago [2], and indeed comparable to the regimen, also expressed in grains, in Hollander's Textbook of Rheumatology in 1960 [3]. Despite the fact that there is perhaps only one double-blind placebo-controlled study of colchicine in acute gout, in which gastrointestinal side effects occurred before the relief of pain [1], and although the optimal dose of colchicine remains elusive, there has been no significant change to the recommended dosage in acute gout nearly half a century later [1]. From a practical perspective, the suggestion to administer colchicine at frequent intervals until the development of gastrointestinal side effects is a matter of significant concern in routine clinical practice [4].

A recent systematic review has shown that there is a lack of robust data to inform the debate on the management of a problem as common as gout and, interestingly, that all of the drugs used to treat gout can have serious side effects [5]. Indeed, Morris and colleagues [6] suggested an effective yet less toxic alternative colchicine regimen for acute gout; such anecdotal published case reports should not be underestimated or dismissed too quickly, as they remain a valid and efficient source of signal generation and are of great value for drug safety. Both authors have encountered patients who were prescribed colchicine at frequent intervals as per current recommendations for acute gout, resulting in serious gastrointestinal side effects and renal impairment.
A new measure of population structure using multiple single nucleotide polymorphisms and its relationship with FST
Hongyan Xu, Bayazid Sarkar, Varghese George
BMC Research Notes , 2009, DOI: 10.1186/1756-0500-2-21
Abstract: In this study we propose a C parameter adjusted for the sample size from each subpopulation. The new measure C is based on the c parameter proposed for SNP data, which was assumed to be subpopulation-specific and common across all loci. We performed extensive simulations of samples with varying levels of population structure to investigate the properties and relationships of both measures, and found that the two measures generally agree well. The new measure simultaneously uses marker information across the genome. It has the advantage of easy interpretation as a single measure of population structure, yet it can also assess population differentiation.

Large-scale genome-wide association studies are promising for unraveling the genetic basis of complex diseases in humans, and many such studies are currently being carried out. However, the size of the data raises several issues and challenges in analysis and interpretation. One potential problem is hidden population structure in the samples. It can cause spurious associations when cases and controls differ in ancestry and is thus a confounding factor. However, the effects of population structure in real large-scale association studies are controversial, so a systematic study is needed to quantify the levels of population structure and its effects on genetic association studies.

The first step in quantifying the effects of population structure is to choose an appropriate measure of population structure for human data. The commonly used measure is Wright's FST [1]. For a set of subpopulations, it is generally assumed to take a single value, although the estimates can differ across loci. This can be a problem if population structure is adjusted for with local estimates in genome-wide association studies, because doing so could mask real associations and lead to loss of power. With the availability of genomic data, we would like a measure that utilizes information across markers.
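The classical locus-by-locus FST that the abstract contrasts with C can be sketched with a simple variance-based estimator on hypothetical allele frequencies (the paper's C measure itself, which adjusts for subpopulation sample sizes, is not reproduced here):

```python
import numpy as np

def fst_per_locus(freqs):
    """Simple Wright's F_ST estimator per SNP from subpopulation allele
    frequencies: F_ST = Var(p_i) / (p_bar * (1 - p_bar)), where p_bar is
    the mean reference-allele frequency across subpopulations.  freqs is
    an (n_subpops, n_loci) array; returns one F_ST value per locus,
    illustrating how the estimates can differ across loci."""
    p = np.asarray(freqs, dtype=float)
    pbar = p.mean(axis=0)                  # mean frequency per locus
    var = ((p - pbar) ** 2).mean(axis=0)   # between-subpopulation variance
    return var / (pbar * (1 - pbar))

# Hypothetical frequencies: 3 subpopulations x 3 SNPs
freqs = np.array([[0.10, 0.50, 0.30],
                  [0.20, 0.55, 0.70],
                  [0.15, 0.45, 0.50]])
fst = fst_per_locus(freqs)
print(fst.round(3))
```

The third locus, where the subpopulations differ most, gets a much larger FST than the first two, which is exactly the locus-to-locus variability that motivates a single genome-wide measure such as C.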
General Concepts of Capacity Based Design
Sujin S. George, Valsson Varghese
International Journal of Innovative Technology and Exploring Engineering , 2012,
Abstract: An earthquake-resistant building is one that has been deliberately designed so that the structure remains safe and suffers no appreciable damage during a destructive earthquake. However, during past earthquakes many buildings collapsed due to the failure of vertical members. It is therefore necessary to make vertical members strong enough to sustain the design earthquake without catastrophic failure. Capacity design aims at making vertical members stronger than the horizontal structural elements. A structure designed with the capacity design concept does not develop unsuitable failure mechanisms or modes of inelastic deformation that would cause failure of the structure. In the capacity design of earthquake-resistant structures, the elements of the primary lateral-load-resisting system are chosen suitably and are designed and detailed for energy dissipation under severe inelastic deformation.
Hemiparesis and cerebellar dysfunction complicating mixed malarial infection with falciparum and vivax malaria
George Ige, Varghese L, Mathews P
Indian Journal of Medical Sciences , 2006,
Abstract:
Copyright © 2008-2017 Open Access Library. All rights reserved.