
Search query: "Vijayageetha Ragupathy"; approximately 25 matching results found.
All articles in the list below are freely available.
SIMULTANEOUS SPECTROPHOTOMETRIC DETERMINATION OF DIACEREIN AND ACECLOFENAC IN TABLETS BY CHEMOMETRIC METHODS
Vijayageetha Ragupathy, Shantha Arcot
International Research Journal of Pharmacy, 2013, DOI: 10.7897/2230-8407.04345
Abstract: Simultaneous spectrophotometric determination of diacerein and aceclofenac was performed by partial least-squares (PLS) and principal component regression (PCR) methods, which do not require any a priori graphical treatment of the overlapping spectra of the two drugs in the mixture. Absorbance values were measured at 67 wavelength points (234-300 nm, at 1 nm intervals) within the recorded spectral region of 200-400 nm. The calibration range was 1-5 μg/ml for diacerein and 2-10 μg/ml for aceclofenac, with correlation coefficients of 0.9998 (PLS) and 0.9995 (PCR) for diacerein and 0.9999 (PLS) and 0.9997 (PCR) for aceclofenac. The multivariate methods were validated by analyzing synthetic mixtures of diacerein and aceclofenac. Numerical calculations were performed with the 'Unscrambler 10.1 X' software. The chemometric methods were satisfactorily applied to the simultaneous determination of diacerein and aceclofenac in the pharmaceutical formulation.
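For illustration, a minimal sketch of the PLS and PCR calibration idea using Python and scikit-learn; the spectra and concentrations below are random placeholders, and this is not the authors' Unscrambler workflow.

```python
# Sketch of PLS and PCR calibration for a two-component mixture, assuming a
# matrix X of absorbance spectra (mixtures x wavelengths) and a matrix Y of
# known concentrations (mixtures x 2: diacerein, aceclofenac). Placeholder
# data only; illustrative, not the published method.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
wavelengths = np.arange(234, 301)             # 67 points, 1 nm interval
X_train = rng.random((15, wavelengths.size))  # placeholder calibration spectra
Y_train = rng.random((15, 2))                 # placeholder concentrations (ug/ml)

pls = PLSRegression(n_components=2).fit(X_train, Y_train)
pcr = make_pipeline(PCA(n_components=2), LinearRegression()).fit(X_train, Y_train)

print(pls.predict(X_train[:1]))  # predicted [diacerein, aceclofenac]
print(pcr.predict(X_train[:1]))
```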
Experimental study of cooling tower performance using ceramic tile packing
Ramkumar Ramkrishnan, Ragupathy Arumugam
Processing and Application of Ceramics, 2013, DOI: 10.2298/pac1301021r
Abstract: Deterioration of the packing material is a major problem in cooling towers. In this experimental study, ceramic tiles were used as the packing material. The packing is made of long-life burnt clay, normally used as a roofing material, and avoids the common tower problems arising from corrosion and poor water quality. We investigate three different types of ceramic packing and evaluate their heat and mass transfer coefficients, and a simple comparison of packing behaviour is made across the three packing types. The experiments were conducted in a forced-draft cooling tower, and the variations in the main variables affecting tower efficiency are described.
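The abstract does not give the working equations; a common textbook route to the tower characteristic (KaV/L) is Merkel's equation evaluated with the four-point Chebyshev rule, sketched below in Python. The saturated-air enthalpy function is a placeholder the reader must supply, and this is a generic method, not necessarily the procedure used in the paper.

```python
# Sketch: tower characteristic KaV/L from Merkel's equation via the
# four-point Chebyshev rule (a standard textbook method, not necessarily the
# paper's procedure). h_sat is a placeholder callable returning the enthalpy
# of saturated air (kJ/kg dry air) at a given water temperature (deg C).
def merkel_kavl(t_hot, t_cold, t_wb, lg_ratio, h_sat, cpw=4.186):
    h_air_in = h_sat(t_wb)                      # enthalpy of entering air at wet-bulb temp
    total = 0.0
    for frac in (0.1, 0.4, 0.6, 0.9):           # Chebyshev evaluation points
        t_w = t_cold + frac * (t_hot - t_cold)  # water temperature at this point
        h_air = h_air_in + lg_ratio * cpw * (t_w - t_cold)  # air enthalpy from energy balance
        total += 1.0 / (h_sat(t_w) - h_air)     # inverse enthalpy driving force
    return cpw * (t_hot - t_cold) / 4.0 * total
```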
Application of Response Surface Methodology (RSM) for Optimization of Operating Parameters and Performance Evaluation of Cooling Tower Cold Water Temperature
Ramkumar RAMAKRISHNAN, Ragupathy ARUMUGAM
International Journal of Optimization and Control: Theories & Applications, 2012,
Abstract: The performance of a cooling tower was analyzed under various operating parameters to find the minimum cold water temperature. In this study, optimization of the operating parameters was investigated. An experimental design was carried out based on a central composite design (CCD) with response surface methodology (RSM). This paper presents the optimum operating parameters and the minimum cold water temperature obtained using the RSM method. The RSM was used to evaluate the effects of the operating variables and their interactions towards attaining the optimum conditions. Based on the analysis, air flow, hot water temperature and packing height had a highly significant effect on cold water temperature. The optimum operating parameters were predicted using the RSM method and confirmed through experiment.
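As an illustration of the RSM step, the sketch below fits a second-order response surface to CCD data and searches for the factor settings that minimize the predicted cold water temperature. The file names, factor count and axial distance (1.682 for a three-factor rotatable design) are assumptions, not taken from the paper.

```python
# Sketch: fit a quadratic (second-order) response surface to CCD data and
# minimize the predicted response. File names and design details below are
# hypothetical placeholders, not the paper's dataset.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from scipy.optimize import minimize

X = np.loadtxt("ccd_design.csv", delimiter=",", skiprows=1)       # coded factors (hypothetical file)
y = np.loadtxt("cold_water_temp.csv", delimiter=",", skiprows=1)  # measured response (hypothetical file)

poly = PolynomialFeatures(degree=2)        # intercept + linear + interaction + quadratic terms
model = LinearRegression().fit(poly.fit_transform(X), y)

# Minimize the fitted surface inside the coded region [-1.682, +1.682]
objective = lambda x: float(model.predict(poly.transform(x.reshape(1, -1)))[0])
best = minimize(objective, x0=np.zeros(X.shape[1]),
                bounds=[(-1.682, 1.682)] * X.shape[1])
print("optimum (coded units):", best.x, "predicted T_cold:", best.fun)
```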
Ethnobotany genomics - discovery and innovation in a new era of exploratory research
Steven G Newmaster, Subramanyam Ragupathy
Journal of Ethnobiology and Ethnomedicine, 2010, DOI: 10.1186/1746-4269-6-2
Abstract: Ethnobotany genomics is a novel approach that is poised to lead botanical discoveries and innovations in a new era of exploratory research. The approach is founded on the 'assemblage' of biodiversity knowledge: a coming together of different ways of knowing and valorizing species variation, seeking to add value to both traditional knowledge (TK) and scientific knowledge (SK). Ethnobotany genomics draws on an ancient body of knowledge concerning variation in the biological diversity that surrounds different cultures; combined with modern genomic tools such as DNA barcoding, it also explores the natural genetic variation found among organisms. This genomic variation is explored along the gradient of variation that any organism inhabits. We present here the first introduction to ethnobotany genomics, including some background and several case studies from our lab, which define an approach to this new discipline that may evolve quickly with new ideas and technology. The motivation for this new approach is a quest to understand how the diversity of life that surrounds us can serve society at large with nutrition, medicine and more. Ethnobotany implicitly embodies the concept of interdisciplinary research. The term "ethnobotany" is derived from ethnology (the study of culture) and botany (the study of plants); it is the scientific study of the relationships that exist between people and plants. Historically, ethnobotanists documented, described and explained the complex relationships between cultures and their use of plants. This often included how plants are used, managed and perceived across human societies as foods, medicines, cosmetics, dyes, textiles, building materials, tools and clothing, or within cultural divination, rituals and religion. Much of this research assumes that TK can be imposed upon an SK classification of living things. We suggest that this is a biased approach and call for a more unified approach.
Valorizing the 'Irulas' traditional knowledge of medicinal plants in the Kodiakkarai Reserve Forest, India
Subramanyam Ragupathy, Steven G Newmaster
Journal of Ethnobiology and Ethnomedicine, 2009, DOI: 10.1186/1746-4269-5-10
Abstract: A mounting body of critical research is raising the credibility of Traditional Knowledge (TK) in scientific studies and natural resource management. The lack of recognition of the place and value of TK in science has prevented real engagement of this knowledge in scientific endeavours, including nutrition, medicine, environmental assessment and resource management practices [1]. One explanation is the lack of validation using quantitative analyses that give some measure of confidence and methods for replication. Clearly there is some asymmetry between scientific knowledge and TK, which requires new approaches to the relationship between the two. We must consider some criteria for negotiating and legitimating the validity of knowledge. Science is based on the evolution of knowledge that is testable and ultimately generalisable, mobile and globally meaningful [2]. To gain credibility, scientific studies that utilize TK must be reliable (refutable) and designed so that they can be replicated. The need for full disclosure in science demands that quantitative measures of reliability be established. In ethnobotanical studies, this was established by Trotter and Logan [3], who developed a quantitative method to evaluate consensus among informants in order to identify potentially effective medicinal plants. Consensus analysis provides a measure of reliability, and thereby refutable evidence, for any given claim. This evolved into the "Factor of informant Consensus" (FIC), which quantitatively evaluates the degree to which certain plants are selected for a particular utility (e.g., healing an ailment). One of the traditional intentions of FIC is to test the homogeneity of informants' knowledge [3-6]. Researchers use consensus analysis to test falsifiable hypotheses concerning informant selection and use of plants [5,7], as a decision-making factor [4,8,9], to weight the relative importance of TK [10], and to estimate the competence of informants [11-13]. These studies have gained …
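The consensus measure referred to above has a simple standard form, FIC = (Nur - Nt) / (Nur - 1), where Nur is the number of use reports in an ailment category and Nt the number of taxa cited for it. A minimal Python sketch:

```python
# Factor of informant Consensus (FIC) as commonly defined in the
# ethnobotanical literature: FIC = (Nur - Nt) / (Nur - 1). Values close to 1
# indicate strong agreement among informants on which taxa treat an ailment.
def fic(use_reports: int, taxa_used: int) -> float:
    if use_reports <= 1:
        raise ValueError("FIC is undefined for a single use report")
    return (use_reports - taxa_used) / (use_reports - 1)

# Illustrative numbers: 40 use reports spread over 5 taxa -> strong consensus
print(fic(40, 5))  # 0.897...
```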
Physical mapping and BAC-end sequence analysis provide initial insights into the flax (Linum usitatissimum L.) genome
Raja Ragupathy, Rajkumar Rathinavelu, Sylvie Cloutier
BMC Genomics, 2011, DOI: 10.1186/1471-2164-12-217
Abstract: The physical map consists of 416 contigs spanning ~368 Mb, assembled from 32,025 fingerprints and representing roughly 54.5% to 99.4% of the estimated haploid genome (370-675 Mb). The N50 size of the contigs was estimated to be ~1,494 kb. The longest contig was ~5,562 kb, comprising 437 clones, and 96 contigs contained more than 100 clones. Approximately 54.6 Mb, representing 8-14.8% of the genome, was obtained from 80,337 BAC-end sequences (BES). Annotation revealed that a large part of the genome consists of ribosomal DNA (~13.8%), followed by known transposable elements at 6.1%. Furthermore, ~7.4% of the sequence was identified as harbouring novel repeat elements. Homology searches against flax ESTs and NCBI ESTs suggested that ~5.6% of the transcriptome is unique to flax. A total of 4,064 putative genomic SSRs were identified and are being developed as novel markers for use in molecular breeding. The first genome-wide physical map of flax constructed with BAC clones provides a framework for accessing target loci of economic importance for marker development and positional cloning. Analysis of the BES has provided insights into the uniqueness of the flax genome: compared to other plant genomes, the proportion of rDNA was found to be very high whereas the proportion of known transposable elements was low. The SSRs identified from the BES will be valuable for saturating existing linkage maps and for anchoring the physical and genetic maps. The physical map and paired-end reads from BAC clones will also serve as scaffolds to build and validate the whole-genome shotgun assembly. Flax (Linum usitatissimum L.) was domesticated for its seed oil and stem fibres nearly 7,000 years ago, during the Neolithic period [1]. However, recently discovered 30,000-year-old flax fibres from the Upper Paleolithic period suggest that flax was used by humans prior to its domestication [2]. Today, flax is grown as an oilseed (linseed) crop or a fibre crop. Linseed oil, rich in the omega-3 fatty acid (alpha-linolenic acid) …
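For reference, the N50 statistic quoted above can be computed from the contig lengths alone; a short Python sketch with illustrative lengths:

```python
# Sketch: N50 of an assembly -- the contig length L such that contigs of
# length >= L together contain at least half of the total assembled bases.
def n50(lengths):
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length

# Illustrative contig lengths in kb (not the flax assembly itself)
print(n50([5562, 1494, 900, 400, 120]))
```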
A Novel Method of Image Compression Using Multiwavelets and Set Partitioning Algorithm
U. S. Ragupathy, A. Tamilarasi
Modern Applied Science, 2009, DOI: 10.5539/mas.v3n2p134
Abstract: Advances in wavelet transforms and quantization methods have produced algorithms capable of surpassing existing image compression standards such as the Joint Photographic Experts Group (JPEG) algorithm. The existing compression methods for the JPEG standards use the DCT with arithmetic coding and the DWT with Huffman coding. The DCT uses a single kernel, whereas wavelets offer a larger number of filters depending on the application. The wavelet-based Set Partitioning In Hierarchical Trees (SPIHT) algorithm gives better compression. For best performance in image compression, wavelet transforms require filters that combine a number of desirable properties, such as orthogonality and symmetry, but a single wavelet cannot simultaneously possess all of these properties. The relatively new field of multiwavelets offers more design options and can combine all desirable transform features. However, there are some limitations in applying the SPIHT algorithm to multiwavelet coefficients. This paper presents a new method for encoding multiwavelet-decomposed images by defining coefficients suitable for the SPIHT algorithm, which gives better compression performance than the existing methods in many cases.
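As background to the encoding step, the sketch below shows a bare-bones wavelet transform-coding loop in Python with PyWavelets: decompose, discard small coefficients, reconstruct. It uses a scalar wavelet rather than multiwavelets and is not SPIHT; it only illustrates the decompose/quantize/reconstruct pipeline the paper builds on.

```python
# Minimal wavelet transform-coding sketch (PyWavelets). NOT SPIHT and not a
# multiwavelet transform -- just the generic decompose / threshold /
# reconstruct pipeline, with a random placeholder image.
import numpy as np
import pywt

image = np.random.rand(256, 256)               # placeholder image
coeffs = pywt.wavedec2(image, "db4", level=3)  # 3-level 2-D DWT
arr, slices = pywt.coeffs_to_array(coeffs)

keep = 0.05                                    # retain the largest 5% of coefficients
threshold = np.quantile(np.abs(arr), 1 - keep)
arr[np.abs(arr) < threshold] = 0.0             # crude "compression" by thresholding

rec = pywt.waverec2(pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), "db4")
print("RMSE:", np.sqrt(np.mean((image - rec[:256, :256]) ** 2)))
```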
Experimental Determination of Heat Transfer Coefficient in Stirred Vessel for Coal-Water Slurry Based on the Taguchi Method
C. M. Raguraman, A. Ragupathy, L. Sivakumar
Journal of Engineering, 2013, DOI: 10.1155/2013/719296
Abstract: Heat transfer in stirred vessels is important because the process fluid temperature in the vessel is one of the most significant factors controlling the outcome of the process. In this study, the effects of some important design parameters for coal-water slurry in the agitated vessel used in coal gasification, such as stirrer speed, stirrer location, D/d ratio, and coal-water ratio, were investigated and optimized using the Taguchi method. The experiments were planned based on Taguchi's orthogonal array, with each trial performed under different levels of the design parameters. Signal-to-noise (S/N) analysis and analysis of variance (ANOVA) were carried out in order to determine the effects of the process parameters and the optimal factor-level settings. Finally, confirmation tests verified that the Taguchi method achieved optimization of the heat transfer coefficient in the agitated vessel. 1. Introduction Research on the heat transfer coefficient in agitated vessels is still critical and ongoing. Heat transfer in stirred vessels is important because the process fluid temperature in the vessel is one of the most significant factors controlling the outcome of the process. Mechanically agitated vessels are widely used in the mining, food, petroleum, chemical, pharmaceutical, pulp, and paper industries and are also used in coal gasification power plants [1]. The intensity of heat transfer during mixing of fluids such as coal slurry depends on the type of stirrer, the design of the vessel, and the conditions of the process [2]. In this study the effects of some important parameters such as stirrer speed, stirrer location, D/d ratio, and coal-water ratio were investigated and optimized. Performing an experiment is the most suitable way to determine the real performance characteristics of a system. However, preparing an experimental setup is very expensive, and some systems cannot be constructed and tested in a laboratory. Preparing an experimental setup is also very time-consuming because of the high number of trials required. Because of these difficulties, modelling and then testing the system using numerical analysis or an ANN (artificial neural network), or reducing the number of trials according to the Taguchi method, is more appropriate and very popular nowadays [3]. Heat transfer rates in an agitated vessel have been investigated for coal-water slurry in a flat-bottom vessel equipped with a flat-blade impeller making an angle of 45 degrees to the axis of the shaft. The results were also compared with the heat transfer coefficients of a flat-blade impeller parallel to the axis of the shaft [4]. 2. Experimental
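To make the S/N step concrete, the sketch below computes the larger-the-better signal-to-noise ratio, S/N = -10 log10(mean(1/y^2)), for replicated heat transfer coefficient measurements; the numbers are placeholders, not the paper's data.

```python
# Sketch: Taguchi larger-the-better S/N ratio, S/N = -10*log10(mean(1/y^2)),
# applied to replicated heat transfer coefficient measurements per trial of
# the orthogonal array. Placeholder numbers only.
import numpy as np

def sn_larger_is_better(y):
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

trial_replicates = [
    [950.0, 970.0, 940.0],     # trial 1: h in W/(m^2 K), illustrative
    [1210.0, 1185.0, 1230.0],  # trial 2
]
for i, reps in enumerate(trial_replicates, start=1):
    print(f"trial {i}: S/N = {sn_larger_is_better(reps):.2f} dB")
```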
Detection of Lung Nodule Using Multiscale Wavelets and Support Vector Machine
K. P. Aarthy, U. S. Ragupathy
International Journal of Soft Computing & Engineering, 2012,
Abstract: Lung cancer is among the most common cancers and the leading cause of cancer death in both men and women. Lung nodules, abnormalities that can lead to lung cancer, are detected by various medical imaging techniques such as X-ray and computerized tomography (CT). Detection of lung nodules is a challenging task, since the nodules are commonly attached to blood vessels. Many studies have shown that early diagnosis is the most effective way to cure this disease. This paper aims to develop an efficient lung nodule detection scheme by performing nodule segmentation through multiscale wavelet-based edge detection and morphological operations, followed by classification using a machine learning technique, the Support Vector Machine (SVM). The methodology uses three kernel types: linear, Radial Basis Function (RBF) and polynomial. Among these, the RBF kernel gives the best classification performance, with a sensitivity of 92.86% and an error rate of 0.0714.
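For illustration, a minimal RBF-kernel SVM classification sketch in Python with scikit-learn, reporting sensitivity and error rate from the confusion matrix; the feature extraction stage (wavelet edge detection, morphology) is not shown and the data are random placeholders.

```python
# Sketch: RBF-kernel SVM classification of nodule candidates with sensitivity
# and error rate from the confusion matrix. X and y are random placeholders,
# not real CT-derived features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(1)
X = rng.random((200, 10))              # placeholder candidate feature vectors
y = rng.integers(0, 2, size=200)       # 1 = nodule, 0 = non-nodule

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
print("sensitivity:", tp / (tp + fn))
print("error rate:", (fp + fn) / (tn + fp + fn + tp))
```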
Effective Extraction of Heavy Metals from their Effluents Using Some Potential Ionic Liquids as Green Chemicals
A. Rajendran, D. Ragupathy, M. Priyadarshini, A. Magesh
Journal of Chemistry, 2011, DOI: 10.1155/2011/202380
Abstract: