Search Results: 1 - 10 of 52 matches for " Vishwas Satgar "
All listed articles are free for downloading (OA Articles)
Beyond Marikana: The Post-Apartheid South African State
Vishwas Satgar
Africa Spectrum, 2012,
Abstract: This article situates the Marikana massacre, in which 34 mine workers were gunned down by police in South Africa, in the context of what the South African state has become, and questions the characterisation of the post-Apartheid state as a “developmental state”. This contribution first highlights what is at stake when the post-Apartheid state is portrayed as a “developmental state” and how this misrecognition of the state is ideologically constituted. Second, it argues for an approach to understanding the post-Apartheid state that locates it within the context of the rise of transnational neoliberalism and the process of indigenising neoliberalism on the African continent. Third, it examines the actual economic practices of the state that constitute it as an Afro-neoliberal state. These economic practices are historicised to show the convergence between the post-Apartheid state and the ideal-type neoliberal state coming to the fore in the context of global neoliberal restructuring and crisis management. The article concludes by recognising that South Africa’s deep globalisation and globalised state affirm a form of state practice, beyond utilising market mechanisms, that includes perpetrating violence to secure its existence. Marikana makes this point.
Prosthetics rehabilitation of a male patient with a unique looped metal palate denture: A case report
Bhatia Vishwas, Bhatia Garima
Stomatološki Glasnik Srbije, 2012, DOI: 10.2298/sgs1204205b
Abstract: A complete denture can improve both function and aesthetics. Even though mastication is greatly improved, among the most common problems for new wearers of full upper acrylic dentures are the inability to feel sensations such as hot and cold, loss of taste, and fracture in the mid-palatal region. These patients require a denture that allows them to feel sensations as close to normal as possible. The present case report discusses an alternative way of designing a metal palate for a maxillary complete denture that, along with fulfilling the above-mentioned functions, incorporates specially designed loops oriented so as to improve the mechanical interlocking of acrylic within the metal loops without interfering with tooth arrangement.
The data paper: a mechanism to incentivize data publishing in biodiversity science
Chavan Vishwas, Penev Lyubomir
BMC Bioinformatics, 2011, DOI: 10.1186/1471-2105-12-s15-s2
Abstract: Background Free and open access to primary biodiversity data is essential for informed decision-making to achieve conservation of biodiversity and sustainable development. However, primary biodiversity data are neither easily accessible nor discoverable. Among several impediments, one is the lack of incentives for data publishers to publish their data resources. One such missing mechanism is recognition through conventional scholarly publication of enriched metadata, which should ensure rapid discovery of 'fit-for-use' biodiversity data resources. Discussion We review the state of the art of data discovery options and the mechanisms in place for incentivizing data publishers' efforts towards easy, efficient and enhanced publishing, dissemination, sharing and re-use of biodiversity data. We propose the establishment of the 'biodiversity data paper' as one possible mechanism to offer scholarly recognition for the efforts and investment by data publishers in authoring rich metadata and publishing them as citable academic papers. While detailing the benefits to data publishers, we describe the objectives, workflow and outcomes of the pilot project commissioned by the Global Biodiversity Information Facility in collaboration with scholarly publishers and pioneered by Pensoft Publishers through its journals ZooKeys, PhytoKeys, MycoKeys, BioRisk, NeoBiota, Nature Conservation and the forthcoming Biodiversity Data Journal. We then discuss further enhancements of the data paper beyond the pilot project and attempt to forecast the future uptake of data papers as an incentivization mechanism by the stakeholder communities. Conclusions We believe that in addition to providing recognition for those involved in the data publishing enterprise, data papers will also expedite the publishing of fit-for-use biodiversity data resources.
However, uptake and establishment of the data paper as a potential mechanism of scholarly recognition requires a high degree of commitment and investment by the cross-sectional stakeholder communities.
Indicators for the Data Usage Index (DUI): an incentive for publishing primary biodiversity data through global information infrastructure
Ingwersen Peter, Chavan Vishwas
BMC Bioinformatics, 2011, DOI: 10.1186/1471-2105-12-s15-s3
Abstract: Background A professional recognition mechanism is required to encourage the expedited publishing of an adequate volume of 'fit-for-use' biodiversity data. As a component of such a recognition mechanism, we propose the development of a Data Usage Index (DUI) to demonstrate to data publishers that their efforts in creating biodiversity datasets have impact, by being accessed and used by a wide spectrum of user communities. Discussion We propose, and give examples of, a range of 14 absolute and normalized biodiversity dataset usage indicators for the development of a DUI based on search events and dataset download instances. The DUI is proposed to include relative as well as species-profile-weighted comparative indicators. Conclusions We believe that in addition to providing recognition to the data publisher and all players involved in the data life cycle, a DUI will also give much-needed, novel insight into how users use primary biodiversity data. A DUI consisting of a range of usage indicators obtained from the GBIF network and other relevant access points is within reach, and the usage of biodiversity datasets leads to a family of indicators in line with well-known citation-based measures of recognition.
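To make the idea of a normalized usage indicator concrete, the following is a minimal sketch of one hypothetical DUI-style metric: download events per 1,000 records served, which lets small and large datasets be compared on usage. The function name, field names and the 1,000-record normalisation are illustrative assumptions, not the paper's actual indicator definitions.

```python
def downloads_per_thousand_records(download_events, record_count):
    """Normalize raw download counts by dataset size (hypothetical indicator)."""
    if record_count <= 0:
        raise ValueError("record_count must be positive")
    return 1000.0 * download_events / record_count

# Two invented datasets: absolute downloads alone would favour the large one,
# but the normalized indicator shows the small dataset is used more intensively.
datasets = {
    "herbarium-A": {"downloads": 420, "records": 150_000},
    "bird-obs-B":  {"downloads": 95,  "records": 2_000},
}

for name, d in datasets.items():
    dui = downloads_per_thousand_records(d["downloads"], d["records"])
    print(f"{name}: {dui:.1f} downloads per 1,000 records")
```

This illustrates why the paper distinguishes absolute from normalized indicators: normalization removes the size advantage of large datasets when comparing usage.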
Real-time System Identification of Unmanned Aerial Vehicles: A Multi-Network Approach
Vishwas Puttige, Sreenatha Anavatti
Journal of Computers, 2008, DOI: 10.4304/jcp.3.7.31-38
Abstract: In this paper, real-time system identification of an unmanned aerial vehicle (UAV) based on multiple neural networks is presented. The UAV is a multi-input multi-output (MIMO) nonlinear system. Models for such a MIMO system are expected to be adaptive to dynamic behaviour and robust to environmental variations. This task of accurate modelling is achieved with a multi-network architecture. The multi-network with a dynamic selection technique allows a combination of online and offline neural network models to be used in the architecture, where the most suitable outputs are selected based on a given criterion. The neural network models are based on the autoregressive technique. The online network uses a novel training scheme with memory retention. Flight-test validation results for the online and offline models are presented. The multi-network dynamic selection technique has been validated on real-time hardware-in-the-loop (HIL) simulation, and the results show superior performance compared to the individual models.
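The dynamic-selection idea described above can be sketched as follows: several candidate models predict the output, and at each step the prediction from the model with the smallest accumulated recent error is selected. The linear "models" below are stand-ins for the paper's online and offline neural networks, and the window length and squared-error criterion are illustrative assumptions, not the paper's actual selection rule.

```python
from collections import deque

class SelectedPredictor:
    """Pick, per step, the candidate model with the lowest recent error."""
    def __init__(self, models, window=5):
        self.models = models                                  # callables: y_hat = m(u)
        self.errors = [deque(maxlen=window) for _ in models]  # sliding error windows

    def predict(self, u):
        """Return the prediction of the currently best model and its index."""
        preds = [m(u) for m in self.models]
        sums = [sum(e) for e in self.errors]
        best = min(range(len(self.models)), key=lambda i: sums[i])
        return preds[best], best

    def update(self, u, y_true):
        """Record each model's squared prediction error for input u."""
        for i, m in enumerate(self.models):
            self.errors[i].append((m(u) - y_true) ** 2)

offline = lambda u: 2.0 * u          # fixed model, "trained offline"
online  = lambda u: 2.1 * u + 0.05   # "online" model (frozen here for brevity)
sel = SelectedPredictor([offline, online])

for u, y in [(1.0, 2.1), (2.0, 4.25), (1.5, 3.2)]:
    sel.update(u, y)
y_hat, chosen = sel.predict(1.0)
print(chosen)  # index of the model with the smaller accumulated error
```

In the real architecture the online network would keep adapting between steps; the point here is only the selection mechanism between competing models.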
A-posteriori error estimates for inverse problems
Vishwas Rao, Adrian Sandu
Mathematics, 2014,
Abstract: Inverse problems use physical measurements along with a computational model to estimate the parameters or state of a system of interest. Errors in measurements and uncertainties in the computational model lead to inaccurate estimates. This work develops a methodology to estimate the impact of different errors on the variational solutions of inverse problems. The focus is on time evolving systems described by ordinary differential equations, and on a particular class of inverse problems, namely, data assimilation. The computational algorithm uses first-order and second-order adjoint models. In a deterministic setting the methodology provides a posteriori error estimates for the inverse solution. In a probabilistic setting it provides an a posteriori quantification of uncertainty in the inverse solution, given the uncertainties in the model and data. Numerical experiments with the shallow water equations in spherical coordinates illustrate the use of the proposed error estimation machinery in both deterministic and probabilistic settings.
A Time-parallel Approach to Strong-constraint Four-dimensional Variational Data Assimilation
Vishwas Rao, Adrian Sandu
Computer Science, 2015,
Abstract: A parallel-in-time algorithm based on an augmented Lagrangian approach is proposed to solve four-dimensional variational (4D-Var) data assimilation problems. The assimilation window is divided into multiple sub-intervals, which allows the cost function and gradient computations to be parallelized. Solution continuity equations across interval boundaries are added as constraints. The augmented Lagrangian approach leads to a different formulation of the variational data assimilation problem than weakly constrained 4D-Var. A combination of serial and parallel 4D-Var to increase performance is also explored. The methodology is illustrated on data assimilation problems with the Lorenz-96 and shallow water models.
Correlation of Electric Cardiometry and Continuous Thermodilution Cardiac Output Monitoring Systems
Vishwas Malik, Arun Subramanian, Sandeep Chauhan, Milind Hote
World Journal of Cardiovascular Surgery (WJCS), 2014, DOI: 10.4236/wjcs.2014.47016

Purpose: The drawbacks of Impedance Cardiography (ICG) in reliably estimating cardiac output (CO) compared with reference methods have led to the development of a novel technique called Electrical Cardiometry (EC). The purpose of this study was to compare EC-CO with the continuous CO (CCO) derived from a Pulmonary Artery Catheter (PAC). Methods: 60 patients scheduled to undergo coronary artery surgery necessitating the placement of a PAC were studied in the operating room. Standard ECG electrodes were used for EC-CO measurements. Simultaneous CO measurements from EC and PAC were taken at three predefined time points and correlated. Results: A significantly high correlation was found between EC-CO and CCO at the three time points. Bland-Altman analysis revealed a bias of 0.08 L/min and a precision of 0.15 L/min, with a narrow limit of agreement (-0.13 to 0.28 L/min). The percentage error between the methods was 3.59%. Conclusion: The agreement between EC-CO and CCO is clinically acceptable, and the two techniques can be used interchangeably. Mediastinal opening has no effect on the correlation between these two modalities.
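The Bland-Altman quantities reported above (bias, limits of agreement) and the percentage error can be sketched as follows, computed here on a small made-up set of paired CO readings. The readings, and the use of the sample standard deviation of the differences, are illustrative assumptions; this is not the study's data or exact statistical procedure.

```python
import statistics

def bland_altman(method_a, method_b):
    """Bias, SD of differences, 95% limits of agreement, and percentage error."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    means = [(a + b) / 2 for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)                  # sample SD of the differences
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)    # 95% limits of agreement
    pct_error = 100 * 1.96 * sd / statistics.mean(means)
    return bias, sd, loa, pct_error

ec_co = [4.1, 5.0, 3.8, 4.6, 5.3]   # illustrative EC-CO readings (L/min)
cco   = [4.0, 5.1, 3.7, 4.5, 5.4]   # illustrative PAC CCO readings (L/min)
bias, sd, loa, pe = bland_altman(ec_co, cco)
print(f"bias={bias:.2f} L/min, LoA=({loa[0]:.2f}, {loa[1]:.2f}), error={pe:.1f}%")
```

A percentage error below the commonly cited 30% threshold is what justifies the "clinically acceptable agreement" conclusion in studies of this kind.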

Implementing SEReleC with EGG
Vishwas J Raval, Padam Kumar
International Journal of Information Technology and Computer Science, 2012,
Abstract: The World Wide Web has immense resources for all kinds of people and their specific needs. Searching the Web using engines such as Google, Bing and Ask has become an extremely common way of locating information. Searches are performed using terms or keywords, entered sequentially or as short sentences. The challenge for the user is to come up with a set of search terms that is neither too narrow (making the search too specific and producing many false negatives) nor too broad (making the search too general and producing many false positives). However the user specifies the query, the results retrieved, organised and presented by the search engines amount to millions of linked pages, many of which may not be useful to the user. In fact, the end user never knows which pages exactly match the query and which do not until each page is checked individually, a tedious task. This is because search results lack refinement and meaningful classification. Providing accurate and precise results to end users has become the Holy Grail for search engines such as Google, Bing and Ask, and a number of systems such as DuckDuckGo, Yippy and Dogpile have appeared on the web to provide better results. This research proposes the development of a meta search engine, called SEReleC, that provides an interface for refining and classifying search engines' results, narrowing them down in a sequentially linked manner and drastically reducing the number of pages.
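A toy sketch of the general meta-search refinement idea: merge ranked result lists from several engines, deduplicate by URL, and rank pages that multiple engines agree on first, so the user sees a narrowed list. The engine names and result lists are invented for illustration, and this agreement-based ranking is a generic technique, not the SEReleC algorithm itself.

```python
from collections import Counter

def refine(result_lists):
    """Merge ranked URL lists; order by cross-engine agreement, then best rank."""
    votes = Counter()        # how many engines returned each URL
    best_rank = {}           # best (lowest) rank each URL achieved anywhere
    for results in result_lists:
        for rank, url in enumerate(results):
            votes[url] += 1
            best_rank[url] = min(best_rank.get(url, rank), rank)
    # More agreement first; ties broken by the best rank any engine gave.
    return sorted(votes, key=lambda u: (-votes[u], best_rank[u]))

google = ["a.com", "b.com", "c.com"]
bing   = ["b.com", "d.com", "a.com"]
ask    = ["e.com", "b.com"]
print(refine([google, bing, ask]))
```

Deduplication alone already shrinks the combined list; the agreement ordering then pushes pages most likely to match the query toward the top, which is the kind of refinement the abstract argues individual engines lack.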
Evolving a New Software Development Life Cycle Model (SDLC) incorporated with Release Management
Vishwas Massey, K. J. Satao
International Journal of Engineering and Advanced Technology, 2012,
Abstract: The Software Development Life Cycle, or System Development Life Cycle, or simply SDLC (the terms 'system' and 'software' are frequently interchanged according to the application scenario), is a highly structured, step-by-step technique employed for the development of software. SDLC allows project leaders to configure and supervise the whole development process. The divide-and-conquer technique is widely used in SDLC models: tasks that are complex in nature are broken down into smaller, manageable components. Developers employ SDLC models for analysing, coding, testing and deploying software systems. Software developed under a suitable SDLC model is capable of addressing the expectations of customers, clients and end users, and performs better in the market than its competitors. SDLC models also help regulate software development time and support effective cost scheduling. In this paper we develop a model intended to ensure that the development and delivery (release) teams engaged in a project have strong coordination and collaboration, leading to enhanced productivity, efficiency, effectiveness and a longer market life. This is achieved by incorporating the concept of release management into the basic SDLC phases.

Copyright © 2008-2017 Open Access Library. All rights reserved.