Search Results: 1 - 10 of 100 matches
All listed articles are free for downloading (OA Articles)
A Hidden Markov Model for Localization Using Low-End GSM Cell Phones  [PDF]
Mohamed Ibrahim,Moustafa Youssef
Computer Science , 2010,
Abstract: Research in location determination for GSM phones has gained interest recently as it enables a wide set of location-based services. RSSI-based techniques have been the preferred method for GSM localization on the handset, as RSSI information is available in all cell phones. Although the GSM standard allows a cell phone to receive signal strength information from up to seven cell towers, many of today's cell phones are low-end phones with limited API support that give information only about the associated cell tower. In addition, in many places in the world the density of cell towers is very low, and therefore the available cell tower information for localization is very limited. This raises the challenge of accurately determining the cell phone location with very limited information, mainly the RSSI of the associated cell tower. In this paper, we propose a Hidden Markov Model-based solution that leverages the signal strength history from only the associated cell tower to achieve accurate GSM localization. We discuss the challenges of implementing our system and present the details of how it addresses them. To evaluate our proposed system, we implemented it on Android-based phones. Results for two different testbeds, representing urban and rural environments, show that our system provides at least 156% enhancement in median error in rural areas and at least 68% enhancement in median error in urban areas compared to current RSSI-based GSM localization systems.
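The core idea of the abstract can be sketched as a plain Viterbi decode over discretized locations, with the single associated tower's RSSI as the only observation. The grid, transition probabilities, and Gaussian emission parameters below are invented for illustration; the paper's actual model, fingerprints, and training procedure differ.

```python
# Minimal sketch: decode a phone's most likely location sequence from the
# RSSI history of one associated cell tower, via Viterbi over an HMM.
# All parameters here are hypothetical; a uniform prior over start states
# is implicit.
import math

states = [0, 1, 2]          # discretized locations along a road
# The phone either stays put or moves to an adjacent grid cell.
trans = {0: {0: 0.6, 1: 0.4},
         1: {0: 0.2, 1: 0.6, 2: 0.2},
         2: {1: 0.4, 2: 0.6}}
# Emission model: Gaussian RSSI (dBm) per location, learned offline.
mean_rssi = {0: -60.0, 1: -75.0, 2: -90.0}
sigma = 5.0

def log_emit(state, rssi):
    # Log of the Gaussian density, up to an additive constant.
    return -((rssi - mean_rssi[state]) ** 2) / (2 * sigma ** 2)

def viterbi(observations):
    # delta[s] = best log-score of any path ending in state s.
    delta = {s: log_emit(s, observations[0]) for s in states}
    paths = {s: [s] for s in states}
    for obs in observations[1:]:
        new_delta, new_paths = {}, {}
        for s in states:
            best_prev = max(
                (p for p in states if s in trans[p]),
                key=lambda p: delta[p] + math.log(trans[p][s]),
            )
            new_delta[s] = (delta[best_prev] + math.log(trans[best_prev][s])
                            + log_emit(s, obs))
            new_paths[s] = paths[best_prev] + [s]
        delta, paths = new_delta, new_paths
    best = max(states, key=lambda s: delta[s])
    return paths[best]

print(viterbi([-61, -74, -88]))  # → [0, 1, 2]
```

Using the history of several readings, rather than a single snapshot, is what lets the model disambiguate locations with similar instantaneous RSSI.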
A hybrid scheme for encoding audio signal using hidden Markov models of waveforms  [PDF]
Stéphane Molla,Bruno Torrésani
Statistics , 2013, DOI: 10.1016/j.acha.2004.11.001
Abstract: This paper reports on recent results related to audiophonic signal encoding using time-scale and time-frequency transforms. More precisely, non-linear, structured approximations for tonal and transient components using local cosine and wavelet bases will be described, yielding expansions of audio signals of the form tonal + transient + residual. We describe a general formulation involving hidden Markov models, together with corresponding rate estimates. Estimators for the transient/tonal balance are also discussed.
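A crude, self-contained illustration of the tonal + transient + residual split: keep the large coefficients of a cosine transform as the "tonal" layer and leave the rest (here, a click) in the residual. The basis choice and threshold below are invented; the paper uses structured local cosine and wavelet expansions governed by hidden Markov models, not a single global DCT.

```python
# Toy tonal/residual separation by hard-thresholding DCT-II coefficients.
import math

N = 32

def dct2(x):
    # Unnormalized DCT-II analysis.
    return [sum(x[n] * math.cos(math.pi * k * (n + 0.5) / N) for n in range(N))
            for k in range(N)]

def idct2(X):
    # Matching synthesis (exact inverse of dct2 with the usual scaling).
    return [X[0] / N + (2 / N) * sum(X[k] * math.cos(math.pi * k * (n + 0.5) / N)
                                     for k in range(1, N))
            for n in range(N)]

tone = [math.cos(math.pi * 3 * (n + 0.5) / N) for n in range(N)]   # tonal part
signal = [tone[n] + (1.0 if n == 10 else 0.0) for n in range(N)]   # + a click

coeffs = dct2(signal)
tonal_coeffs = [c if abs(c) > 4.0 else 0.0 for c in coeffs]  # hard threshold
tonal = idct2(tonal_coeffs)                    # few large coeffs -> tonal layer
residual = [s - t for s, t in zip(signal, tonal)]  # click stays in the residual
```

The pure cosine concentrates in one coefficient and survives the threshold, while the click spreads thinly across all coefficients and falls into the residual, which is the intuition behind coding each layer in its best-matched basis.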
Artificial signal peptide prediction by a hidden Markov model to improve protein secretion via Lactococcus lactis bacteria  [cached]
Jafar Razmara,Safaai B Deris,Rosli Bin Md Illias,Sepideh Parvizpour
Bioinformation , 2013,
Abstract: A hidden Markov model (HMM) has been utilized to predict and generate artificial secretory signal peptide sequences. The strength of signal peptides of proteins from different subcellular locations secreted via Lactococcus lactis bacteria correlated with their HMM bit scores in the model. The results show that an HMM bit score of +12 is the threshold for discriminating secretory signal sequences from the others. The model is used to generate artificial signal peptides with different bit scores for secretory proteins. The signal peptide with the maximum bit score strongly directs protein secretion.
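The thresholding rule in the abstract can be illustrated with a toy log-odds scorer: rate a candidate peptide by a "bit score" against a uniform background and call it secretory when the score clears +12. The per-residue probabilities below are invented placeholders, not the paper's trained HMM.

```python
# Hypothetical bit-score classifier for secretory signal peptides.
import math

# Invented emission probabilities for a hydrophobic signal-peptide core,
# versus a uniform background over the 20 amino acids.
core = {"L": 0.20, "A": 0.15, "V": 0.12, "F": 0.10, "I": 0.08}
BACKGROUND = 1 / 20
DEFAULT = 0.02  # probability for residues absent from the toy model

def bit_score(sequence):
    # Sum of per-residue log2 odds ratios, as in a profile-HMM bit score.
    return sum(math.log2(core.get(aa, DEFAULT) / BACKGROUND) for aa in sequence)

def is_secretory(sequence, threshold=12.0):
    return bit_score(sequence) >= threshold

print(is_secretory("LLLLLLL"), is_secretory("GG"))  # → True False
```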
Hidden Semi Markov Models for Multiple Observation Sequences: The mhsmm Package for R  [PDF]
Jared O'Connell,Søren Højsgaard
Journal of Statistical Software , 2011,
Abstract: This paper describes the R package mhsmm which implements estimation and prediction methods for hidden Markov and semi-Markov models for multiple observation sequences. Such techniques are of interest when observed data is thought to be dependent on some unobserved (or hidden) state. Hidden Markov models only allow a geometrically distributed sojourn time in a given state, while hidden semi-Markov models extend this by allowing an arbitrary sojourn distribution. We demonstrate the software with simulation examples and an application involving the modelling of the ovarian cycle of dairy cows.
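The distinction the abstract draws can be shown in a few lines: an HMM's self-transition probability implies a geometric sojourn time in each state, while a hidden semi-Markov model draws the sojourn length from an arbitrary distribution. The numbers below are illustrative only and unrelated to the mhsmm package's API.

```python
# Geometric (HMM) versus explicit (HSMM) sojourn-time sampling.
import random

random.seed(0)

def geometric_sojourn(p_stay):
    # HMM: leave the current state with probability (1 - p_stay) each step,
    # so the dwell time is geometrically distributed.
    t = 1
    while random.random() < p_stay:
        t += 1
    return t

def explicit_sojourn(durations, weights):
    # HSMM: the dwell time is drawn directly from any distribution the
    # modeller chooses (e.g. an empirical dwell-time histogram).
    return random.choices(durations, weights=weights)[0]

hmm_mean = sum(geometric_sojourn(0.8) for _ in range(10_000)) / 10_000
print(round(hmm_mean, 2))  # close to the theoretical mean 1 / (1 - 0.8) = 5
```

For phenomena like the ovarian-cycle phases mentioned in the abstract, dwell times are far from geometric, which is exactly when the semi-Markov extension pays off.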
Logical Hidden Markov Models  [PDF]
L. De Raedt,K. Kersting,T. Raiko
Computer Science , 2011, DOI: 10.1613/jair.1675
Abstract: Logical hidden Markov models (LOHMMs) upgrade traditional hidden Markov models to deal with sequences of structured symbols in the form of logical atoms, rather than flat characters. This note formally introduces LOHMMs and presents solutions to the three central inference problems for LOHMMs: evaluation, most likely hidden state sequence and parameter estimation. The resulting representation and algorithms are experimentally evaluated on problems from the domain of bioinformatics.
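The first of the three inference problems, evaluation, reduces in the propositional (flat-symbol) case to the standard forward algorithm; the LOHMM version generalizes the states to logical atoms. A minimal flat sketch with invented parameters:

```python
# Forward algorithm: compute P(obs) by summing over all hidden paths.

def forward(obs, states, start, trans, emit):
    # alpha[s] = P(observations so far, current hidden state = s)
    alpha = {s: start[s] * emit[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {
            s: sum(alpha[p] * trans[p][s] for p in states) * emit[s][o]
            for s in states
        }
    return sum(alpha.values())

# Hypothetical two-state weather HMM.
states = ("rainy", "sunny")
start = {"rainy": 0.6, "sunny": 0.4}
trans = {"rainy": {"rainy": 0.7, "sunny": 0.3},
         "sunny": {"rainy": 0.4, "sunny": 0.6}}
emit = {"rainy": {"walk": 0.1, "umbrella": 0.9},
        "sunny": {"walk": 0.8, "umbrella": 0.2}}

print(forward(("umbrella", "walk"), states, start, trans, emit))  # → 0.209
```

The other two problems, most likely hidden state sequence and parameter estimation, are solved analogously by Viterbi and Baum-Welch in the flat case.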
Prediction of State of Wireless Network Using Markov and Hidden Markov Model  [cached]
MD. Osman Gani,Hasan Sarwar,Chowdhury Mofizur Rahman
Journal of Networks , 2009, DOI: 10.4304/jnw.4.10.976-984
Abstract: Optimal resource allocation and higher quality of service are much-needed requirements in wireless networks. In order to improve these factors, intelligent prediction of network behavior plays a very important role. The Markov Model (MM) and Hidden Markov Model (HMM) are proven prediction techniques used in many fields. In this paper, we have used Markov and Hidden Markov prediction tools to predict the number of wireless devices connected to a specific Access Point (AP) at a specific instant of time. Prediction has been performed in two stages. In the first stage, we found the state sequence of wireless access points (APs) in a wireless network by observing the traffic load sequence in time. It is found that a particular choice of data may lead to 91% accuracy in predicting the real scenario. In the second stage, we used a Markov Model to find the future state sequence following the sequence found in the first stage. The prediction of the next state of an AP performed by the Markov tool shows 88.71% accuracy. It is found that the Markov Model can predict with an accuracy of 95.55% if the initial transition matrix is calculated directly. We have also shown that an O(1) Markov Model gives slightly better accuracy than an O(2) MM for predicting the far future.
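The second stage described above amounts to fitting an order-1 transition matrix from an observed AP state sequence and reading off the most probable successor. A minimal sketch; the state labels and sequence below are made up, not the paper's traffic data.

```python
# Fit an order-1 Markov chain from a state sequence and predict the next state.
from collections import Counter, defaultdict

def fit_transitions(sequence):
    # Empirical transition matrix: row-normalized bigram counts.
    counts = defaultdict(Counter)
    for prev, nxt in zip(sequence, sequence[1:]):
        counts[prev][nxt] += 1
    return {s: {t: c / sum(cs.values()) for t, c in cs.items()}
            for s, cs in counts.items()}

def predict_next(matrix, state):
    # Most probable successor under the fitted chain.
    return max(matrix[state], key=matrix[state].get)

# Hypothetical AP load-level sequence.
seq = ["low", "low", "high", "high", "high", "low", "low", "high", "high"]
P = fit_transitions(seq)
print(predict_next(P, "high"))  # prints "high"
```

An O(2) model would condition on the last two states instead of one, at the cost of a quadratically larger (and sparser) transition table, which is consistent with the paper's finding that O(1) can generalize better for far-future prediction.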
Equations for hidden Markov models  [PDF]
Alexander Schoenhuth
Mathematics , 2009,
Abstract: We will outline novel approaches to derive model invariants for hidden Markov and related models. These approaches are based on a theoretical framework that arises from viewing random processes as elements of the vector space of string functions. Theorems available from that framework then give rise to novel ideas to obtain model invariants for hidden Markov and related models.
Neuroevolution Mechanism for Hidden Markov Model  [cached]
Nabil M. Hewahi
Brain. Broad Research in Artificial Intelligence and Neuroscience , 2011,
Abstract: The Hidden Markov Model (HMM) is a statistical model based on probabilities. HMMs are becoming one of the major models involved in many applications such as natural language processing, handwriting recognition, image processing, and prediction systems. In this research, we are concerned with finding the best HMM for a certain application domain. We propose a neuroevolution process based on first converting the HMM to a neural network, then generating many neural networks at random, where each represents an HMM. We proceed by applying genetic operators to obtain a new set of neural networks, each representing an HMM, and updating the population. Finally, we select the best neural network based on a fitness function.
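The evolutionary loop described above can be sketched directly on HMM parameters: keep a population of candidate models, mutate them, and select by a fitness function (here, the likelihood of a toy training sequence). The neural-network encoding step of the paper is omitted, and every number below is invented.

```python
# Toy evolutionary search over 2-state, 2-symbol HMM parameters.
import random

random.seed(1)
OBS = [0, 1, 1, 0, 1, 1, 1, 0]  # a toy binary observation sequence

def random_row():
    a = random.random()
    return [a, 1 - a]

def random_hmm():
    # Transitions A, emissions B, initial distribution pi.
    return {"A": [random_row(), random_row()],
            "B": [random_row(), random_row()],
            "pi": random_row()}

def likelihood(hmm):
    # Forward algorithm: fitness of a candidate is P(OBS | hmm).
    alpha = [hmm["pi"][s] * hmm["B"][s][OBS[0]] for s in (0, 1)]
    for o in OBS[1:]:
        alpha = [sum(alpha[p] * hmm["A"][p][s] for p in (0, 1)) * hmm["B"][s][o]
                 for s in (0, 1)]
    return sum(alpha)

def mutate(hmm, scale=0.1):
    def perturb(row):
        a = min(0.999, max(0.001, row[0] + random.uniform(-scale, scale)))
        return [a, 1 - a]
    return {k: perturb(v) if k == "pi" else [perturb(r) for r in v]
            for k, v in hmm.items()}

population = [random_hmm() for _ in range(20)]
initial_best = max(map(likelihood, population))
for _ in range(30):
    population.sort(key=likelihood, reverse=True)
    parents = population[:10]  # truncation selection with elitism
    population = parents + [mutate(random.choice(parents)) for _ in range(10)]
final_best = max(map(likelihood, population))
assert final_best >= initial_best  # elitism: the best candidate never dies
```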
Measure Concentration of Hidden Markov Processes  [PDF]
Leonid Kontorovich
Mathematics , 2006,
Abstract: We prove what appears to be the first concentration of measure result for hidden Markov processes. Our bound is stated in terms of the contraction coefficients of the underlying Markov process, and strictly generalizes the Markov process concentration results of Marton (1996) and Samson (2000). Somewhat surprisingly, the bound turns out to be the same as for ordinary Markov processes; this property, however, fails for general hidden/observed process pairs.
Subspace estimation and prediction methods for hidden Markov models  [PDF]
Sofia Andersson,Tobias Rydén
Statistics , 2009, DOI: 10.1214/09-AOS711
Abstract: Hidden Markov models (HMMs) are probabilistic functions of finite Markov chains, or, put in other words, state space models with finite state space. In this paper, we examine subspace estimation methods for HMMs whose output lies in a finite set as well. In particular, we study the geometric structure arising from the nonminimality of the linear state space representation of HMMs, and the consistency of a subspace algorithm arising from a certain factorization of the singular value decomposition of the estimated linear prediction matrix. For this algorithm, we show that the estimates of the transition and emission probability matrices are consistent up to a similarity transformation, and that the $m$-step linear predictor computed from the estimated system matrices is consistent, i.e., converges to the true optimal linear $m$-step predictor.
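The rank structure that subspace methods exploit can be checked numerically: for an HMM with k hidden states, the matrix of consecutive-output joint probabilities P(Y_t = i, Y_{t+1} = j) factors through a k-dimensional space, so its rank is at most k. The parameters below are invented, and plain Gaussian elimination stands in for the SVD step.

```python
# Rank of the two-step output joint distribution of a toy 2-state HMM.

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def rank(M, tol=1e-10):
    # Numerical rank via Gaussian elimination with partial search.
    M = [row[:] for row in M]
    r = 0
    for col in range(len(M[0])):
        pivot = next((i for i in range(r, len(M)) if abs(M[i][col]) > tol), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(len(M)):
            if i != r:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

diag_pi = [[0.5, 0.0], [0.0, 0.5]]        # diag(state distribution at time t)
A = [[0.9, 0.1], [0.2, 0.8]]              # 2-state transition matrix
O = [[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]]    # emissions over 3 output symbols
OT = [list(col) for col in zip(*O)]       # O transposed

# Joint law of two consecutive outputs: P2 = O^T diag(pi) A O  (3 x 3).
P2 = matmul(matmul(OT, matmul(diag_pi, A)), O)
print(rank(P2))  # → 2: the rank reveals the number of hidden states
```

In the paper's setting this low-rank factorization is estimated from data, which is why the recovered system matrices are only identified up to a similarity transformation.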

Copyright © 2008-2017 Open Access Library. All rights reserved.