
Search Results: 1 - 10 of 297 matches for " Hachem Kadri "
All listed articles are free for downloading (OA Articles)
Online Learning with Multiple Operator-valued Kernels
Julien Audiffren, Hachem Kadri
Computer Science , 2013,
Abstract: We consider the problem of learning a vector-valued function f in an online learning setting. The function f is assumed to lie in a reproducing Hilbert space of operator-valued kernels. We describe two online algorithms for learning f while taking into account the output structure. A first contribution is an algorithm, ONORMA, that extends the standard kernel-based online learning algorithm NORMA from the scalar-valued to the operator-valued setting. We report a cumulative error bound that holds both for classification and regression. We then define a second algorithm, MONORMA, which addresses the limitation of pre-defining the output structure in ONORMA by learning sequentially a linear combination of operator-valued kernels. Our experiments show that the proposed algorithms achieve good performance results with low computational cost.
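The NORMA-style update the abstract describes can be sketched in a few lines. The code below is an illustration under stated assumptions, not the paper's exact algorithm: it assumes a separable operator-valued kernel K(x, x') = k(x, x')·T with a PSD matrix T encoding the output structure, uses the squared loss (regression case), and the function names are hypothetical.

```python
import numpy as np

def rbf(x, xp, s=1.0):
    """Scalar Gaussian kernel."""
    return float(np.exp(-np.sum((x - xp) ** 2) / (2 * s ** 2)))

def onorma_sketch(X, Y, T, lam=0.01, eta=0.5):
    """NORMA-style online updates with the separable operator-valued
    kernel K(x, x') = rbf(x, x') * T; squared loss, regression case.
    Returns the support points and their vector-valued coefficients."""
    sv, coefs = [], []
    for x, y in zip(X, Y):
        # current prediction: f(x) = sum_i rbf(x, x_i) * (T @ a_i)
        f = sum(rbf(x, xi) * (T @ a) for xi, a in zip(sv, coefs)) \
            if sv else np.zeros_like(y)
        # shrink past coefficients (gradient of the ridge penalty) ...
        coefs = [(1 - eta * lam) * a for a in coefs]
        # ... and add a new support point from the loss gradient
        sv.append(x)
        coefs.append(-eta * (f - y))
    return sv, coefs
```

Each round costs one kernel expansion over the current support set, which is where the low computational cost claimed in the abstract comes from.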
Stability of Multi-Task Kernel Regression Algorithms
Julien Audiffren, Hachem Kadri
Computer Science , 2013,
Abstract: We study the stability properties of nonlinear multi-task regression in reproducing Hilbert spaces with operator-valued kernels. Such kernels, a.k.a. multi-task kernels, are appropriate for learning problems with nonscalar outputs like multi-task learning and structured output prediction. We show that multi-task kernel regression algorithms are uniformly stable in the general case of infinite-dimensional output spaces. We then derive, under mild assumptions on the kernel, generalization bounds for such algorithms, and we show their consistency even with non-Hilbert-Schmidt operator-valued kernels. We demonstrate how to apply the results to various multi-task kernel regression methods such as vector-valued SVR and functional ridge regression.
Equivalence of Learning Algorithms
Julien Audiffren, Hachem Kadri
Computer Science , 2014,
Abstract: The purpose of this paper is to introduce a concept of equivalence between machine learning algorithms. We define two notions of algorithmic equivalence, namely, weak and strong equivalence. These notions are of paramount importance for identifying when learning properties from one learning algorithm can be transferred to another. Using regularized kernel machines as a case study, we illustrate the importance of the introduced equivalence concept by analyzing the relation between kernel ridge regression (KRR) and m-power regularized least squares regression (M-RLSR) algorithms.
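For reference in the KRR / M-RLSR comparison, the KRR side has a well-known closed form: the dual coefficients solve (K + nλI)α = y. A minimal sketch (the scaling convention for λ is an assumption; papers differ on whether the factor n appears):

```python
import numpy as np

def krr_fit(K, y, lam):
    """Kernel ridge regression dual solution: solves (K + n*lam*I) a = y.
    This is the m = 2 reference point in the KRR / M-RLSR comparison."""
    n = K.shape[0]
    return np.linalg.solve(K + n * lam * np.eye(n), y)
```

With a small λ the fitted function nearly interpolates the training targets, which is the behavior M-RLSR generalizes by letting the regularization exponent vary.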
M-Power Regularized Least Squares Regression
Julien Audiffren, Hachem Kadri
Computer Science , 2013,
Abstract: Regularization is used to find a solution that both fits the data and is sufficiently smooth, and thereby is very effective for designing and refining learning algorithms. But the influence of its exponent remains poorly understood. In particular, it is unclear how the exponent of the reproducing kernel Hilbert space (RKHS) regularization term affects the accuracy and the efficiency of kernel-based learning algorithms. Here we consider regularized least squares regression (RLSR) with an RKHS regularization raised to the power of m, where m is a variable real exponent. We design an efficient algorithm for solving the associated minimization problem, we provide a theoretical analysis of its stability, and we demonstrate its advantage with respect to computational complexity, speed of convergence and prediction accuracy over the classical kernel ridge regression algorithm, where the regularization exponent m is fixed at 2. Our results show that the m-power RLSR problem can be solved efficiently, and support the suggestion that one can use a regularization term that grows significantly slower than the standard quadratic growth in the RKHS norm.
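By the representer theorem, the m-power objective over dual coefficients a reads J(a) = (1/n)‖Ka − y‖² + λ(aᵀKa)^{m/2}. The sketch below minimizes it by plain gradient descent purely to illustrate the objective; the paper designs a more efficient solver, and all names and step sizes here are assumptions.

```python
import numpy as np

def mrlsr_fit(K, y, lam=0.1, m=1.5, lr=0.01, steps=2000):
    """Gradient-descent sketch of m-power RLSR:
    J(a) = (1/n)||K a - y||^2 + lam * (a^T K a)^(m/2).
    Illustrative only; not the paper's algorithm."""
    n = K.shape[0]
    a = np.zeros(n)
    for _ in range(steps):
        norm2 = a @ K @ a  # ||f||^2 in the RKHS
        grad = (2 / n) * K @ (K @ a - y)
        if norm2 > 0:  # penalty gradient: lam*m*norm2^(m/2-1) * K a
            grad += lam * m * norm2 ** (m / 2 - 1) * (K @ a)
        a -= lr * grad
    return a
```

For m < 2 the penalty grows slower than the quadratic RKHS norm, which is exactly the regime the abstract's last sentence advocates.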
A Generalized Kernel Approach to Structured Output Learning
Hachem Kadri, Mohammad Ghavamzadeh, Philippe Preux
Computer Science , 2012,
Abstract: We study the problem of structured output learning from a regression perspective. We first provide a general formulation of the kernel dependency estimation (KDE) problem using operator-valued kernels. We show that some of the existing formulations of this problem are special cases of our framework. We then propose a covariance-based operator-valued kernel that allows us to take into account the structure of the kernel feature space. This kernel operates on the output space and encodes the interactions between the outputs without any reference to the input space. To address this issue, we introduce a variant of our KDE method based on the conditional covariance operator that in addition to the correlation between the outputs takes into account the effects of the input variables. Finally, we evaluate the performance of our KDE approach using both covariance and conditional covariance kernels on two structured output problems, and compare it to the state-of-the-art kernel-based structured output regression methods.
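The KDE pipeline the abstract generalizes has two steps: ridge-regress output kernel features on input kernel features, then solve a pre-image problem over candidate outputs. The sketch below shows the plain scalar-kernel special case (not the paper's operator-valued formulation), with hypothetical function names.

```python
import numpy as np

def kde_fit(Kx, Ky, lam=0.1):
    """Kernel dependency estimation, scalar-kernel special case:
    ridge-regress the output Gram matrix on the input Gram matrix.
    Returns the n x n linear map W."""
    n = Kx.shape[0]
    return np.linalg.solve(Kx + n * lam * np.eye(n), Ky)

def kde_predict(kx_new, W, Ky_cand):
    """Pre-image step: kx_new @ W predicts the output-kernel profile
    k_y(y(x), y_train); return the candidate (row of Ky_cand) whose
    profile is closest to it."""
    pred = kx_new @ W
    return int(np.argmin(np.sum((Ky_cand - pred) ** 2, axis=1)))
```

The paper's covariance-based operator-valued kernel replaces the implicit identity operator in this map with one that encodes output interactions.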
One-Class SVMs Challenges in Audio Detection and Classification Applications
Asma Rabaoui, Hachem Kadri, Zied Lachiri, Noureddine Ellouze
EURASIP Journal on Advances in Signal Processing , 2008, DOI: 10.1155/2008/834973
Abstract: Support vector machines (SVMs) have gained great attention and have been used extensively and successfully in the field of sounds (events) recognition. However, the extension of SVMs to real-world signal processing applications is still an ongoing research topic. Our work consists of illustrating the potential of SVMs on recognizing impulsive audio signals belonging to a complex real-world dataset. We propose to apply optimized one-class support vector machines (1-SVMs) to tackle both sound detection and classification tasks in the sound recognition process. First, we propose an efficient and accurate approach for detecting events in a continuous audio stream. The proposed unsupervised sound detection method which does not require any pretrained models is based on the use of the exponential family model and 1-SVMs to approximate the generalized likelihood ratio. Then, we apply novel discriminative algorithms based on 1-SVMs with new dissimilarity measure in order to address a supervised sound-classification task. We compare the novel sound detection and classification methods with other popular approaches. The remarkable sound recognition results achieved in our experiments illustrate the potential of these methods and indicate that 1-SVMs are well suited for event-recognition tasks.
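The detection step can be caricatured as novelty scoring against background audio frames. The code below is a deliberately simplified stand-in: the paper's detector builds on 1-SVMs and an exponential-family approximation of the likelihood ratio, not this plain kernel-mean score, and the feature representation is assumed.

```python
import numpy as np

def novelty_scores(train, test, s=1.0):
    """Mean Gaussian-kernel similarity of each test frame to a set of
    background frames; low scores flag candidate impulsive events.
    Simplified stand-in for a 1-SVM-based detector."""
    d2 = np.sum((test[:, None, :] - train[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2 * s ** 2)).mean(axis=1)
```

Thresholding such a score yields an unsupervised detector that, like the paper's, needs no pretrained class models.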
Multiple Operator-valued Kernel Learning
Hachem Kadri, Alain Rakotomamonjy, Francis Bach, Philippe Preux
Computer Science , 2012,
Abstract: Positive definite operator-valued kernels generalize the well-known notion of reproducing kernels, and are naturally adapted to multi-output learning situations. This paper addresses the problem of learning a finite linear combination of infinite-dimensional operator-valued kernels which are suitable for extending functional data analysis methods to nonlinear contexts. We study this problem in the case of kernel ridge regression for functional responses with an ℓr-norm constraint on the combination coefficients. The resulting optimization problem is more involved than those of multiple scalar-valued kernel learning since operator-valued kernels pose more technical and theoretical issues. We propose a multiple operator-valued kernel learning algorithm based on solving a system of linear operator equations by using a block coordinate-descent procedure. We experimentally validate our approach on a functional regression task in the context of finger movement prediction in brain-computer interfaces.
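To fix intuition for the weight-learning loop, here is a naive alternating scheme in the much simpler scalar-valued case: fit ridge on the weighted kernel sum, then reweight each kernel by the norm of its share of the solution, renormalized to the ℓr ball. This is a heuristic sketch only; the paper's solver is a block coordinate-descent method over linear operator equations with operator-valued kernels, and the update rule here is an assumption.

```python
import numpy as np

def mkl_ridge_sketch(Ks, y, lam=0.1, r=2.0, iters=10):
    """Naive alternating scheme for kernel weights d with ||d||_r = 1,
    scalar-valued case (illustrative heuristic, not the paper's method):
    (1) ridge-solve on the combined kernel, (2) reweight by per-kernel
    contribution norms and project back to the l_r sphere."""
    p, n = len(Ks), len(y)
    d = np.full(p, p ** (-1.0 / r))  # uniform start, ||d||_r = 1
    alpha = np.zeros(n)
    for _ in range(iters):
        Ksum = sum(dk * Kk for dk, Kk in zip(d, Ks))
        alpha = np.linalg.solve(Ksum + n * lam * np.eye(n), y)
        # per-kernel contribution norms ||f_k|| = d_k * sqrt(a^T K_k a)
        nrm = np.array([dk * np.sqrt(max(alpha @ Kk @ alpha, 0.0))
                        for dk, Kk in zip(d, Ks)])
        if nrm.sum() == 0:
            break
        d = nrm / np.linalg.norm(nrm, ord=r)
    return d, alpha
```

The ℓr constraint controls how sparse the learned combination is: r near 1 pushes weight onto few kernels, larger r spreads it out.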
Multiple functional regression with both discrete and continuous covariates
Hachem Kadri, Philippe Preux, Emmanuel Duflos, Stéphane Canu
Computer Science , 2013,
Abstract: In this paper we present a nonparametric method for extending functional regression methodology to the situation where more than one functional covariate is used to predict a functional response. Borrowing the idea from Kadri et al. (2010a), the method, which supports mixed discrete and continuous explanatory variables, is based on estimating a function-valued function in reproducing kernel Hilbert spaces by virtue of positive operator-valued kernels.
Functional Regularized Least Squares Classification with Operator-valued Kernels
Hachem Kadri, Asma Rabaoui, Philippe Preux, Emmanuel Duflos, Alain Rakotomamonjy
Computer Science , 2013,
Abstract: Although operator-valued kernels have recently received increasing interest in various machine learning and functional data analysis problems such as multi-task learning or functional regression, little attention has been paid to the understanding of their associated feature spaces. In this paper, we explore the potential of adopting an operator-valued kernel feature space perspective for the analysis of functional data. We then extend the Regularized Least Squares Classification (RLSC) algorithm to cover situations where there are multiple functions per observation. Experiments on a sound recognition problem show that the proposed method outperforms the classical RLSC algorithm.
Operator-valued Kernels for Learning from Functional Response Data
Hachem Kadri, Emmanuel Duflos, Philippe Preux, Stéphane Canu, Alain Rakotomamonjy, Julien Audiffren
Computer Science , 2015,
Abstract: In this paper we consider the problems of supervised classification and regression in the case where attributes and labels are functions: each data point is represented by a set of functions, and the label is also a function. We focus on the use of reproducing kernel Hilbert space theory to learn from such functional data. Basic concepts and properties of kernel-based learning are extended to include the estimation of function-valued functions. In this setting, the representer theorem is restated, a set of rigorously defined infinite-dimensional operator-valued kernels that can be valuably applied when the data are functions is described, and a learning algorithm for nonlinear functional data analysis is introduced. The methodology is illustrated through speech and audio signal processing experiments.
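In the separable case K(x, x') = k(x, x')·T, the function-valued ridge estimate has a closed form that a finite discretization makes concrete: with response curves sampled on a p-point grid, the normal equations K A T + λA = Y can be solved cheaply through the eigendecompositions of K and T. This is a finite-grid sketch only; the paper treats genuinely infinite-dimensional operator-valued kernels, and the names below are assumptions.

```python
import numpy as np

def fn_krr_fit(K, T, Y, lam=0.1):
    """Function-valued kernel ridge regression, discretized: Y is n x p
    (one sampled response curve per row), K is the n x n input Gram
    matrix, and T is a symmetric PSD p x p output operator. Solves
    K A T + lam A = Y via the eigendecompositions of K and T."""
    dk, Vk = np.linalg.eigh(K)
    dt, Vt = np.linalg.eigh(T)
    B = Vk.T @ Y @ Vt
    B /= np.outer(dk, dt) + lam  # elementwise: B_ij*(dk_i*dt_j + lam)
    return Vk @ B @ Vt.T         # coefficient curves A (n x p)

def fn_krr_predict(k_vec, T, A):
    """Predicted response curve at a new input x:
    f(x) = sum_i k(x, x_i) * T @ a_i = (k_vec @ A) @ T."""
    return (k_vec @ A) @ T
```

The eigen-trick avoids ever forming the np × np Kronecker system, which is what makes the discretized estimator practical.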