Search Results: 1 - 10 of 130467 matches for "V. Dubourg"
All listed articles are free for downloading (OA Articles)
Reliability-based design optimization using kriging surrogates and subset simulation
V. Dubourg, B. Sudret, J.-M. Bourinet
Statistics, 2011, DOI: 10.1007/s00158-011-0653-8
Abstract: The aim of the present paper is to develop a strategy for solving reliability-based design optimization (RBDO) problems that remains applicable when the performance models are expensive to evaluate. Starting with the premise that simulation-based approaches are not affordable for such problems, and that most-probable-failure-point-based approaches do not allow the error on the failure-probability estimate to be quantified, an approach based on both metamodels and advanced simulation techniques is explored. The kriging metamodeling technique is chosen to surrogate the performance functions because it allows the surrogate error to be genuinely quantified. The surrogate error on the limit-state surfaces is propagated to the failure-probability estimates in order to provide an empirical error measure. This error is then sequentially reduced by means of a population-based adaptive refinement technique until the kriging surrogates are accurate enough for reliability analysis. This original refinement strategy makes it possible to add several observations to the design of experiments at the same time. Reliability and reliability sensitivity analyses are performed by means of the subset simulation technique for the sake of numerical efficiency. The adaptive surrogate-based strategy for reliability estimation is finally embedded in a classical gradient-based optimization algorithm in order to solve the RBDO problem. The kriging surrogates are built in a so-called augmented reliability space, which makes them reusable from one nested RBDO iteration to the next. The strategy is compared with other approaches available in the literature on three academic examples in the field of structural mechanics.
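As a hedged illustration of the error-propagation idea described above (a minimal sketch, not the authors' code): a Gaussian-process (kriging) surrogate is fitted on a small design of experiments, and the failure probability is bracketed by classifying a Monte Carlo population with the surrogate mean shifted by plus or minus 1.96 standard deviations. The toy limit-state function g, the input distribution and all sample sizes below are hypothetical choices; subset simulation would replace the crude Monte Carlo step for rare events, as the abstract describes.

# Hedged sketch only: propagating kriging prediction uncertainty to empirical
# bounds on a failure probability. g, the input distribution and the sample
# sizes are illustrative assumptions, not taken from the paper.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def g(x):
    # Toy performance function: failure when g(x) <= 0.
    return 6.0 - x[:, 0] - x[:, 1]

# Small design of experiments standing in for expensive model runs.
X_doe = rng.normal(scale=2.0, size=(40, 2))
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(X_doe, g(X_doe))

# Crude Monte Carlo on the surrogate (subset simulation would be used instead
# for small probabilities); the +/- 1.96 sigma shifts bracket the estimate.
X_mc = rng.normal(scale=2.0, size=(100_000, 2))
mu, sigma = gp.predict(X_mc, return_std=True)
pf_hat = np.mean(mu <= 0.0)
pf_lower = np.mean(mu + 1.96 * sigma <= 0.0)
pf_upper = np.mean(mu - 1.96 * sigma <= 0.0)
print(f"P_f estimate {pf_hat:.4f}, empirical bounds [{pf_lower:.4f}, {pf_upper:.4f}]")
# A wide bracket flags where the design of experiments should be enriched
# before the surrogate is trusted for reliability analysis.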
Metamodel-based importance sampling for the simulation of rare events
V. Dubourg, F. Deheeger, B. Sudret
Statistics, 2011
Abstract: In the field of structural reliability, the Monte Carlo estimator is considered the reference probability estimator. However, it remains intractable for real engineering cases, since it requires a large number of runs of the model. In order to reduce the number of computer experiments, many other approaches, known as reliability methods, have been proposed. One such approach consists in replacing the original experiment by a surrogate that is much faster to evaluate. Nevertheless, it is often difficult (or even impossible) to quantify the error made by this substitution. In this paper an alternative approach is developed that takes advantage of kriging metamodeling and importance sampling techniques. The proposed alternative estimator is finally applied to a finite-element-based structural reliability analysis.
Metamodel-based importance sampling for structural reliability analysis
V. Dubourg, F. Deheeger, B. Sudret
Statistics, 2011
Abstract: Structural reliability methods aim at computing the probability of failure of systems with respect to prescribed performance functions. In modern engineering such functions usually involve running an expensive-to-evaluate computational model (e.g. a finite element model). In this respect, simulation methods, which may require $10^{3}$ to $10^{6}$ runs, cannot be used directly. Surrogate models such as quadratic response surfaces, polynomial chaos expansions or kriging (which are built from a limited number of runs of the original model) are then introduced as a substitute for the original model in order to cope with the computational cost. In practice, though, it is almost impossible to quantify the error made by this substitution. In this paper we propose to use a kriging surrogate of the performance function as a means to build a quasi-optimal importance sampling density. The probability of failure is eventually obtained as the product of an augmented probability, computed by substituting the metamodel for the original performance function, and a correction term which ensures that there is no bias in the estimation even if the metamodel is not fully accurate. The approach is applied to analytical and finite element reliability problems and proves efficient up to 100 random variables.
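As a hedged reading of the estimator structure sketched in this abstract (an interpretation, not a quotation from the paper): writing $\mu_{\hat G}(x)$ and $\sigma_{\hat G}(x)$ for the kriging mean and standard deviation, a probabilistic classification function $\pi(x) = \Phi\!\left(-\mu_{\hat G}(x)/\sigma_{\hat G}(x)\right)$ can define the quasi-optimal instrumental density $h$, and the failure probability then factors into an augmented probability and a correction term:

$$ h(x) \;\propto\; \pi(x)\, f_X(x), \qquad P_f \;=\; \int \pi(x)\, f_X(x)\,\mathrm{d}x \;\times\; \mathbb{E}_{h}\!\left[\frac{\mathbf{1}\{g(X)\le 0\}}{\pi(X)}\right], $$

provided $\pi(x) > 0$ wherever the failure indicator is non-zero. The first factor is computed with the surrogate alone, while the second is estimated from a limited number of runs of the true performance function $g$; this correction is what removes the bias that would result from using the surrogate alone, consistent with the claim in the abstract.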
Reliability-based design optimization of an imperfect submarine pressure hull
V. Dubourg, J.-M. Bourinet, B. Sudret, M. Cazuguel
Statistics, 2011
Abstract: Reliability-based design optimization (RBDO) has gained much attention in the past fifteen years as a way of introducing robustness into the process of designing structures and systems in an optimal manner. Indeed, classical optimization (e.g. minimizing some cost under mechanical constraints) usually leads to solutions that lie at the boundary of the admissible domain and that are consequently rather sensitive to uncertainty in the design parameters. In contrast, RBDO aims at designing the system in a robust way by minimizing some cost function under reliability constraints. RBDO methods therefore have to combine optimization algorithms with reliability calculations. The classical approach, known as the "double loop", consists in nesting the computation of the failure probability with respect to the current design within the optimization loop. It is not applicable to industrial models (e.g. finite element models) due to the associated computational burden. Methods based on an approximation of the reliability (e.g. FORM), on the other hand, may not be applicable to real-world problems either. In this context, an original method has been developed that aims to circumvent the aforementioned drawbacks of the existing approaches. It is based on the adaptive construction of a metamodel for the expensive-to-evaluate mechanical model, and on the subset simulation technique for the efficient and accurate computation of the failure probability and its sensitivities with respect to the design variables. The proposed methodology is briefly described in this paper before it is applied to the reliability-based design of an imperfect submarine pressure hull.
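Since both this abstract and the first one rely on subset simulation for the failure probability, here is a minimal, hypothetical sketch of that building block (not the authors' implementation): the rare-event probability is written as a product of larger conditional probabilities, each estimated from samples generated by a simple Metropolis chain restricted to the current intermediate level. Independent standard normal inputs are assumed, and the population size, level probability p0, step size and chain length are arbitrary illustrative choices.

# Minimal subset simulation sketch (illustrative only).
# Estimates P[g(X) <= 0] for X ~ N(0, I) as P(F_1) * prod_i P(F_{i+1} | F_i),
# with intermediate thresholds chosen as empirical p0-quantiles of g.
import numpy as np

def subset_simulation(g, dim, n=3000, p0=0.1, n_mcmc=5, step=0.8, seed=0, max_levels=20):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n, dim))
    y = g(x)
    pf = 1.0
    for _ in range(max_levels):
        thresh = np.quantile(y, p0)
        if thresh <= 0.0:                       # failure domain reached
            return pf * np.mean(y <= 0.0)
        pf *= p0
        x, y = x[y <= thresh], y[y <= thresh]   # seeds conditioned on this level
        reps = int(np.ceil(n / len(x)))
        x, y = np.tile(x, (reps, 1))[:n], np.tile(y, reps)[:n]
        # Random-walk Metropolis targeting the standard normal restricted
        # to {g <= thresh}: accept on the density ratio AND level membership.
        for _ in range(n_mcmc):
            cand = x + step * rng.standard_normal(x.shape)
            log_ratio = -0.5 * (np.sum(cand**2, axis=1) - np.sum(x**2, axis=1))
            y_cand = g(cand)
            accept = (np.log(rng.random(n)) < log_ratio) & (y_cand <= thresh)
            x = np.where(accept[:, None], cand, x)
            y = np.where(accept, y_cand, y)
    return pf * np.mean(y <= 0.0)

# Toy check: P[3.5 - (x1 + x2)/sqrt(2) <= 0] has exact value Phi(-3.5), about 2.3e-4.
print(subset_simulation(lambda x: 3.5 - (x[:, 0] + x[:, 1]) / np.sqrt(2), dim=2))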
Holoprosencephaly
Christèle Dubourg, Claude Bendavid, Laurent Pasquier, Catherine Henry, Sylvie Odent, Véronique David
Orphanet Journal of Rare Diseases, 2007, DOI: 10.1186/1750-1172-2-8
Abstract: Holoprosencephaly (HPE); midline cleft syndrome; DeMyer sequence; isolated HPE (non-syndromic, non-chromosomal); familial HPE; arhinencephaly; cyclopia. Holoprosencephaly (HPE, MIM 236100) is a complex human brain malformation resulting from incomplete cleavage of the prosencephalon into right and left hemispheres, occurring between the 18th and the 28th day of gestation. Three levels of increasing severity are described [1]: lobar HPE, where the right and left ventricles are separated, but with some continuity across the frontal cortex; semilobar HPE, with a partial separation; and the most severe form, alobar HPE, with a single brain ventricle and no interhemispheric fissure. Another, milder subtype of HPE, called the middle interhemispheric variant (MIHF) or syntelencephaly, has now been recognized [2,3] (Table 1). There is a continuous spectrum of abnormal separation of the hemispheres rather than a clearly distinct division into these three types of malformation [4]. The forebrain malformations are generally associated with facial anomalies, ranging from anophthalmia, cyclopia or proboscis in the most severe cases, to midline cleft lip, simple hypotelorism or even no anomalies in the less severe HPE forms [5,6] (Table 2). The HPE phenotypic spectrum also encompasses microforms, including facial midline anomalies with a normal brain. This wide spectrum can be observed within the same family [7]. HPE is a genetically heterogeneous anomaly, and this phenotype is known to be part of different syndromes or chromosomal anomalies. Holoprosencephaly is the most common forebrain developmental anomaly in humans, with a prevalence of 1/16,000 in live births [8-11], an incidence as high as 1:250 in conceptuses [12], and a worldwide distribution. Given the advances in neuroimaging with magnetic resonance imaging (MRI), children with less severe forms, like the recently described MIHF or lobar forms, who previously went undiagnosed, should now be identified, leading to an increasing prevalence of the
Note sur l’atelier monétaire de Chinon du VIIe au Xe siècle / Note on the mint in Chinon from the 7th to the 10th century
Françoise Dumas-Dubourg
Revue Archéologique du Centre de la France, 2006
Abstract: Chinon enters the history of medieval coinage not through a particular coin issue but through a hoard of 81 gold sous buried in the 6th century, around 520, consisting essentially of imitations of imperial coins struck by various barbarian peoples: Ostrogoths, Burgundians and Franks (Robert 1882). It is only much later that coins bear the name of Chinon: a Merovingian gold coin, a third of a sou from the 7th century, and two silver deniers from the 8th century. The third of a so...
A generalization of Snoek's law to ferromagnetic films and composites
Olivier Acher, Sébastien Dubourg
Physics, 2007, DOI: 10.1103/PhysRevB.77.104440
Abstract: The present paper establishes characteristics of the relative magnetic permeability spectrum $\mu(f)$ of magnetic materials at microwave frequencies. The integral of the imaginary part of $\mu(f)$ multiplied by the frequency $f$ has remarkable properties. A generalization of Snoek's law consists in this quantity being bounded by the square of the saturation magnetization multiplied by a constant. While previous results had been obtained in the case of non-conductive materials, this work is a generalization to ferromagnetic materials and ferromagnetic-based composites with significant skin effect. The influence of truncating the summation at finite upper frequencies is investigated, and estimates associated with the finite summation are provided. It is established that, in practice, the integral does not depend on the damping model under consideration. Numerical experiments are performed in the exactly solvable case of ferromagnetic thin films with uniform magnetization, and they are found to confirm the theoretical results. Microwave permeability measurements on soft amorphous films are reported. The relation between the integral and the saturation magnetization is verified experimentally, and some practical applications of the theoretical results are introduced. The integral can be used to determine the average magnetization orientation in materials with complex magnetization configurations, and furthermore to check the accuracy of microwave measurement systems. For certain applications, such as electromagnetic compatibility or radar-absorbing materials, the relations established herein provide useful indications for the design of efficient materials, and simple figures of merit to compare the properties measured on various materials.
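As a hedged illustration only, the bounded quantity described in the abstract can be written generically as (the constant $K$ and the truncation frequency $f_{\max}$ below are placeholders, not values taken from the paper):

$$ \int_{0}^{f_{\max}} \mu''(f)\, f \,\mathrm{d}f \;\le\; K\, M_s^{2}, $$

where $\mu''$ is the imaginary part of the relative permeability, $M_s$ the saturation magnetization, and $K$ a constant fixed by the unit system and by the film or composite geometry; the paper quantifies how a finite $f_{\max}$ and skin effect in conductive ferromagnets modify this bound.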
Proposition d’un modèle de tutorat pour la conception de dispositifs d’accompagnement en formation en ligne
Patricia Gounon, Pascal Leroux, Xavier Dubourg
Revue Internationale des Technologies en Pédagogie Universitaire, 2004
Abstract: The learner-tutoring component of Web-based distance education is often neglected during the instructional design process, an omission that is prejudicial over the whole courseware life cycle. This is why, after identifying learners' support needs in a distance-learning environment, we propose a descriptive model of the tutoring activity. This model serves as the foundation for defining the specifications of the learner-support environment, in terms of the tasks and tools supporting learners' activities. In the last part of the article, we present the methodology for applying this model over the courseware life cycle.
Utero-vaginal aplasia (Mayer-Rokitansky-Küster-Hauser syndrome) associated with deletions in known DiGeorge or DiGeorge-like loci
Karine Morcel, Tanguy Watrin, Laurent Pasquier, Lucie Rochard, Cédric Le Caignec, Christèle Dubourg, Philippe Loget, Bernard-Jean Paniel, Sylvie Odent, Véronique David, Isabelle Pellerin, Claude Bendavid, Daniel Guerrier
Orphanet Journal of Rare Diseases, 2011, DOI: 10.1186/1750-1172-6-9
Abstract: We searched the DiGeorge critical chromosomal regions for chromosomal anomalies in a cohort of 57 subjects with uterovaginal aplasia (55 women and 2 aborted fetuses). For this candidate-locus approach, we used a multiplex ligation-dependent probe amplification (MLPA) assay based on a kit designed for investigation of the chromosomal regions known to be involved in DGS. The deletions detected were validated by duplex PCR/liquid chromatography (DP/LC) and/or array-CGH analysis. We found deletions in four probands within the four chromosomal loci 4q34-qter, 8p23.1, 10p14 and 22q11.2 implicated in almost all cases of DGS. Uterovaginal aplasia thus appears to be an additional feature of the broad spectrum of the DGS phenotype. The DiGeorge critical chromosomal regions may be candidate loci for a subset of individuals with MRKH syndrome (MURCS association). However, the genes mapping at the sites of these deletions that are involved in uterovaginal anomalies remain to be determined. These findings have consequences for clinical investigations, the care of patients and their relatives, and genetic counseling. Congenital aplasia of the uterus and the upper two thirds of the vagina is diagnosed as Mayer-Rokitansky-Küster-Hauser (MRKH) syndrome in 90% of affected women presenting with primary amenorrhea and otherwise normal secondary sexual characteristics, normal ovaries and a normal karyotype (46,XX) [1]. The incidence of MRKH syndrome has been estimated at 1 in 4500 female births [2-4]. The uterovaginal aplasia can be isolated (type I; OMIM 277000), but it is more frequently associated with other malformations (type II; OMIM 601076). Type II is also referred to as the MURCS (Müllerian, Renal, Cervico-thoracic Somite anomalies) association. The most common associated malformations involve the upper urinary tract, affecting about 40% of patients [5], and the cervicothoracic spine, affecting about 30 to 40% of patients [5-7]. Renal malformations include unilateral agenesis, ectopia of one or bo
How to develop a theory-driven evaluation design? Lessons learned from an adolescent sexual and reproductive health programme in West Africa
Sara B Van Belle, Bruno Marchal, Dominique Dubourg, Guy Kegels
BMC Public Health, 2010, DOI: 10.1186/1471-2458-10-741
Abstract: Based on our experience and the existing literature, we developed a six-step framework for the design of theory-driven evaluations, which we applied in the ex-post evaluation of the networking component of the intervention. The protocol was drafted with the input of the intervention designer. The programme theory, the central element of theory-driven evaluation, was constructed on the basis of semi-structured interviews with designers, implementers and beneficiaries, and an analysis of the intervention's logical framework. The six-step framework proved useful, as it allowed for a systematic development of the protocol. We describe the challenges at each step. We found that there is little practical guidance in the existing literature, as well as a mix-up of terminology among theory-driven evaluation approaches. There is a need for empirical methodological development in order to refine the tools to be used in theory-driven evaluation. We conclude that ex-post evaluations of programmes can be based on such an approach if the required information on context and mechanisms is collected during the programme. Theory-driven evaluation (TDE) was invented to provide an answer to the problems of evaluation approaches limited to the before-after and input-output designs traditionally used in programme evaluation [1,2]. This was a reaction to the position of Campbell and Stanley [3], who stated that internal validity is the most essential issue in research, and to Cronbach's position that evaluation cannot serve policymaking if its external validity is not guaranteed [4]. Chen and Rossi aimed at providing a perspective on evaluation that ensures both types of validity. These authors hold that, for any intervention, a programme theory can be described that explains how the planners expect the intervention to work. The programme theory is the often implicit set of assumptions that steers the choice and design of an intervention. Making these assumptions explicit makes it possible to understand what is b