Abstract:
Based on our experience and the existing literature, we developed a six-step framework for the design of theory-driven evaluations, which we applied in the ex-post evaluation of the networking component of the intervention. The protocol was drafted with the input of the intervention designer. The programme theory, the central element of theory-driven evaluation, was constructed on the basis of semi-structured interviews with designers, implementers and beneficiaries, and an analysis of the intervention's logical framework. The six-step framework proved useful, as it allowed for a systematic development of the protocol. We describe the challenges at each step. We found that there is little practical guidance in the existing literature, as well as a mix-up of terminology across theory-driven evaluation approaches. There is a need for empirical methodological development in order to refine the tools used in theory-driven evaluation. We conclude that ex-post evaluations of programmes can be based on such an approach if the required information on context and mechanisms is collected during the programme. Theory-driven evaluation (TDE) was invented to provide an answer to the problems of evaluation approaches that are limited to the before-after and input-output designs traditionally used in programme evaluation [1,2]. This was a reaction to the position of Campbell & Stanley [3], who stated that internal validity is the most essential issue in research, and to Cronbach's position that evaluation cannot serve policymaking if its external validity is not guaranteed [4]. Chen and Rossi aimed at providing a perspective on evaluation that ensures both types of validity. These authors hold that, for any intervention, a programme theory that explains how the planners expect the intervention to work can be described. The programme theory is the often implicit set of assumptions that steers the choice and design of an intervention. Making these assumptions explicit allows one to understand what is b

Abstract:
Objective: To examine the reliability of reported rates of caesarean section from developing countries and to make recommendations on how data collection for surveys and health facility-based studies could be improved. Methods: Population-based rates for caesarean section obtained from two sources, the Demographic and Health Surveys (DHS) and health facility-based records of caesarean sections from the Unmet Obstetric Need Network, together with estimates of the number of live births, were compared for six developing countries. Sensitivity analyses were conducted using several different definitions of the caesarean section rate, and the rates obtained from the two data sources were compared. Findings: The DHS rates for caesarean section were consistently higher than the facility-based rates. However, in three quarters of the cases, the facility-based rates for caesarean section fell within the 95% confidence intervals of the DHS estimates. Conclusion: The importance of the differences between these two series of rates depends on the analyst's perspective. For national and global monitoring, DHS data on caesarean sections would suffice, although the imprecision of the rates would make the monitoring of trends difficult. However, the imprecision of DHS data on caesarean sections precludes their use for the purposes of programme evaluation at the regional level.
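As an illustration of why survey-based rates carry wide confidence intervals, the following sketch computes a Wilson 95% interval for a hypothetical DHS-style sample and checks whether a facility-based rate falls inside it. All numbers are invented for illustration; the study's actual comparison is more involved.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a proportion (here: a caesarean section rate)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical survey: 54 caesarean deliveries among 1800 sampled live births (3.0%)
lo, hi = wilson_ci(54, 1800)

# A hypothetical facility-based rate of 2.4% falls inside the survey interval
facility_rate = 0.024
print(f"DHS-style rate 3.0%, 95% CI [{lo:.3%}, {hi:.3%}]")
print("facility rate inside CI:", lo <= facility_rate <= hi)
```

Even with 1800 sampled births, the interval spans roughly a percentage point on either side of the estimate, which is why trend monitoring and regional evaluation from such data are difficult.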

Abstract:
Introduction and Objectives Numerous studies have assessed the cost-effectiveness of different treatment modalities for stable angina. Direct comparisons, however, are uncommon. We therefore set out to compare the efficacy and mean cost per patient after 1 and 3 years of follow-up of the following treatments, as assessed in randomized controlled trials (RCTs): medical therapy (MT), percutaneous coronary intervention (PCI) without stent (PTCA), with bare-metal stent (BMS), with drug-eluting stent (DES), and elective coronary artery bypass graft (CABG). Methods RCTs comparing at least two of the five treatments and reporting clinical and cost data were identified by a systematic search. Clinical end-points were mortality and myocardial infarction (MI). The costs described in the different trials were standardized and expressed in US $ 2008, based on purchasing power parity. A network meta-analysis was used to compare costs. Results Fifteen RCTs were selected. Mortality and MI rates were similar in the five treatment groups at both 1-year and 3-year follow-up. Weighted cost per patient, however, differed markedly among the five treatment modalities, at both one year and three years (P<0.0001). MT was the least expensive treatment modality, at US $3069 and $13 864 after one and three years of follow-up, while CABG was the most costly, at US $27 003 and $28 670 after one and three years. PCI, whether with plain balloon, BMS or DES, came in between, but was closer to the costs of CABG. Conclusions Appreciable savings in health expenditure can be achieved by using MT in the management of patients with stable angina.
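A minimal sketch of the cost-standardization step described above (purchasing-power-parity conversion plus inflation to 2008 US dollars). The function name, PPP factor, inflation factor and amounts are hypothetical illustrations, not the paper's actual data or method details.

```python
def to_usd_2008(amount, ppp_factor, inflator_to_2008):
    """Convert a local-currency trial cost to US $ 2008 (illustrative).

    ppp_factor: local currency units per US dollar at PPP, in the trial year
    inflator_to_2008: cumulative US price-inflation factor from trial year to 2008
    """
    return amount / ppp_factor * inflator_to_2008

# A hypothetical European trial reporting a cost of 2500 local units in 2003,
# with an assumed PPP factor of 0.9 and ~15% cumulative US inflation to 2008:
cost_2008 = to_usd_2008(2500, ppp_factor=0.9, inflator_to_2008=1.15)
print(f"standardized cost: US $ {cost_2008:.0f}")
```

Standardizing every trial's costs to a single currency and price year is what makes the cross-trial cost comparison in the network meta-analysis meaningful.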

Abstract:
Chinon enters the history of medieval coinage not through a particular coin issue but through a hoard of 81 gold sous buried in the 6th century, around 520, composed essentially of imitations of imperial coins struck by various barbarian peoples: Ostrogoths, Burgundians and Franks (Robert 1882). It is much later that coins bear the name of Chinon: a Merovingian gold coin, a third of a sou of the 7th century, and two silver deniers of the 8th century. Le tiers de so...

Abstract:
The present paper establishes characteristics of the relative magnetic permeability spectrum $\mu$(f) of magnetic materials at microwave frequencies. The integral of the imaginary part of $\mu$(f) multiplied by the frequency f exhibits remarkable properties. A generalisation of Snoek's law states that this quantity is bounded by the square of the saturation magnetization multiplied by a constant. While previous results had been obtained in the case of non-conductive materials, this work generalizes them to ferromagnetic materials and ferromagnetic-based composites with significant skin effect. The influence of truncating the integration at finite upper frequencies is investigated, and estimates associated with the finite integration are provided. It is established that, in practice, the integral does not depend on the damping model under consideration. Numerical experiments are performed in the exactly solvable case of ferromagnetic thin films with uniform magnetization, and they confirm our theoretical results. Microwave permeability measurements on soft amorphous films are reported. The relation between the integral and the saturation magnetization is verified experimentally, and some practical applications of the theoretical results are introduced. The integral can be used to determine the average magnetization orientation in materials with complex magnetization configurations, and furthermore to demonstrate the accuracy of microwave measurement systems. For certain applications, such as electromagnetic compatibility or radar-absorbing materials, the relations established herein provide useful indications for the design of efficient materials, and simple figures of merit to compare the properties measured on various materials.
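The Snoek-type bound described above can be written schematically as follows. This is our notation, not the paper's exact statement: the constant C depends on the geometry, the gyromagnetic ratio and the unit convention, and F denotes a finite upper integration frequency.

```latex
% Generalised Snoek-type bound (schematic; C depends on geometry and units)
\int_0^{\infty} \mu''(f)\, f \,\mathrm{d}f \;\le\; C \, M_s^{2},
\qquad
\int_0^{F} \mu''(f)\, f \,\mathrm{d}f
\;\xrightarrow[\;F \to \infty\;]{}\;
\int_0^{\infty} \mu''(f)\, f \,\mathrm{d}f .
```

The truncation estimates mentioned in the abstract quantify how far the finite-F integral sits from its asymptotic value in practice.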

Abstract:
[Français] Nous avons constaté dans la littérature le manque de prise en compte de l’accompagnement des apprenants dans la conception des dispositifs de formation en ligne. Ce manque nous semble très préjudiciable dans le cycle de vie d’une formation. C’est pourquoi, après avoir identifié les besoins d’accompagnement des apprenants dans une formation en ligne, nous proposons un modèle descriptif d’une activité de tutorat. Ce modèle sert de fondement pour guider la définition des spécifications du dispositif d’accompagnement des apprenants en matière de tâches et d’outils supports de leurs activités. Dans la dernière partie de l’article, nous présentons la méthodologie d’application de ce modèle au cours du cycle de vie d’une formation. [English] The learners’ tutoring component in Web-based distance education is often neglected during the instructional design process. This omission is prejudicial throughout the courseware life-cycle. This is why, after identifying learners’ support needs in a distance-learning environment, we propose a descriptive tutoring model. This model serves as the foundation for guiding the specification of the learner-support environment in terms of the tasks and tools supporting the learners’ activities. In the last part of this article, we present the methodology for applying this model during the courseware life-cycle.

Abstract:
The aim of the present paper is to develop a strategy for solving reliability-based design optimization (RBDO) problems that remains applicable when the performance models are expensive to evaluate. Starting with the premise that simulation-based approaches are not affordable for such problems, and that most-probable-failure-point-based approaches do not permit quantification of the error on the estimated failure probability, an approach based on both metamodels and advanced simulation techniques is explored. The kriging metamodeling technique is chosen to surrogate the performance functions because it allows one to genuinely quantify the surrogate error. The surrogate error on the limit-state surfaces is propagated to the failure probability estimates in order to provide an empirical error measure. This error is then sequentially reduced by means of a population-based adaptive refinement technique until the kriging surrogates are accurate enough for reliability analysis. This original refinement strategy makes it possible to add several observations to the design of experiments at the same time. Reliability and reliability sensitivity analyses are performed by means of the subset simulation technique for the sake of numerical efficiency. The adaptive surrogate-based strategy for reliability estimation is finally embedded into a classical gradient-based optimization algorithm in order to solve the RBDO problem. The kriging surrogates are built in a so-called augmented reliability space, thus making them reusable from one nested RBDO iteration to the next. The strategy is compared to other approaches available in the literature on three academic examples in the field of structural mechanics.
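A numpy-only toy sketch of the central idea above: a kriging (Gaussian-process) surrogate whose predictive standard deviation is propagated to the failure-probability estimate as empirical bounds. The kernel, lengthscale, toy performance function and the ±2σ bounds are our illustrative choices, not the paper's algorithm.

```python
import numpy as np

def rbf(X1, X2, ell=1.0):
    """Squared-exponential kernel with unit variance (illustrative choice)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def gp_fit_predict(Xtr, ytr, Xte, ell=1.0, nugget=1e-6):
    """Simple kriging: zero prior mean, fixed hyperparameters."""
    K = rbf(Xtr, Xtr, ell) + nugget * np.eye(len(Xtr))
    Ks = rbf(Xte, Xtr, ell)
    mu = Ks @ np.linalg.solve(K, ytr)
    v = np.linalg.solve(K, Ks.T)
    var = 1.0 - np.einsum("ij,ji->i", Ks, v)   # predictive variance
    return mu, np.sqrt(np.maximum(var, 0.0))

def g(x):                       # toy performance function: failure when g <= 0
    return 3.0 - x[:, 0] - x[:, 1]

rng = np.random.default_rng(0)
Xtr = rng.normal(size=(40, 2))          # design of experiments
ytr = g(Xtr)
Xmc = rng.normal(size=(20000, 2))       # Monte-Carlo population
mu, sd = gp_fit_predict(Xtr, ytr, Xmc)

p_hat = np.mean(mu <= 0)                # surrogate-based failure probability
p_lo = np.mean(mu + 2 * sd <= 0)        # conservative lower bound
p_hi = np.mean(mu - 2 * sd <= 0)        # conservative upper bound
print(p_lo, p_hat, p_hi)
```

By construction mu + 2·sd ≤ 0 implies mu ≤ 0 implies mu − 2·sd ≤ 0, so p_lo ≤ p_hat ≤ p_hi always holds; shrinking the gap p_hi − p_lo by adding observations where sd is large is the spirit of the adaptive refinement described above.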

Abstract:
In the field of structural reliability, the Monte-Carlo estimator is considered the reference probability estimator. However, it remains intractable for real engineering cases, since it requires a large number of runs of the model. In order to reduce the number of computer experiments, many other approaches, known as reliability methods, have been proposed. One such approach consists in replacing the original experiment with a surrogate that is much faster to evaluate. Nevertheless, it is often difficult (or even impossible) to quantify the error made by this substitution. In this paper an alternative approach is developed. It takes advantage of kriging meta-modeling and importance sampling techniques. The proposed alternative estimator is finally applied to a finite-element-based structural reliability analysis.

Abstract:
Structural reliability methods aim at computing the probability of failure of systems with respect to some prescribed performance functions. In modern engineering such functions usually resort to running an expensive-to-evaluate computational model (e.g. a finite element model). In this respect, simulation methods, which may require $10^{3-6}$ runs, cannot be used directly. Surrogate models such as quadratic response surfaces, polynomial chaos expansions or kriging (which are built from a limited number of runs of the original model) are then introduced as substitutes for the original model to cope with the computational cost. In practice, though, it is almost impossible to quantify the error made by this substitution. In this paper we propose to use a kriging surrogate of the performance function as a means to build a quasi-optimal importance sampling density. The probability of failure is eventually obtained as the product of an augmented probability, computed by substituting the meta-model for the original performance function, and a correction term which ensures that the estimation is unbiased even if the meta-model is not fully accurate. The approach is applied to analytical and finite element reliability problems and proves efficient up to 100 random variables.
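The product decomposition described above can be sketched as follows, in our notation (not necessarily the paper's): with $\mu_{\hat G}$ and $\sigma_{\hat G}$ the kriging mean and standard deviation of the performance function $g$, and $f_{\mathbf X}$ the input density.

```latex
% Probabilistic classification from the kriging surrogate
\pi(\mathbf{x}) = \Phi\!\left( -\,\frac{\mu_{\hat G}(\mathbf{x})}{\sigma_{\hat G}(\mathbf{x})} \right),
\qquad
P_{f\varepsilon} = \int \pi(\mathbf{x})\, f_{\mathbf{X}}(\mathbf{x})\,\mathrm{d}\mathbf{x}
% Failure probability = augmented probability x correction term,
% with h the quasi-optimal importance sampling density
P_f = P_{f\varepsilon}\;
\mathbb{E}_{h}\!\left[ \frac{\mathbf{1}_{\{g(\mathbf{X}) \le 0\}}}{\pi(\mathbf{X})} \right],
\qquad
h(\mathbf{x}) = \frac{\pi(\mathbf{x})\, f_{\mathbf{X}}(\mathbf{x})}{P_{f\varepsilon}} .
```

The correction term is an expectation under $h$ of a ratio that equals 1 wherever the surrogate classifies perfectly, which is why the estimator stays unbiased even when the meta-model is imperfect.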

Abstract:
A new method for analysing the magnetization dispersion in a thin magnetic film is presented. It is based on the angular measurement of the permeability spectra and on the evaluation of the integral relation. It provides the average orientation of the magnetization in the layer and a dispersion parameter which quantifies the magnetic dispersion. The method is successfully applied to a soft 800 nm CoNbZr magnetic layer which possesses a helical anisotropy profile. This helical profile is obtained by continuously rotating the sample during sputtering deposition, on a scale from R = 0 to 16 turns. The study reveals that a maximal dispersion is achieved at about 1/2 turn and that, at higher rotation rates, the magnetization no longer follows the anisotropy profile but lines up along an easiest-axis direction. The experimental data are well described by a one-dimensional micromagnetic model which takes both exchange coupling and helical anisotropy into account. The analytical cases of zero and infinite exchange constant are also considered in order to gain more insight into the observed magnetic behaviour of the soft magnetic thin film.
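The kind of one-dimensional model described above can be sketched as follows. This is our schematic notation, not the paper's exact formulation: θ(z) is the in-plane magnetization angle across the film thickness t, A the exchange constant, K the anisotropy constant, and φ(z) the easy-axis angle rotating through R turns.

```latex
% Schematic 1-D energy density: exchange stiffness + helical uniaxial anisotropy
e(z) = A \left( \frac{\mathrm{d}\theta}{\mathrm{d}z} \right)^{2}
     + K \sin^{2}\!\bigl( \theta(z) - \varphi(z) \bigr),
\qquad
\varphi(z) = \frac{2\pi R\, z}{t} .
```

In the limit $A \to 0$ the magnetization follows the local easy axis, $\theta(z) = \varphi(z)$; in the limit $A \to \infty$ it is uniform, consistent with the two analytical cases considered in the paper.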