Abstract:
We give a hybrid two-stage design which can be useful for estimating the reliability of a parallel-series and, by duality, a series-parallel system. When the components' reliabilities are unknown, they can be estimated by sample means of Bernoulli observations. Suppose the total number of observations allowed for the system is fixed. We show that the variance of the system reliability estimate can then be lowered by allocating the sample size at the components' level. This leads to a discrete optimization problem which can be solved sequentially, provided the total sample size is large enough. First-order asymptotic optimality is proved and validated by Monte Carlo simulation.

1. Introduction

In reliability engineering, two crucial objectives are considered: maximizing an estimate of system reliability and minimizing the variance of that estimate. Because system designers and users are risk averse, they generally prefer the second objective, which leads to a system design with a slightly lower reliability estimate but a lower variance of that estimate (e.g., [1]). It provides decision makers with efficient rules compared to other designs which have a higher system reliability estimate but a high variability of that estimate. In the case of parallel-series and, by duality, series-parallel systems, the variance of the reliability estimate can be lowered by allocating a fixed sample size (the number of observations or units tested in the system), while the reliability estimate is obtained by testing components; see Berry [2]. Allocation schemes for estimation with cost (see, for example, [2–7]) generally lead to a discrete optimization problem which can be solved sequentially using adaptive designs in a fixed or a Bayesian framework. Based on a decision-theoretic approach, the authors seek to minimize either the variance or the Bayes risk associated with a squared error loss function.
The problem of optimal reliability estimation reduces to one of optimal allocation of the sample sizes between Bernoulli populations. Such problems can be solved via dynamic programming, but this technique becomes costly and intractable for complex systems. In the case of a two-component series or parallel system, optimal procedures can be obtained and solved analytically when the coefficients of variation of the associated Bernoulli populations are known (cf., e.g., [5, 8]). Unfortunately, these coefficients are not known in practice, since they themselves depend on the unknown components' reliabilities of the system. In [9], the author has defined a sequential allocation
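To make the allocation idea concrete, here is a minimal numerical sketch (our own illustration, not the design from the abstract) for a two-component series system: with a fixed total budget, a brute-force search over integer allocations shows that an unbalanced split of the sample size can yield a lower delta-method variance of the plug-in reliability estimate than the naive equal split. The reliabilities `p1`, `p2` and the budget `N` are made-up values.

```python
# Hypothetical illustration: for a two-component series system with
# reliability R = p1 * p2, the delta-method variance of the plug-in
# estimate R_hat = X1bar * X2bar is approximately
#   Var(R_hat) ~ p2^2 * p1*(1 - p1)/n1 + p1^2 * p2*(1 - p2)/n2 .
# With a fixed budget n1 + n2 = N we search the integer allocations.

def series_var(p1, p2, n1, n2):
    """Approximate variance of the series-system reliability estimate."""
    return p2**2 * p1 * (1 - p1) / n1 + p1**2 * p2 * (1 - p2) / n2

def best_allocation(p1, p2, N):
    """Brute-force the integer allocation minimizing the variance."""
    return min(((series_var(p1, p2, n1, N - n1), n1) for n1 in range(1, N)),
               key=lambda t: t[0])

p1, p2, N = 0.9, 0.6, 100
v_opt, n1_opt = best_allocation(p1, p2, N)
v_equal = series_var(p1, p2, N // 2, N - N // 2)
print(n1_opt, v_opt, v_equal)  # the optimal split tests component 1 less often
```

In practice the component reliabilities are unknown, which is exactly why the sequential schemes discussed above estimate the allocation adaptively from the data collected so far.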

Abstract:
In this paper, methods for improving the reliability of multi-state series-parallel systems are presented: the hot reserve of single components, the cold reserve of single components, and the mixed (hot and cold) reserve of single components. A method of improving the reliability of these systems by replacing their components with more reliable ones is also introduced. New theorems on multi-state limit reliability functions of homogeneous and non-homogeneous large series-parallel systems composed of components with improved reliability are presented and applied to compare the effects of the different reliability improvement methods on these systems.
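The gain from the hot and cold reserve methods for a single component can be illustrated with a back-of-the-envelope computation; the exponential lifetimes below are our own simplifying assumption, not the multi-state model of the paper.

```python
import math

# Sketch (exponential lifetimes assumed, our simplification): a hot
# reserve runs in parallel with the component, while a cold reserve is
# switched on only after the component fails.
#   no reserve: S(t) = e^{-l*t}
#   hot:        S(t) = 1 - (1 - e^{-l*t})^2
#   cold:       S(t) = e^{-l*t} * (1 + l*t)   (Erlang-2 survival)

def no_reserve(l, t):
    return math.exp(-l * t)

def hot_reserve(l, t):
    return 1 - (1 - math.exp(-l * t)) ** 2

def cold_reserve(l, t):
    return math.exp(-l * t) * (1 + l * t)

l, t = 1.0, 1.0
print(no_reserve(l, t), hot_reserve(l, t), cold_reserve(l, t))
# cold > hot > none: an unused reserve does not age, so for a single
# component the cold reserve improves reliability the most.
```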

Abstract:
In this paper, a semi-Markov model of system operation processes is proposed and its selected parameters are determined. A series-parallel multi-state system is considered, and its reliability and risk characteristics are found. Subsequently, a joint model of the system operation process and system multi-state reliability and risk is constructed. Moreover, the asymptotic approach to reliability and risk evaluation of a multi-state series-parallel system in its operation process is applied to a port grain transportation system.
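The kind of operation process the abstract builds on can be sketched with a toy semi-Markov simulation; the states, transition matrix, and sojourn distributions below are invented for illustration. The long-run fractions of time spent in each operation state are what would weight the state-dependent reliability characteristics in a joint model.

```python
import random

# Toy semi-Markov operation process (our invented example): the system
# jumps between three operation states via the embedded Markov chain P,
# spending an exponentially distributed sojourn time in each state.

rng = random.Random(0)
P = [[0.0, 0.7, 0.3],   # transition probabilities of the embedded chain
     [0.5, 0.0, 0.5],
     [0.4, 0.6, 0.0]]
mean_sojourn = [2.0, 1.0, 4.0]  # mean sojourn time per state

def step(state):
    """Draw the next state of the embedded chain."""
    u, acc = rng.random(), 0.0
    for nxt, p in enumerate(P[state]):
        acc += p
        if u < acc:
            return nxt
    return len(P) - 1

time_in_state = [0.0, 0.0, 0.0]
state = 0
for _ in range(100000):
    time_in_state[state] += rng.expovariate(1.0 / mean_sojourn[state])
    state = step(state)

total = sum(time_in_state)
print([round(ti / total, 3) for ti in time_in_state])  # limit proportions
```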

This paper presents
a hierarchical Bayesian approach to the estimation of components’ reliability
(survival) using a Weibull model for each of them. The proposed method can be
used for estimation with general censored survival data, because the estimation
of a component’s reliability in a series (parallel) system is equivalent to the
estimation of its survival function with right- (left-) censored data. Besides
the Weibull parametric model for reliability data, independent gamma
distributions are considered at the first hierarchical level for the Weibull
parameters and independent uniform distributions over the real line as priors
for the parameters of the gammas. In order to evaluate the model, an example
and a simulation study are discussed.
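The equivalence the method rests on — estimating a component's reliability in a series system is estimating its survival function with right-censored data — can be checked with a small simulation (our own illustration; the Weibull parameters are arbitrary):

```python
import math, random

# In a series system the failure time of a component is observed exactly
# when that component causes the system failure, and is right-censored at
# the system failure time otherwise.

def weibull(rng, shape, scale):
    """Inverse-transform sample from a Weibull(shape, scale) lifetime."""
    return scale * (-math.log(rng.random())) ** (1.0 / shape)

rng = random.Random(1)
data = []  # (observed time for component 1, event indicator)
for _ in range(1000):
    t1 = weibull(rng, shape=1.5, scale=1.0)   # component 1
    t2 = weibull(rng, shape=0.8, scale=2.0)   # component 2
    system = min(t1, t2)                      # series system fails first
    data.append((system, t1 <= t2))           # exact iff component 1 failed

events = sum(1 for _, d in data if d)
print(events, "exact observations,", len(data) - events, "censored")
```

By the duality noted in the abstract, a parallel system yields left-censored observations in the same way (the component failed at some unknown time before the system did).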

Abstract:
We give a risk-averse solution to the problem of estimating the reliability of a parallel-series system. We adopt a beta-binomial model for the components' reliabilities and assume that the total sample size for the experiment is fixed. The allocation at the subsystem or component level may be random. Based on sampling schemes for parallel and series systems separately, we propose a hybrid sequential scheme for the parallel-series system. Asymptotic optimality of the Bayes risk associated with quadratic loss is proved with the help of martingale convergence properties.
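A minimal sketch of the beta-binomial computation, with our own made-up data and a fixed (non-random) allocation for simplicity: each component's posterior after Bernoulli tests is again a beta distribution, and independence lets the posterior mean of the system reliability factor across branches.

```python
# Beta-binomial sketch (our illustration, not the paper's scheme): each
# component of branch j has reliability p ~ Beta(a, b); after s successes
# in n Bernoulli tests the posterior is Beta(a + s, b + n - s).  For a
# parallel-series system (parallel branches, each a series of components)
# independence gives the posterior mean reliability
#   E[R] = 1 - prod_j (1 - prod_i E[p_ji]).

def posterior_mean(a, b, s, n):
    """Posterior mean of a Beta(a, b) reliability after s/n successes."""
    return (a + s) / (a + b + n)

def system_posterior_mean(branches, a=1.0, b=1.0):
    """branches: list of branches, each a list of (successes, trials)."""
    prod_fail = 1.0
    for branch in branches:
        r_branch = 1.0
        for s, n in branch:
            r_branch *= posterior_mean(a, b, s, n)
        prod_fail *= 1.0 - r_branch
    return 1.0 - prod_fail

# Two series branches in parallel, uniform Beta(1, 1) priors.
data = [[(18, 20), (19, 20)], [(15, 20), (20, 20)]]
print(round(system_posterior_mean(data), 4))
```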

Abstract:
The concepts of residual life time and inactivity time are extensively used in reliability theory for modeling lifetime data. In this paper we prove some new results on stochastic comparisons of the residual life time and inactivity time of series and parallel systems. These results complement the existing results of Li & Zhang (2003) and Li & Lu (2003). We also present sufficient conditions for aging properties of the residual life time and inactivity time of series and parallel systems. Examples based on the Weibull and Gompertz distributions are provided to support the results.
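One of the aging properties in question can be checked numerically; the sketch below (our own example, not one from the paper) verifies that for a series system of IFR Weibull components the residual-life survival function decreases with the age of the system.

```python
import math

# For a series system of independent Weibull components with shape > 1
# (increasing failure rate), the residual life shortens stochastically
# with age: S(t + x)/S(t) decreases in t for every x > 0.

def series_survival(t, shapes_scales):
    """Survival of a series system: product of Weibull survivals."""
    return math.exp(-sum((t / sc) ** sh for sh, sc in shapes_scales))

comps = [(2.0, 1.0), (1.5, 2.0)]  # (shape, scale), both IFR

def residual_survival(t, x):
    """P(system survives x more time units | it has survived to age t)."""
    return series_survival(t + x, comps) / series_survival(t, comps)

for x in (0.5, 1.0):
    r_young, r_old = residual_survival(0.5, x), residual_survival(1.5, x)
    print(x, r_young, r_old)  # residual survival is smaller at the older age
```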

Abstract:
Reliability evaluation of systems composed of non-identical multi-state components is a challenging task due to the highly combinatorial nature of the problem. The outcome is a lack of understanding regarding the reliability behavior of a system when components with different magnitudes of failure probabilities are considered, especially when extra components are added. The present work addresses this problem by presenting a simplified approach to analyze variations on the reliability profiles of series and parallel systems composed of non-identical three-state components. In addition, this work presents an optimization model for the allocation of non-identical components under financial and physical restrictions. The model performed satisfactorily in terms of robustness when different magnitudes of failure probabilities were tested.
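The reliability profiles in question can be computed directly for small systems. A simplified sketch with made-up state probabilities, assuming the usual min/max structure functions for multi-state series and parallel systems:

```python
# Component states are 0 (failed), 1 (degraded), 2 (working); the series
# state is the min of the component states and the parallel state the max:
#   P(series >= j)   = prod_i P(X_i >= j),
#   P(parallel >= j) = 1 - prod_i P(X_i <  j).

def prod(xs):
    r = 1.0
    for x in xs:
        r *= x
    return r

def tail(probs, j):
    """P(X >= j) for a component with state probabilities probs[0..2]."""
    return sum(probs[j:])

def series_profile(components):
    return [prod(tail(c, j) for c in components) for j in range(3)]

def parallel_profile(components):
    return [1 - prod(1 - tail(c, j) for c in components) for j in range(3)]

# Non-identical components with different magnitudes of failure probability.
comps = [(0.01, 0.09, 0.90), (0.10, 0.20, 0.70)]
print(series_profile(comps))    # P(series >= 0), P(>= 1), P(>= 2)
print(parallel_profile(comps))
```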

Abstract:
Recent studies suggest that the minimum error entropy (MEE) criterion can outperform the traditional mean square error criterion in supervised machine learning, especially in nonlinear and non-Gaussian situations. In practice, however, one has to estimate the error entropy from the samples since in general the analytical evaluation of error entropy is not possible. By the Parzen windowing approach, the estimated error entropy converges asymptotically to the entropy of the error plus an independent random variable whose probability density function (PDF) corresponds to the kernel function in the Parzen method. This quantity of entropy is called the smoothed error entropy, and the corresponding optimality criterion is named the smoothed MEE (SMEE) criterion. In this paper, we study theoretically the SMEE criterion in supervised machine learning where the learning machine is assumed to be nonparametric and universal. Some basic properties are presented. In particular, we show that when the smoothing factor is very small, the smoothed error entropy equals approximately the true error entropy plus a scaled version of the Fisher information of error. We also investigate how the smoothing factor affects the optimal solution. In some special situations, the optimal solution under the SMEE criterion does not change with increasing smoothing factor. In general cases, when the smoothing factor tends to infinity, minimizing the smoothed error entropy will be approximately equivalent to minimizing error variance, regardless of the conditional PDF and the kernel.
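A small numeric sketch of the Parzen-window estimator behind the criterion; we use Renyi's quadratic entropy, a common choice in the MEE literature, with arbitrary error samples. The second call illustrates the dependence on the smoothing factor that the paper studies.

```python
import math

# Parzen estimate of Renyi's quadratic error entropy:
#   H2(e) = -log( (1/N^2) * sum_{i,j} G_{sigma*sqrt(2)}(e_i - e_j) ),
# where G is a Gaussian kernel and sigma is the smoothing factor.  As the
# abstract notes, the estimate targets the entropy of the error plus an
# independent kernel-distributed variable (the "smoothed" error entropy).

def gaussian(x, sigma):
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def quadratic_entropy(errors, sigma):
    """Parzen estimate of Renyi's quadratic entropy of the error sample."""
    n = len(errors)
    s = sigma * math.sqrt(2)  # convolution of two kernels of width sigma
    ip = sum(gaussian(ei - ej, s) for ei in errors for ej in errors) / n**2
    return -math.log(ip)

errors = [0.1, -0.2, 0.05, 0.3, -0.1, 0.0]
print(quadratic_entropy(errors, sigma=0.5))
# A larger smoothing factor inflates the estimate, illustrating the
# smoothing-factor dependence studied in the paper.
print(quadratic_entropy(errors, sigma=1.0))
```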

Abstract:
In MCMC methods, such as the Metropolis-Hastings (MH) algorithm, the Gibbs sampler, or recent adaptive methods, many different strategies can be proposed, often associated in practice with unknown rates of convergence. In this paper we propose a simulation-based methodology to compare these rates of convergence, based on an entropy criterion computed from parallel (i.i.d.) simulated Markov chains coming from each candidate strategy. Our criterion identifies, within the very first iterations, the best strategy among the candidates. Theoretically, we give for the MH algorithm general conditions under which its successive densities satisfy adequate smoothness and tail properties, so that this entropy criterion can be estimated consistently using a kernel density estimate and Monte Carlo integration. Simulated examples are provided to illustrate this convergence criterion.
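The idea can be mimicked in a toy experiment (ours, not the paper's estimator): run i.i.d. parallel random-walk MH chains for two candidate proposal scales and compare, after a few iterations, a kernel-density entropy estimate of the chains' marginal with the known entropy of the target.

```python
import math, random

# Toy version of the entropy criterion: the candidate whose parallel
# chains' estimated entropy is closer to the target's entropy after a
# few iterations is the faster-converging strategy.

def mh_parallel(scale, n_chains, n_iter, target_logpdf, seed=0):
    """Run n_chains i.i.d. random-walk MH chains for n_iter iterations."""
    rng = random.Random(seed)
    xs = [rng.uniform(-5, 5) for _ in range(n_chains)]  # i.i.d. starts
    for _ in range(n_iter):
        for i, x in enumerate(xs):
            y = x + rng.gauss(0, scale)  # random-walk proposal
            if math.log(rng.random()) < target_logpdf(y) - target_logpdf(x):
                xs[i] = y
    return xs

def kde_entropy(sample, h=0.4):
    """Plug-in entropy estimate -mean(log p_hat) with a Gaussian kernel."""
    n = len(sample)
    def p_hat(x):
        return sum(math.exp(-(x - xi) ** 2 / (2 * h * h))
                   for xi in sample) / (n * h * math.sqrt(2 * math.pi))
    return -sum(math.log(p_hat(x)) for x in sample) / n

target = lambda x: -0.5 * x * x                # standard normal, up to a constant
h_true = 0.5 * math.log(2 * math.pi * math.e)  # entropy of N(0, 1)
gaps = {}
for scale in (0.1, 2.0):  # two candidate proposal scales
    xs = mh_parallel(scale, n_chains=500, n_iter=20, target_logpdf=target)
    gaps[scale] = abs(kde_entropy(xs) - h_true)
print(gaps)  # the better-mixing scale lands much closer to the target entropy
```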

Abstract:
Quality assessment of multi-objective optimization algorithms has been a major concern in the scientific field over the last decades. The entropy metric is introduced and highlighted for computing the diversity of multi-objective optimization algorithms. In this paper, the definition of the entropy metric and the approach to diversity measurement based on entropy are presented. This measurement is applied not only to a Multi-objective Evolutionary Algorithm but also to a Multi-objective Immune Algorithm. Besides, the key techniques of the entropy metric, such as the appropriate principle of the grid method, reasonable parameter selection, and the simplification of the density function, are discussed and analyzed. Moreover, experimental results prove the validity and efficiency of the entropy metric. The computational effort of the entropy metric increases at a linear rate with the number of points in the solution set, which is indeed superior to other quality indicators. Compared with Generational Distance, it is proved that the entropy metric has the capability of describing diversity performance on a quantitative basis. Therefore, the entropy criterion can serve as a highly efficient diversity criterion for multi-objective optimization algorithms.
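A minimal sketch of the grid-based entropy metric (the grid size and solution sets are our assumptions): count solutions per grid cell and take the Shannon entropy of the occupancy frequencies; one pass over the solution set gives the counts, matching the linear cost claimed above.

```python
import math

# Grid-based entropy diversity metric for 2-D objective vectors: partition
# the objective space into cells, count the solutions per cell, and take
# the Shannon entropy of the occupancy frequencies.

def grid_entropy(points, low, high, cells):
    """Shannon entropy of grid-cell occupancy for 2-D objective vectors."""
    counts = {}
    for x, y in points:
        i = min(int((x - low) / (high - low) * cells), cells - 1)
        j = min(int((y - low) / (high - low) * cells), cells - 1)
        counts[(i, j)] = counts.get((i, j), 0) + 1
    n = len(points)
    return -sum(c / n * math.log(c / n) for c in counts.values())

# A well-spread front scores higher than a clustered one.
spread = [(i / 9.0, 1 - i / 9.0) for i in range(10)]
clustered = [(0.5 + 0.001 * i, 0.5 - 0.001 * i) for i in range(10)]
print(grid_entropy(spread, 0.0, 1.0, 5), grid_entropy(clustered, 0.0, 1.0, 5))
```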