This paper presents the dual specification of the least-squares method: while the traditional (primal) formulation minimizes the sum of squared residuals (noise), the dual specification maximizes a quadratic function that can be interpreted as the value of sample information. The two specifications are equivalent. Before developing the methodology for the dual of the least-squares method, the paper gives a historical perspective on its origin that sheds light on the thinking of Gauss, its inventor. The least-squares method was firmly established as a scientific approach by Gauss, Legendre and Laplace within the space of a decade at the beginning of the nineteenth century. Legendre, in 1805, was the first author to name the approach the “méthode des moindres carrés”, the “least-squares method”. Gauss, however, had used the method as early as 1795, when he was 18 years old, and adopted it again in 1801 to calculate the orbit of the newly discovered planet Ceres. Gauss published his way of looking at the least-squares approach in 1809 and gave several hints that the least-squares algorithm is a minimum-variance linear estimator and that it is derivable from maximum-likelihood considerations. Laplace devoted a very substantial chapter to the method in his fundamental treatise on probability theory, published in 1812.
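The primal problem described above can be sketched numerically. The following minimal Python example (an illustration constructed for this summary, not code from the paper, with hypothetical simulated data) computes the least-squares coefficients via the normal equations and evaluates the minimized sum of squared residuals:

```python
import numpy as np

# Primal least-squares problem: minimize over beta the sum of
# squared residuals ||y - X @ beta||^2.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])  # intercept + one regressor
beta_true = np.array([2.0, -1.0])                        # hypothetical coefficients
y = X @ beta_true + 0.1 * rng.normal(size=50)            # data with small noise

# Closed-form minimizer via the normal equations X'X beta = X'y.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
ssr = np.sum((y - X @ beta_hat) ** 2)  # minimized sum of squared residuals
```

By construction, `ssr` is no larger than the sum of squared residuals at any other coefficient vector, including the true one used to simulate the data.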

Abstract:
The Maximum Likelihood method estimates the parameter values of a statistical model that maximize the corresponding likelihood function, given the sample information. This is the primal approach that, in this paper, is presented as a mathematical programming specification whose solution requires the formulation of a Lagrange problem. A result of this setup is that the Lagrange multipliers associated with the linear statistical model (where the sample observations are regarded as a set of constraints) are equal to the vector of residuals scaled by the variance of those residuals. The novel contribution of this paper consists in deriving the dual model of the Maximum Likelihood method under normality assumptions. This model minimizes a function of the variance of the error terms subject to orthogonality conditions between the model residuals and the space of explanatory variables. An intuitive interpretation of the dual problem appeals to basic elements of information theory and to the economic interpretation of Lagrange multipliers to establish that the dual maximizes the net value of the sample information. The paper presents the dual ML model for a single regression and provides a numerical example of how to obtain maximum likelihood estimates of the parameters of a linear statistical model using the dual specification.
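The claim about the Lagrange multipliers can be sketched as follows (a reconstruction under the stated normality assumptions, using generic notation rather than the paper's own). Treating the sample observations of the linear model $y = X\beta + e$ as constraints on the normal log-likelihood:

```latex
\max_{\beta,\, e,\, \sigma^2} \; -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{e'e}{2\sigma^2}
\quad \text{subject to} \quad y = X\beta + e .

% Lagrangian with multiplier vector \lambda on the sample constraints:
\mathcal{L} = -\frac{n}{2}\ln(2\pi\sigma^2) - \frac{e'e}{2\sigma^2}
            + \lambda'\,(X\beta + e - y)

% First-order condition with respect to e:
\frac{\partial \mathcal{L}}{\partial e} = -\frac{e}{\sigma^2} + \lambda = 0
\;\Longrightarrow\; \lambda = \frac{e}{\sigma^2}
```

That is, at the optimum the multipliers equal the vector of residuals scaled by their variance, as stated in the abstract.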

Abstract:
The purpose of this paper is to combine the estimation of output price
risk with positive mathematical programming (PMP). It reconciles the risk programming
presented by Freund with a consistent estimate of the constant absolute risk
aversion (CARA) coefficient. It extends the PMP approach to the calibration of
realized production outputs and observed input prices. The results of this
specification include 1) uniqueness of the calibrating solution, 2) elimination
of the tautological calibration constraints typical of the original PMP procedure,
and 3) equivalence between a phase I calibrating solution
and a solution obtained by combining phase I and phase II of the traditional
PMP procedure. In this extended PMP framework, the cost function specification
involves output quantities and input prices, contrary to the myopic cost function
of the traditional PMP approach. This extension allows for a phase III calibrating
model that replaces the usual linear technology with relations corresponding to
Shephard's lemma (in the primal constraints) and the marginal cost function (in
the dual constraints). An empirical example with a sample of farms producing
four crops illustrates the novel procedure.
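The phase III relations mentioned above can be written schematically as follows (a hedged sketch using a generic cost function $C(w, q)$ in input prices $w$ and output quantities $q$; the paper's exact specification may differ):

```latex
% Primal constraints -- Shephard's lemma: input demands as
% price gradients of the cost function
x_i = \frac{\partial C(w, q)}{\partial w_i}, \qquad i = 1,\dots,m

% Dual constraints -- marginal cost relation: at the calibrating
% solution, output price equals marginal cost
p_j = \frac{\partial C(w, q)}{\partial q_j}, \qquad j = 1,\dots,n
```

These relations replace the linear technology constraints of the traditional PMP model.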

Abstract:
A test of the adding-up condition in demand systems is crucial for determining whether a share format is admissible when the number of sample goods is smaller than the number of commodity choices available to consumers. This test requires the estimation of a demand system in quantity format; it cannot be performed when a demand system is specified in share format. The share specification of any demand system is like a straitjacket: once worn, it forces the error covariance matrix to be singular and the adding-up condition to hold whether or not the data-generating process warrants it. The empirical verification of the adding-up hypothesis uses a five-commodity sample of 4847 observations selected from the Canadian Family Expenditure Survey. Three specifications are considered: AIDS (Almost Ideal Demand System), QUAIDS (Quadratic AIDS) and EASI (Exact Affine Stone Index). The hypothesis is rejected in all three cases with a high level of confidence.
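The singularity forced by the share format can be illustrated with a small simulation (a hypothetical Python sketch, not the paper's data or estimator): because shares sum to one in every observation, the cross-equation errors sum to zero and their covariance matrix is singular by construction.

```python
import numpy as np

# Simulate budget shares for 5 goods: each row is normalized to sum to 1,
# which is exactly what a share-format demand system imposes.
rng = np.random.default_rng(1)
n_obs, n_goods = 200, 5
u = rng.normal(size=(n_obs, n_goods))
shares = np.exp(u) / np.exp(u).sum(axis=1, keepdims=True)  # rows sum to 1

# Errors around the sample means: since every row of `shares` sums to 1
# and the column means also sum to 1, every row of `e` sums to 0.
e = shares - shares.mean(axis=0)
Sigma = e.T @ e / n_obs                 # cross-equation covariance matrix
eigvals = np.linalg.eigvalsh(Sigma)     # ascending order
# The smallest eigenvalue is (numerically) zero: Sigma is singular.
```

The zero eigenvalue appears regardless of the data-generating process, which is the "straitjacket" point made in the abstract: singularity and adding-up are imposed by the format, not tested against the data.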

Abstract:
Functional mitral regurgitation is a consequence of adverse LV remodelling that occurs with a structurally normal valve, and it is a marker of adverse prognosis. Diastolic dysfunction plays a major role in the signs and symptoms of HF and in risk stratification, and provides prognostic information in HF patients independently of impaired systolic function. Ultrasound lung comets are a simple echographic sign of extravascular lung water, most frequently associated with left ventricular diastolic and/or systolic dysfunction, which can integrate the clinical and pathophysiological information provided by conventional echocardiography and provide useful information for the prognostic stratification of HF patients. Contractile reserve is defined as the difference between the value of an index of left ventricular contractility at peak stress and its baseline value, and the presence of myocardial viability predicts a favorable outcome. A non-invasive echocardiographic method for the evaluation of the force-frequency relationship has been proposed to assess changes in contractility during stress echo. In conclusion, in HF patients the evaluation of systolic function, diastolic function and myocardial contractile reserve plays a fundamental role in risk stratification. The highest risk is present in HF patients with a heart that is weak, big, noisy, stiff and wet. Heart failure (HF) is a complex clinical syndrome that can result from any structural or functional cardiac disorder that impairs the ability of the ventricle to fill with or eject blood [1]. The syndrome of HF is a common manifestation of the later stages of various cardiovascular diseases, including coronary artery disease, hypertension, valvular disease, and primary myocardial disease. The cardinal manifestations of HF are dyspnea and fatigue, which may limit exercise tolerance, and fluid retention, which may lead to pulmonary congestion and peripheral edema. Both abnormalities can impair functional capacity and quality of life.

Abstract:
A major role of the serotonergic system has been hypothesized in the pathogenesis of schizophrenia, mostly based on evidence of the action of atypical antipsychotics. Disturbances of serotonergic pathways have been implicated in the etiology of schizophrenia. The aim of this study was to investigate the association between schizophrenia and the G861C polymorphism in the 5-HT1Dβ autoreceptor gene. A case-control analysis was conducted in a sample of 196 schizophrenic patients and 143 gender-, age- and ethnicity-matched controls. No statistically significant differences were found in allelic or genotypic distributions between cases and controls. Thus, the results do not support an association of the G861C polymorphism in the 5-HT1Dβ autoreceptor gene with schizophrenia in the studied sample.

Abstract:
The results of decentralization and its relationship to demand satisfaction are studied in an agricultural research organization. The data were generated by three questionnaires, relative to 2011 research projects in progress, filled in by researchers, project coordinators and technical directors. Performance, inputs, process and the external organizational environment are found to be empirically related to decentralization, as predicted by contingency theory. Centralized research benefits from human resource quality and access to information; decentralized research benefits from proximity to users. As a consequence, centralization favors academic quality and decentralization favors a stronger impact on agricultural practice. A decentralization agenda should benefit producers' demands as long as access to the means (physical, human, economic and organizational resources) is provided, since these tend to be unsatisfactory under typical decentralized conditions.

Abstract:
This essay is a transcription of the final report of a survey promoted by the National Health Foundation (FUNASA, 2001) on public health management and on the effectiveness of federal sanitary policies in the state of Rio Grande do Norte over the period 2002-2003. The author was in charge of this research, which, through an analytical study on an inclusive and participative basis, evaluated policy creation and financing schemes, management, and the results of the actions, and offered some propositions for the area. Through an ecological study, the working hypothesis, taken as the dependent variable and premise of the sample design, was that larger amounts of financing for structural sanitation actions would have an important impact on reducing specific morbidity-mortality indexes. The sample comprised seven counties from seven regions with similar intervening soil, economic and administrative characteristics and, as a reference level, seven other regions with the same intervening pattern but with zero or only a small amount of those resources. From the results on the effectiveness of the financed actions, a ranking was built of the correlation of the human development index (HDI-M) with basic sanitation and epidemiological indexes over the same period.

Abstract:
In this article I discuss the characteristics of the nutritional transition. To do so, I use data from nutritional surveys of the population, together with concepts and analytic categories. I draw a correlation between these characteristics and the specificities of our development process. I then question: 1) the direction of this transition, which starts from a stage initially marked by a high prevalence of malnutrition deficits in their severe forms; 2) whether the cut-off point used to identify occurrences of severe malnutrition, located below the 3rd percentile of the international standard classification of the National Center for Health Statistics (NCHS), does not amount to choosing severity as the hierarchy of nutritional care; 3) whether this procedure does not mask the real extent of the moderate and mild forms of malnutrition deficits; 4) whether the selected criterion does not reproduce inequity among the malnourished population; 5) whether this transition is moving in the direction of improving and optimizing the Brazilian nutritional situation, or of making it worse and more complex.