Abstract:
Many writings on information mix information on a given system (IS), the measurable information content of a given system (IM), and the (also measurable) information content that we communicate among us on a given system (IC). These belong to different levels and different aspects of information. The first (IS) involves everything that one possibly can, at least potentially, know about a system, but will never learn completely. The second (IM) contains the quantitative data that one really learns about a system. The third (IC) relates to the language (including mathematics) by which we transmit information on the system to one another, rather than to the system itself. The information content of a system (IM, which is what we generally mean by information) may include all (relevant) data on each element of the system. However, we can reduce the quantity of information we need to convey to each other (IC) if we refer to certain symmetry principles or natural laws to which the elements of the given system conform. Instead of listing the data for all elements separately, even in a not very extreme case, we can give a short mathematical formula that informs about the data of the individual elements of the system. This abbreviated form of information delivery relies on several conventions. These conventions are protocols that we have learnt before and do not need to be repeated each time within the given community. They include the knowledge that the scientific community accumulated earlier when it discovered and formulated the symmetry principle or the law of nature, the language in which those regularities were formulated and then accepted by the community, and the mathematical marks and abbreviations that are known only to the members of the given scientific community.
We do not need to repeat the rules of the convention each time, because the conveyed information includes them: they are there in our minds behind the data we communicate on the information content. I demonstrate this by using two examples: Kepler's laws, and the law of correspondence between the DNA codons' triplet structure and the individual amino acids they encode. The information content of the language by which we communicate the obtained information cannot be identified with the information content of the system that we want to characterize; moreover, it does not include all the possible information that we could potentially know about the system (IS).
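The Kepler example from the abstract can be made concrete with a minimal sketch (my illustration, not the paper's): instead of transmitting a table with the orbital period of every planet, a community that already shares Kepler's third law as a convention only needs to transmit the semi-major axes, because the shared formula regenerates the periods.

```python
# Illustrative sketch of the abstract's point: the shared law T**2 ~ a**3
# (in solar units: T[yr] = a[AU] ** 1.5) replaces a column of transmitted data.
# The semi-major axes below are well-known approximate values in AU.
semi_major_axis_au = {
    "Mercury": 0.387,
    "Venus": 0.723,
    "Earth": 1.000,
    "Mars": 1.524,
}

def orbital_period_years(a_au: float) -> float:
    """Kepler's third law in solar units: T [yr] = a [AU] ** 1.5."""
    return a_au ** 1.5

for planet, a in semi_major_axis_au.items():
    # The receiver reconstructs the period locally from the shared convention.
    print(f"{planet}: a = {a} AU -> T = {orbital_period_years(a):.3f} yr")
```

The message (IC) shrinks from two data columns to one plus a formula the community has agreed on beforehand; for Earth the reconstructed period is 1.000 yr, as expected.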

Abstract:
The paper discusses the mathematical consequences of the application of derived variables in gauge fields. Physics is aware of several phenomena that depend first of all on velocities (e.g., the force exerted on charges moving in a magnetic field, or the Lorentz transformation). Applying the property of the second Noether theorem that allows generalised variables, this paper extends the article by Al-Kuwari and Taha (1991) with a new conclusion. They concluded that there are no extra conserved currents associated with local gauge invariance. We show that, in a more general case, there are further conserved Noether currents. In its method the paper reconstructs the approach introduced by Utiyama (1956, 1959) and followed by Al-Kuwari and Taha (1991), in the presence of a gauge field that depends on the coordinates of the velocity space. In this course we apply certain (but not full) analogies with Mills (1989). We show that handling the space-time coordinates as implicit variables in the gauge field reproduces the same results that have been derived in the configuration space (i.e., we do not lose information), while the proposed new treatment yields additional information extending those results. The result is an extra conserved Noether current.

Abstract:
The paper introduces an alternative rishon model for a composite structure of quarks and leptons. The model builds matter from six basic blocks (and their antiparticles). To this end, it introduces a new property of rishons, called "scent", which can take two values, masculine and feminine, each of which can appear in three colours. The Quantum Scent Dynamics (QSD) model assigns new electric charges to the rishons. The paper then discusses the construction of the known families of particles from scents, as well as the constraints and advantages of the proposed hypothetical model.

Abstract:
The problem of embedding the Tsallis, Rényi and generalized Rényi entropies in the framework of category theory, and their axiomatic foundation, is studied. To this end, we construct a special category MES related to measured spaces. We prove that both the Rényi and the Tsallis entropies can be embedded in the formalism of category theory by proving that the basic partition functional that appears in their definitions, as well as in the associated Lebesgue space norms, has good algebraic compatibility properties. We prove that this functional is multiplicative with respect to the direct product and additive with respect to the disjoint sum (the coproduct) in the category MES, so it is a natural candidate for the measure of information or uncertainty. We prove that the category MES can be extended to a monoidal category, both with respect to the direct product and to the coproduct. The basic axioms of the original Rényi entropy theory are generalized and reformulated in the framework of the category MES, and we prove that these axioms imply the existence of a universal exponent having the same value for all the objects of the category MES. This universal exponent is precisely the parameter that appears in the definitions of the Tsallis and Rényi entropies. It is proved that, in a similar manner, the partition functional that appears in the definition of the generalized Rényi entropy is multiplicative with respect to the direct product and additive with respect to the disjoint sum, but its symmetry group is reduced compared to the case of the classical Rényi entropy.
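The algebraic property the abstract leans on can be checked numerically. A minimal sketch (my illustration, not the paper's categorical construction): the partition functional Z_q(p) = sum_i p_i**q is multiplicative over the direct product of independent distributions, which makes the Rényi entropy additive.

```python
import math

def partition(p, q):
    """Partition functional Z_q(p) = sum_i p_i ** q."""
    return sum(pi ** q for pi in p)

def renyi(p, q):
    """Renyi entropy of order q != 1 (natural log)."""
    return math.log(partition(p, q)) / (1.0 - q)

def tsallis(p, q):
    """Tsallis entropy of order q != 1."""
    return (1.0 - partition(p, q)) / (q - 1.0)

def direct_product(p, r):
    """Joint distribution of two independent distributions."""
    return [pi * ri for pi in p for ri in r]

p, r, q = [0.5, 0.25, 0.25], [0.7, 0.3], 2.0
joint = direct_product(p, r)

# Multiplicativity of the partition functional over the direct product ...
assert abs(partition(joint, q) - partition(p, q) * partition(r, q)) < 1e-12
# ... which is exactly the additivity of the Renyi entropy:
assert abs(renyi(joint, q) - (renyi(p, q) + renyi(r, q))) < 1e-12
# The Tsallis entropy, built from the same functional, is only pseudo-additive.
```

The same functional appears in both definitions; only the outer transformation (logarithmic versus linear) differs, which is why the Rényi form inherits additivity and the Tsallis form does not.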

Abstract:
In this paper the behavior of an O-ring made of NBR rubber was investigated under extreme conditions. The effects of extreme initial compression, operating pressure and extreme temperature conditions were examined. The rubber material was tested in simple tension, pure shear and equibiaxial tension modes, complemented with Dynamic Mechanical Thermal Analysis (DMTA) to capture the viscoelastic behavior of the material. For the investigation, a large-strain viscoelastic material model was developed by the authors to take into account the large deformations caused by extreme conditions. Insufficient space during installation causes extreme initial compression, consequently leading the material to crack on the contacting outer surfaces. It was found that the excessive strain and friction-induced shear stress contribute primarily to this phenomenon. Extreme operating pressure causes the seal to penetrate into the gap between the shaft and the housing. This behavior damages the material, and cracks appear on the seal. High-strain areas were found in the proximity of the gap in the material. The analysis of the extreme operating temperature showed that during cooling the O-ring can completely lose its ability to seal at -70°C. There are three contributing factors: the speed of cooling, the temperature and the coefficient of thermal expansion.

Abstract:
The aim of this paper is to model the steady-state condition of a rotary shaft seal (RSS) system. For this, an iterative thermal-mechanical algorithm was developed based on incremental finite element analyses. The behavior of the seal's rubber material was taken into account by a large-strain viscoelastic, so-called generalized Maxwell model, based on Dynamic Mechanical Thermal Analysis (DMTA) and tensile measurements. The pre-loaded garter spring was modelled with a bilinear material model, and the shaft was assumed to be linear elastic. The density, coefficient of thermal expansion and thermal conductance of the materials were taken into consideration during simulation. The friction between the rotary shaft seal and the shaft was simplified and modelled as a constant parameter. The iterative algorithm was evaluated at two different times, right after assembly and 1 h after assembly, so that the rubber material's stress relaxation effects are also incorporated. The results show good correlation with the literature data, which state that the permissible temperature for NBR70 (nitrile butadiene rubber) material in contact with a shaft of ~80 mm diameter, rotating at 2600/min, is 100°C. The results show 107°C and 104°C for the two iterations. The friction-induced temperature rise changes the width of the contact area between the seal and the shaft and significantly reduces the contact pressure.
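The generalized Maxwell model named in the abstract is conventionally expressed as a Prony series, E(t) = E_inf + sum_k E_k * exp(-t / tau_k), which is how stress relaxation between "right after assembly" and "1 h after assembly" enters such an algorithm. A minimal sketch follows; the moduli and relaxation times below are invented for illustration and are not the paper's fitted DMTA parameters.

```python
import math

# Assumed (not the paper's) Prony-series parameters for an NBR-like rubber:
E_INF = 2.0  # long-term equilibrium modulus [MPa]
PRONY = [
    (3.0, 0.5),     # (E_k [MPa], tau_k [s]) -- fast relaxation branch
    (1.5, 60.0),    # intermediate branch
    (1.0, 1800.0),  # slow branch, still active ~1 h after assembly
]

def relaxation_modulus(t: float) -> float:
    """Generalized Maxwell relaxation modulus E(t) as a Prony series."""
    return E_INF + sum(Ek * math.exp(-t / tau) for Ek, tau in PRONY)

print(relaxation_modulus(0.0))     # instantaneous modulus E_inf + sum(E_k) = 7.5
print(relaxation_modulus(3600.0))  # 1 h after assembly: relaxed toward E_INF
```

Evaluating the model at t = 0 and t = 3600 s mirrors the two iterations reported in the abstract: the relaxed modulus lowers the contact pressure the thermal-mechanical loop sees in the second evaluation.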

Abstract:
The Trans-European Dialogue in Public Administration (TED) is an academic event co-organized each year by the two key professional associations of Public Administration in Europe, the European Group on Public Administration (EGPA) and the Network of Institutes and Schools of Public Administration in Central and Eastern Europe (NISPAcee). Each TED focuses on a selected key theme of contemporary public administration, and invites, or accepts contributions of, the leading scholars of the field.

Abstract:
The paper searches for answers to the following questions: why had the situation of Japan and the European Union improved in comparison with that of the United States prior to the first oil price shock; what factors altered this tendency later, especially from the 1990s onwards; and what was the role of international economic conditions in all this? Applying the models of mathematical economics, the authors have supported their main statements by an econometric investigation. The most important conclusion that can be drawn is that, in the world economic competition, the situation of both Japan and the European Union was primarily determined by changes in world economic conditions, chiefly world-market oil prices and exchange rates, which can be said of the United States to a much lesser extent.

Abstract:
The subject of this article is the Japanese enigma: the long-lasting, extraordinarily rapid economic growth, the so-called Japanese economic miracle, followed by a very sharp setback in the growth rate, the prolonged recession. The authors, using an endogenous growth model, have shown that an economic miracle did not happen in Japan either: the very rapid growth proceeded in conformity with the general regularities of economic development. The main cause of the prolonged recession, according to the empirical results, is the currency shock that occurred on the basis of an international agreement in the mid-1980s, which decelerated the hitherto extremely dynamic development of Japanese exports, considerably retarding the main factor of rapid economic growth.

Abstract:
The concept of metabolic memory was first described among patients with type 1 diabetes in 2005, based on the results of the follow-up observation of the original cohort in the DCCT [1]. Although this term was already used in former experimental diabetes models and studies with isolated cells as early as the mid-1980s [2], the modern concept of metabolic memory emerged from the DCCT-EDIC. Reassuringly, a late effect of previous antihyperglycaemic treatment was documented among patients with type 2 diabetes during the follow-up of the original cohort in the UKPDS [3]. This phenomenon was designated as metabolic legacy. Based on the results of recent randomized, controlled clinical trials and analyses of their follow-up periods, it became obvious that the concept of metabolic memory cannot be restricted to antihyperglycaemic treatment only.

In this paper, clinical evidence concerning the late effect of antihyperglycaemic treatment is summarized. Additionally, the late effects of lipid-lowering and antihypertensive treatment as well as life-style modification are also reviewed. Taken together, results from recent clinical trials suggest that the original concept of metabolic memory can be defined in a much broader context.

The DCCT was a multicenter, randomized, controlled clinical trial which compared intensive insulin therapy with conventional insulin regimens in patients with type 1 diabetes [4]. Originally, 1441 patients with type 1 diabetes were randomly assigned to either intensive or conventional insulin therapy and were followed for a mean of 6.5 years between 1983 and 1993. A significant difference in the HbA1c values of the groups was found (mean value 7.4 % in patients with intensive treatment versus 9.0 % in patients with conventional treatment, p < 0.001). The risk of both development and progression of microvascular complications was significantly reduced by intensive insulin treatment. Nevertheless, due to the low incidence of cardiovascular events only a decre