Abstract:
In this paper, we will derive the following formula for the value of the gravitational constant G: (1). This equation has only a 0.81% error compared with the commonly accepted value [1]. The parameters in the equation are: the fine-structure constant, q the elementary charge, the mass of the electron, the permittivity of free space, e the base of the exponential function, and π, the ratio of a circumference to its diameter. Values attached: [2].

Abstract:
The fine-structure constant α [1] is a constant in physics that plays a fundamental role in the electromagnetic interaction. It is a dimensionless constant, defined as: (1)
where q is the elementary charge, ε0 the vacuum permittivity, h the Planck constant and c the speed of light in vacuum. The value shown in (1) is according to CODATA 2014 [2].
In this paper, it will be explained that the fine-structure constant is one of the roots of the following equation: (2)
where e is the mathematical constant e (the base of the natural logarithm). One of the solutions of this equation is: (3)
This means that it is equal to the CODATA value in nine decimal digits (or the seven most significant ones, if you prefer). Therefore, the difference between the two values is: (4)
This agreement is orders of magnitude better than what is commonly accepted as necessary to validate a theory against experiment.
As the cosine function is periodic, Equation (2) has infinitely many roots, so the coincidence could seem to be mere chance. But, as will be shown in the paper, the separation between the different solutions is large enough to disregard this possibility.
It will also be shown that another elegant way to write Equation (2) is the following (where i is the imaginary unit): (5)
which of course has the same root (3). The possible meaning of this other representation (5) will be explained.

Abstract:
In the history of mathematics, different methods have been used to detect whether a number is prime or not. In this paper a new one will be shown. It will be demonstrated that if the following equation is zero for a certain number p, this number p is prime, with m an integer number greater than a certain bound (the lower it is, the more efficient the operation). If the result is instead an integer, this result tells us how many permutations of two divisors the input number has. As you can check, no recurrent division by odd or prime numbers is performed to decide whether the number is prime or has divisors. To get to this point, we will do the following. First, we will create a domain with all the composite numbers. This is easy, as you can just multiply, one by one, all the integers (greater than or equal to 2) in that domain; you will thus obtain all the composite numbers (and no primes) in that domain. Then, we will use the Fourier transform to change from this original domain (called the discrete time domain in this regard) to the frequency domain. There, we can check, using Parseval's theorem, whether a certain number is present or not. The use of Parseval's theorem leads to the above integral. If the number p that we want to check is not in the domain, the result of the integral is zero and the number is prime. If instead the result is an integer, this integer tells us how many permutations of two divisors the number p has, and, in consequence, information on how many factors the number p has. So, for any number p lower than 2m − 1, you can check whether it is prime or not just by performing the numerical definite integration. We will apply this integral in a computer program to check the efficiency of the operation. We will see that, if no further developments are made, the numerical integration is computationally inefficient compared with brute-force checking. To be addressed is the question of the level of accuracy needed (number of decimals and number of steps in the numerical integration) to obtain a reliable result for large numbers. This will be commented on in the paper, but a separate study will be needed to reach detailed conclusions.
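The membership check described above (building the composites as pairwise products and extracting one Fourier coefficient, which Parseval's theorem relates to the integral) can be sketched numerically. This is an illustrative discrete analogue, not the paper's exact integral; the function name and the bounds M and N are choices made here for the example:

```python
import cmath

def divisor_pair_count(p, M=12):
    """Count ordered pairs (a, b), 2 <= a, b <= M, with a*b == p.

    The count is recovered as a single Fourier coefficient of the
    exponential sum over all products a*b (a discrete Parseval-style
    membership check): for p <= M*M the result is 0 exactly when p has
    no such factorisation, i.e. when p is prime.
    """
    N = M * M + 1  # enough sample points so that every product a*b < N
    acc = 0.0
    for k in range(N):
        t = k / N
        # Exponential sum over the "composite domain" of products a*b
        s = sum(cmath.exp(2j * cmath.pi * a * b * t)
                for a in range(2, M + 1)
                for b in range(2, M + 1))
        # Multiply by the conjugate wave for p and accumulate
        acc += s * cmath.exp(-2j * cmath.pi * p * t)
    return round((acc / N).real)
```

With M = 12 the check covers any p up to M² = 144: a prime such as 7 gives 0, while 15 gives 2 (the ordered pairs 3·5 and 5·3).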

Abstract:
The secret of Being and Time and of its constant cultural and philosophical presence lies in its unusual hermeneutical richness. Being and Time becomes, so to speak, a precise seismometer capable of detecting the slips and falls of the contemporary era with surprising accuracy. It offers us an exact scan of the ethical and moral conscience of our time. Being and Time does not develop a philosophical theory among others; rather, it faces the challenge of thoroughly reflecting upon the dilemma that is constantly present in philosophy, namely the question of the human being and its relation to being in general. From this point of view, I would like to consider the possibility of reading this fundamental work of Heidegger as an ethics of care, that is, as a book that promotes a cultivation of the self and the other.

Abstract:
In a global context, artistic production has developed its critical processes from the inclusion and participation of the audience in works of art. Somehow, the expansion and compression of space-time is reflected in the work of many artists, based on occupying, mapping and reinterpreting the spaces they appropriate. This article analyzes the spatial and temporal turn defined by the works of the artist Gabriel Orozco. Individual and collective memories, and spaces of history, politics and culture, are signified by the act of symbolic deterritorialization and reterritorialization. The article also analyzes the phenomenological and material treatment of objects by Gabriel Orozco, in their temporal frequency and perceptual reinterpretation, in the context of cultural and political globalization, in order to confront symbolic deterritorialization as a resource of his artistic production.

Abstract:
The Gibbs elasticity modulus is an important tool to predict the foamability of transient and permanent foams such as flexible polyurethane systems. Elasticity is related to foamability and is used as a synonym for it for the purposes of this paper. In this article we propose a method and a thermodynamic model to analyze the foamability of silicone surfactants in polyol binary mixtures using surface tension data. The present work describes foamability through the Gibbs elasticity modulus expressed in terms of the first and second derivatives of surface pressure versus bulk composition. Furthermore, the Gibbs adsorption equation and the corresponding novel surface equation of state, based on a modification of the Langmuir isotherm, lead to an elasticity equation with an analytical solution. It is shown that, for foam model systems of surfactant solutions in polyols used in commercial processes, the optimum surfactant concentration obtained in this article from the Gibbs adsorption equation and the maximum of the elasticity modulus finally match.
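The elasticity-from-derivatives idea can be illustrated with a short numerical sketch. Assuming the dilute-solution form of the Gibbs adsorption equation (Γ ∝ x·dπ/dx), the Gibbs elasticity E = dπ/d ln Γ reduces, by the chain rule, to an expression in the first and second derivatives of surface pressure versus bulk composition. This is a generic illustration under that assumption, not the paper's modified-Langmuir model:

```python
import numpy as np

def gibbs_elasticity(x, pi):
    # Gibbs elasticity E = dπ/d ln Γ for a dilute binary mixture.
    # With the Gibbs adsorption equation in its dilute form, Γ ∝ x·dπ/dx,
    # the chain rule gives  E = x (π')² / (π' + x π''),
    # i.e. the first and second derivatives of surface pressure π
    # with respect to bulk composition x.
    dpi = np.gradient(pi, x)    # π'(x), second-order finite differences
    d2pi = np.gradient(dpi, x)  # π''(x)
    return x * dpi**2 / (dpi + x * d2pi)
```

For a Szyszkowski-type surface pressure π = A ln(1 + Kx), this expression reduces analytically to E = A·K·x, which the numerical version reproduces away from the grid edges.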

Abstract:
The Jazmin crude oil is located at the heart of the Middle Magdalena region in Colombia. It is a heavy, sour crude oil with 43 wt.% of vacuum bottoms. It cannot be processed at a conventional refinery without being blended with lighter crudes, and should be upgraded to produce a synthetic crude with a higher concentration of distillates and lower acidity and carbon content. In this paper, eight upgrading alternatives are presented. The alternatives include the processing of the crude, the reduced crude and the vacuum bottoms of the Jazmin crude oil using the following technologies: distillation, solvent deasphalting, visbreaking, delayed coking and hydrotreating. The experiments were conducted at pilot scale, using standard (ASTM) analysis techniques. In this study it was found that Jazmin crude oil and its heavy components produce high distillate yields when processed with thermal conversion processes. In addition, those processes reduce the acidity of the products. Within the analyzed schemes, the one corresponding to the visbreaking of the crude oil followed by the delayed coking of the vacuum bottoms from the visbreaking is perhaps the most attractive, giving 5.9 wt.% of gas, 78.2 wt.% of distillates and 15.9 wt.% of coke.

Abstract:
Five castrated male Iberian pigs (100 ± 2 kg b.w.) fitted with T-shaped cannulas at the terminal ileum were used to determine the effects of legume feeding on intestinal microbiota composition. The diets were based on defatted soybean (Glycine max), lupin (Lupinus angustifolius) or chickpea (Cicer arietinum) seed meals and contained similar amounts of digestible energy (14.2 - 15.1 MJ·kg^{-1}) and protein (107 g·kg^{-1}). A hydrolyzed casein diet was used to determine the bacterial counts in pigs fed a vegetable-free diet. The composition of the intestinal microbiota at the terminal ileum was analysed by q-PCR. A higher (P < 0.05) lactobacilli log_{10} number of copies was determined in the ileal contents of pigs fed the lupin- or chickpea-based diets with respect to those fed the soybean-based diet. The Bacteroides and Clostridium coccoides/Eubacterium rectale group log_{10} number of copies was lower (

Abstract:
Nowadays, information is one of the main resources for an individual's development and wellbeing; therefore, distributing and using information must be a top priority for society. This entails establishing strategies so people can learn to use this resource. Furthermore, scientific progress and present-day educational paradigms stress trans-disciplinary learning. Information and communication sciences are complementary by nature (one focusing on the medium and the other on the process), so there must be greater clarity and conceptual consistency in a number of key shared areas. This document is an effort, from the perspective of library science and information science, to identify some possible meeting points between these disciplines regarding the study and development of the competencies necessary to handle information adequately.

Abstract:
Within Computational Neuroscience, Neuroengineering develops communication systems between a machine and some part of the nervous system. Without a doubt, these systems constitute a great scientific, engineering and ethical challenge. However, their correct functioning still poses many problems.