Search Results: 1 - 10 of 5624 matches for " Anabel Martinez-Aran "
All listed articles are free for downloading (OA Articles)
The switch from conventional to atypical antipsychotic treatment should not be based exclusively on the presence of cognitive deficits. A pilot study in individuals with schizophrenia
Gabriel Selva-Vera, Vicent Balanzá-Martínez, José Salazar-Fraile, José Sánchez-Moreno, Anabel Martinez-Aran, Patricia Correa, Eduard Vieta, Rafael Tabarés-Seisdedos
BMC Psychiatry, 2010, DOI: 10.1186/1471-244x-10-47
Abstract: In this naturalistic study, we used a comprehensive neuropsychological test battery to assess a sample of schizophrenia patients taking either conventional (n = 13) or novel antipsychotics (n = 26) at baseline and two years later. Continuous antipsychotic treatment, regardless of class, was associated with improvement in verbal fluency, executive functions, and visual and verbal memory. Patients taking atypical antipsychotics did not show greater cognitive enhancement over two years than patients taking conventional antipsychotics. Although long-term antipsychotic treatment slightly improved cognitive function, the switch from conventional to atypical antipsychotic treatment should not be based exclusively on the presence of these cognitive deficits.

Cognitive disturbances are a core feature of schizophrenia and have been studied extensively in recent years [1]. Cognitive impairment is present before the onset of the illness [2] and is also found in healthy relatives of patients, although to a lesser degree [3]. In addition, this feature is not exclusively secondary to psychiatric symptoms or medication [4]. Cognitive impairment is a better predictor of future functional outcome than positive symptoms [5-7]. The positive action of conventional antipsychotic drugs (APDs) on cognition is considered mild to moderate [8] and is limited to certain cognitive domains, such as sustained attention [9,10]. Regarding novel antipsychotics, their supposed cognitive enhancement is thought to be mediated by their capacity to raise dopamine and acetylcholine levels in prefrontal regions [11]. However, their differing affinities for brain receptors may give each class of antipsychotic a different procognitive profile. Many studies support cognitive enhancement by the different atypical antipsychotics: quetiapine and olanzapine [12], quetiapine and risperidone [13], ziprasidone and olanzapine [14]; olanzapine, quetiapine, and risperidone [15], risperidone and quetiapine […]
Validity and reliability of the Functioning Assessment Short Test (FAST) in bipolar disorder
Rosa Adriane R, Sánchez-Moreno Jose, Martínez-Aran Anabel, Salamero Manel
Clinical Practice and Epidemiology in Mental Health, 2007, DOI: 10.1186/1745-0179-3-5
Abstract: Background: Numerous studies have documented high rates of functional impairment among bipolar disorder (BD) patients, even during phases of remission. However, most of the available instruments used to assess functioning have focused on global measures of functional recovery rather than on specific domains of psychosocial functioning. In this context, the Functioning Assessment Short Test (FAST) is a brief instrument designed to assess the main functioning problems experienced by psychiatric patients, particularly bipolar patients. It comprises 24 items that assess impairment or disability in six specific areas of functioning: autonomy, occupational functioning, cognitive functioning, financial issues, interpersonal relationships, and leisure time. Methods: 101 patients with DSM-IV-TR bipolar disorder and 61 healthy controls were assessed in the Bipolar Disorder Program, Hospital Clinic of Barcelona. The psychometric properties of the FAST (feasibility, internal consistency, concurrent validity, discriminant validity (euthymic vs acute patients), factorial analysis, and test-retest reliability) were analysed. Results: Internal consistency was very high, with a Cronbach's alpha of 0.909. A highly significant negative correlation with the GAF was obtained (r = -0.903; p < 0.001), pointing to a reasonable degree of concurrent validity. Test-retest reliability analysis showed a strong correlation between the two measures carried out one week apart (ICC = 0.98; p < 0.001). Total FAST scores were lower in euthymic patients (18.55 ± 13.19; F = 35.43; p < 0.001) than in manic (40.44 ± 9.15) and depressive (43.21 ± 13.34) patients. Conclusion: The FAST showed strong psychometric properties and was able to detect differences between euthymic and acute BD patients. In addition, it is a short (6-minute), simple, interview-administered instrument that is easy to apply.
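As a worked illustration of the internal-consistency statistic reported above, the sketch below computes Cronbach's alpha from a matrix of item scores. This is a minimal example of the standard formula, not code from the study; the sample scores are invented for illustration (FAST items are rated 0-3).

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_subjects, k_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 5 subjects x 4 items, each item scored 0-3.
scores = np.array([
    [0, 1, 0, 1],
    [2, 2, 3, 2],
    [1, 1, 1, 0],
    [3, 2, 3, 3],
    [0, 0, 1, 0],
])
print(f"alpha = {cronbach_alpha(scores):.3f}")
```

An alpha near 0.9, as reported for the FAST, indicates that the items are measuring a highly consistent underlying construct.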
Tony Jones - The Teaching of the Twelve: Believing and Practicing the Primitive Christianity of the Ancient Didache Community
Matina Šaran
Kairos: Evangelical Journal of Theology, 2011
Abstract:
Workforce Migration in the European Union: Impulse or Obstacle to General Development
Vladislav Șaran
Sfera Politicii, 2011
Abstract: Over the years, the European Union has been the preferred destination for millions of immigrants. The destination states were, and still are, countries where these immigrants enjoy highly developed social protection and are met with tolerance and understanding by the local authorities. In recent years, however, a clear trend has emerged in Europe: state budgets are no longer able to care for the vulnerable groups of society and provide fewer benefits and less aid. Immigration is both an opportunity and a challenge for Europe. Legal immigrants are needed to fill certain gaps in the EU labor market, because the EU has an aging population and a declining birth rate. Ensuring the free movement of persons, particularly workers, while maintaining a good level of border security offers clear advantages for EU member states, but it also remains a constant concern because of the potential risks and threats that migrants can bring.
On the distribution of Carmichael numbers
Aran Nayebi
Mathematics, 2009
Abstract: Erd\H{o}s conjectured in 1956 that there are $x^{1-o(1)}$ Carmichael numbers up to $x$. Pomerance made this conjecture more precise and proposed that there are $x^{1-\frac{(1+o(1))\log\log\log x}{\log\log x}}$ Carmichael numbers up to $x$. At the time, his data tables up to $25 \cdot 10^{9}$ appeared to support his conjecture. However, Pinch extended this data and showed that up to $10^{21}$, Pomerance's conjecture did not appear well-supported. Thus, the purpose of this paper is two-fold. First, we build upon the work of Pomerance and others to present an alternate conjecture regarding the distribution of Carmichael numbers that fits proven bounds and is better supported by Pinch's new data. Second, we provide another conjecture concerning the distribution of Carmichael numbers that sharpens Pomerance's heuristic arguments. We also extend and update counts pertaining to pseudoprimes and Carmichael numbers, and discuss the distribution of One-Parameter Quadratic-Base Test pseudoprimes.
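For readers who want to reproduce small-scale counts like those discussed above, Carmichael numbers can be enumerated via Korselt's criterion: $n$ is Carmichael if and only if it is composite, squarefree, and $p - 1 \mid n - 1$ for every prime $p \mid n$. A minimal brute-force sketch (assuming sympy is available; adequate only for tiny ranges, nothing like the $10^{21}$ bound cited above):

```python
from sympy import factorint  # assumption: sympy is installed

def is_carmichael(n: int) -> bool:
    """Korselt's criterion: n composite, squarefree, p-1 | n-1 for all p | n."""
    if n < 3 or n % 2 == 0:
        return False
    factors = factorint(n)       # {prime: exponent}
    if len(factors) < 2:         # excludes primes and prime powers
        return False
    for p, e in factors.items():
        if e > 1:                # not squarefree
            return False
        if (n - 1) % (p - 1) != 0:
            return False
    return True

# The first few Carmichael numbers: 561, 1105, 1729, 2465, 2821, ...
print([n for n in range(3, 3000, 2) if is_carmichael(n)])
```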
Upper bounds on the solutions to $n = p+m^2$
Aran Nayebi
Mathematics, 2010
Abstract: Hardy and Littlewood conjectured that every large integer $n$ that is not a square is the sum of a prime and a square. They believed that the number $\mathcal{R}(n)$ of such representations for $n = p+m^2$ is asymptotically given by $$\mathcal{R}(n) \sim \frac{\sqrt{n}}{\log n}\prod_{p=3}^{\infty}\left(1-\frac{1}{p-1}\left(\frac{n}{p}\right)\right),$$ where $p$ is a prime, $m$ is an integer, and $\left(\frac{n}{p}\right)$ denotes the Legendre symbol. Unfortunately, as we later point out, this conjecture is difficult to prove, and not \emph{all} integers that are nonsquares can be represented as the sum of a prime and a square. Instead, in this paper we prove two upper bounds for $\mathcal{R}(n)$ for $n \le N$. The first upper bound applies to \emph{all} $n \le N$. The second upper bound assumes the existence of the Siegel zero and applies to all $N/2 < n \le N$ with at most $\ll N^{1-\delta_1}$ exceptions, where $N$ is a sufficiently large positive integer and $0 < \delta_1 \le 0.000025$.
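A direct way to get a feel for $\mathcal{R}(n)$ is to count representations by brute force and compare against the leading term $\sqrt{n}/\log n$ of the Hardy-Littlewood prediction (the singular series, the product over primes above, is omitted here for brevity). This sketch is an illustration only, not part of the paper's arguments; conventions for the range of $m$ vary, and we count $m \ge 0$.

```python
import math
from sympy import isprime  # assumption: sympy is installed

def R(n: int) -> int:
    """Count representations n = p + m^2 with p prime and m >= 0."""
    count, m = 0, 0
    while m * m < n:
        if isprime(n - m * m):
            count += 1
        m += 1
    return count

# R(n) for a few n, next to the leading term sqrt(n)/log n of the
# conjectured asymptotic (singular series omitted).
for n in (10**3, 10**4, 10**5):
    print(n, R(n), round(math.sqrt(n) / math.log(n), 1))
```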
A Note on the Inverse Laplace Transformation of $f(t)$
Aran Nayebi
Mathematics, 2010
Abstract: Let $\mathcal{L}\{f(t)\} = \int_{0}^{\infty}e^{-st}f(t)\,dt$ denote the Laplace transform of $f$. It is well known that if $f(t)$ is a piecewise continuous function on the interval $[0,\infty)$ and of exponential order for $t > N$, then $\lim_{s\to\infty}F(s) = 0$, where $F(s) = \mathcal{L}\{f(t)\}$. In this paper we prove that the lesser-known converse does not hold true; namely, if $F(s)$ is a continuous function of $s$ for which $\lim_{s\to\infty}F(s) = 0$, it does not follow that $F(s)$ is the Laplace transform of a piecewise continuous function of exponential order.
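A standard example that illustrates the boundary in question (our illustration, not necessarily the paper's construction): take $F(s) = 1/\sqrt{s}$, which is continuous for $s > 0$ and vanishes at infinity. Using $\mathcal{L}\{t^{-1/2}\} = \sqrt{\pi/s}$,

$$\mathcal{L}\left\{\frac{1}{\sqrt{\pi t}}\right\} = \int_{0}^{\infty} e^{-st}\,\frac{dt}{\sqrt{\pi t}} = \frac{1}{\sqrt{s}}, \qquad \lim_{s\to\infty}\frac{1}{\sqrt{s}} = 0,$$

yet the inverse transform $1/\sqrt{\pi t}$ is unbounded as $t \to 0^{+}$ and hence is not piecewise continuous on $[0,\infty)$. So the limit condition alone does not force the standard hypotheses on $f$.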
Practical intractability: a critique of the hypercomputation movement
Aran Nayebi
Mathematics, 2012, DOI: 10.1007/s11023-013-9317-3
Abstract: For over a decade, the hypercomputation movement has produced computational models that in theory solve the algorithmically unsolvable, but that are not physically realizable according to currently accepted physical theories. While opponents of the hypercomputation movement argue against the physical realizability of specific models in order to demonstrate this, these arguments lack the generality to be a satisfactory justification against the construction of \emph{any} information-processing machine that computes beyond the universal Turing machine. To this end, I present a more mathematically concrete challenge to hypercomputability and show that one is immediately led into physical impossibilities, thereby demonstrating the infeasibility of hypercomputers more generally. This gives impetus to propose and justify a more plausible starting point for an extension to the classical paradigm that is physically possible, at least in principle. Instead of attempting to rely on infinities such as idealized limits of infinite time or numerical precision, or some other physically unattainable source, one should focus on extending the classical paradigm to better encapsulate modern computational problems that are not well expressed or modeled by the closed-system paradigm of the Turing machine. I present the first steps toward this goal by considering contemporary computational problems dealing with intractability and issues surrounding cyber-physical systems, and argue that a reasonable extension to the classical paradigm should focus on these issues in order to be practically viable.
Exponential prefixed polynomial equations
Aran Nayebi
Mathematics, 2013
Abstract: A prefixed polynomial equation is an equation of the form $P(t_1,\ldots,t_n) = 0$, where $P$ is a polynomial whose variables $t_1,\ldots,t_n$ range over the natural numbers, preceded by quantifiers over some, or all, of its variables. Here, we consider exponential prefixed polynomial equations (EPPEs), where variables can also occur as exponents. We obtain a relatively concise EPPE equivalent to the combinatorial principle of the Paris-Harrington theorem for pairs (which is independent of primitive recursive arithmetic), as well as an EPPE equivalent to Goodstein's theorem (which is independent of Peano arithmetic). Some new devices are used in addition to known methods for the elimination of bounded universal quantifiers for Diophantine predicates.
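To make the definition concrete, here is a toy prefixed polynomial equation (our illustration, not an example from the paper): over the natural numbers,

$$\forall t_1\,\exists t_2 \; \bigl[(t_1 - 2t_2)\,(t_1 - 2t_2 - 1) = 0\bigr],$$

which asserts that every $t_1$ is either even ($t_1 = 2t_2$) or odd ($t_1 = 2t_2 + 1$). An exponential prefixed polynomial equation additionally permits variables in exponent position, as in terms like $2^{t_1}$.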
Fast matrix multiplication techniques based on the Adleman-Lipton model
Aran Nayebi
Computer Science, 2009, DOI: 10.5897/IJCER10.016
Abstract: On distributed-memory electronic computers, the implementation and association of fast parallel matrix multiplication algorithms have yielded astounding results and insights. In this discourse, we use the tools of molecular biology to demonstrate the theoretical encoding of Strassen's fast matrix multiplication algorithm with DNA based on an $n$-moduli set in the residue number system, thereby demonstrating the viability of computational mathematics with DNA. As a result, a general scalable implementation of this model in the DNA computing paradigm is presented and can be generalized to the application of \emph{all} fast matrix multiplication algorithms on a DNA computer. We also discuss the practical capabilities and issues of this scalable implementation. Fast methods of matrix computation with DNA are important because they also allow for the efficient implementation of other algorithms (e.g., inversion, computing determinants, and graph algorithms) with DNA.
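The fast algorithm being encoded is Strassen's scheme, which multiplies 2×2 block matrices with seven block products instead of eight, giving the $O(n^{\log_2 7})$ recursion. A minimal sketch over plain 2×2 numeric matrices follows; the paper's DNA/residue-number-system encoding is not reproduced here.

```python
def strassen_2x2(A, B):
    """Strassen's seven-multiplication scheme for 2x2 matrices.

    A, B: 2x2 matrices as nested lists. Returns A @ B using 7 products
    where the classical method uses 8; applied recursively to blocks,
    this yields the O(n^log2(7)) ~ O(n^2.81) algorithm.
    """
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
```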